The China Mail - Facebook's algorithm doesn't alter people's beliefs: research


Facebook's algorithm doesn't alter people's beliefs: research
Facebook's algorithm doesn't alter people's beliefs: research / Photo: © AFP/File

Do social media echo chambers deepen political polarization, or simply reflect existing social divisions?

A landmark research project that investigated Facebook around the 2020 US presidential election published its first results Thursday, finding that, contrary to assumption, the platform's often criticized content-ranking algorithm doesn't shape users' beliefs.

The work is the product of a collaboration between Meta -- the parent company of Facebook and Instagram -- and a group of academics from US universities who were given broad access to internal company data, and signed up tens of thousands of users for experiments.

The academic team wrote four papers examining the role of the social media giant in American democracy, which were published in the scientific journals Science and Nature.

Overall, the algorithm was found to be "extremely influential in people's on-platform experiences," said project leaders Talia Stroud of the University of Texas at Austin and Joshua Tucker of New York University.

In other words, it heavily impacted what the users saw, and how much they used the platforms.

"But we also know that changing the algorithm for even a few months isn't likely to change people's political attitudes," they said, as measured by users' answers on surveys after they took part in three-month-long experiments that altered how they received content.

The authors acknowledged this conclusion might be because the changes weren't in place for long enough to make an impact, given that the United States has been growing more polarized for decades.

Nevertheless, "these findings challenge popular narratives blaming social media echo chambers for the problems of contemporary American democracy," wrote the authors of one of the papers, published in Nature.

- 'No silver bullet' -

Facebook's algorithm, which uses machine learning to decide which posts rise to the top of users' feeds based on their interests, has been accused of giving rise to "filter bubbles" and enabling the spread of misinformation.

Researchers recruited around 40,000 volunteers via invitations placed on their Facebook and Instagram feeds, and designed an experiment where one group was exposed to the normal algorithm, while the other saw posts listed from newest to oldest.

Facebook originally used a reverse-chronological feed, and some observers have suggested that switching back to it would reduce social media's harmful effects.

The team found that users in the chronological feed group spent around half the amount of time on Facebook and Instagram compared to the algorithm group.

On Facebook, those in the chronological group saw more content from moderate friends, as well as more sources with ideologically mixed audiences.

But the chronological feed also increased the amount of political and untrustworthy content seen by users.

Despite these differences, the altered feeds produced no detectable changes in measured political attitudes.

"The findings suggest that chronological feed is no silver bullet for issues such as political polarization," said coauthor Jennifer Pan of Stanford.

- Meta welcomes findings -

In a second paper in Science, the same team researched the impact of reshared content, which constitutes more than a quarter of content that Facebook users see.

Suppressing reshares has been suggested as a means to control harmful viral content.

The team ran a controlled experiment in which a group of Facebook users saw no changes to their feeds, while another group had reshared content removed.

Removing reshares reduced the proportion of political content seen, resulting in reduced political knowledge -- but again did not impact downstream political attitudes or behaviors.

A third paper, in Nature, probed the impact of content from "like-minded" users, pages, and groups, which the researchers found constitutes a majority of what active adult Facebook users in the US see in their feeds.

But in an experiment involving over 23,000 Facebook users, suppressing like-minded content once more had no impact on ideological extremity or belief in false claims.

A fourth paper, in Science, did however confirm extreme "ideological segregation" on Facebook, with politically conservative users more siloed in their news sources than liberals.

What's more, 97 percent of political news URLs on Facebook rated as false by Meta's third-party fact checking program -- which AFP is part of -- were seen by more conservatives than liberals.

Meta welcomed the overall findings.

They "add to a growing body of research showing there is little evidence that social media causes harmful... polarization or has any meaningful impact on key political attitudes, beliefs or behaviors," said Nick Clegg, the company's president of global affairs.

C.Mak--ThChM