Oxford Internet Institute, University of Oxford, Oxford, UK.
Management Department, University of Exeter Business School, Exeter, UK.
Nature. 2024 Oct;634(8034):609-616. doi: 10.1038/s41586-024-07942-8. Epub 2024 Oct 2.
In response to intense pressure, technology companies have enacted policies to combat misinformation. The enforcement of these policies has, however, led to technology companies being regularly accused of political bias. We argue that differential sharing of misinformation by people identifying with different political groups could lead to political asymmetries in enforcement, even by unbiased policies. We first analysed 9,000 politically active Twitter users during the US 2020 presidential election. Although users estimated to be pro-Trump/conservative were indeed substantially more likely to be suspended than those estimated to be pro-Biden/liberal, users who were pro-Trump/conservative also shared far more links to various sets of low-quality news sites (even when news quality was determined by politically balanced groups of laypeople, or groups of only Republican laypeople) and had higher estimated likelihoods of being bots. We find similar associations between stated or inferred conservatism and low-quality news sharing (on the basis of both expert and politically balanced layperson ratings) in 7 other datasets of sharing from Twitter, Facebook and survey experiments, spanning 2016 to 2023 and including data from 16 different countries. Thus, even under politically neutral anti-misinformation policies, political asymmetries in enforcement should be expected. Political imbalance in enforcement need not imply bias on the part of social media companies implementing anti-misinformation policies.