
Short-term exposure to filter-bubble recommendation systems has limited polarization effects: Naturalistic experiments on YouTube.

Author information

Liu Naijia, Hu Xinlan Emily, Savas Yasemin, Baum Matthew A, Berinsky Adam J, Chaney Allison J B, Lucas Christopher, Mariman Rei, de Benedictis-Kessner Justin, Guess Andrew M, Knox Dean, Stewart Brandon M

Affiliations

Department of Government, Harvard University, Cambridge, MA 02138.

Operations, Information, Decisions Department, the Wharton School, University of Pennsylvania, Philadelphia, PA 19104.

Publication information

Proc Natl Acad Sci U S A. 2025 Feb 25;122(8):e2318127122. doi: 10.1073/pnas.2318127122. Epub 2025 Feb 18.

DOI: 10.1073/pnas.2318127122
PMID: 39964709
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11874454/
Abstract

An enormous body of literature argues that recommendation algorithms drive political polarization by creating "filter bubbles" and "rabbit holes." Using four experiments with nearly 9,000 participants, we show that manipulating algorithmic recommendations to create these conditions has limited effects on opinions. Our experiments employ a custom-built video platform with a naturalistic, YouTube-like interface presenting real YouTube videos and recommendations. We experimentally manipulate YouTube's actual recommendation algorithm to simulate filter bubbles and rabbit holes by presenting ideologically balanced and slanted choices. Our design allows us to intervene in a feedback loop that has confounded the study of algorithmic polarization-the complex interplay between supply of recommendations and user demand for content-to examine downstream effects on policy attitudes. We use over 130,000 experimentally manipulated recommendations and 31,000 platform interactions to estimate how recommendation algorithms alter users' media consumption decisions and, indirectly, their political attitudes. Our results cast doubt on widely circulating theories of algorithmic polarization by showing that even heavy-handed (although short-term) perturbations of real-world recommendations have limited causal effects on policy attitudes. Given our inability to detect consistent evidence for algorithmic effects, we argue the burden of proof for claims about algorithm-induced polarization has shifted. Our methodology, which captures and modifies the output of real-world recommendation algorithms, offers a path forward for future investigations of black-box artificial intelligence systems. Our findings reveal practical limits to effect sizes that are feasibly detectable in academic experiments.


Figures (Fig. 1-9):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/46c1/11874454/9e7ea02d8dc5/pnas.2318127122fig01.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/46c1/11874454/84801405c03f/pnas.2318127122fig02.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/46c1/11874454/5e2a16ee5cbb/pnas.2318127122fig03.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/46c1/11874454/c8ede32bf49b/pnas.2318127122fig04.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/46c1/11874454/6dc5103a82e1/pnas.2318127122fig05.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/46c1/11874454/9fae4c3d812d/pnas.2318127122fig06.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/46c1/11874454/4dcbc4825d47/pnas.2318127122fig07.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/46c1/11874454/98bf5971f768/pnas.2318127122fig08.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/46c1/11874454/3b610998da50/pnas.2318127122fig09.jpg

Similar articles

1. Short-term exposure to filter-bubble recommendation systems has limited polarization effects: Naturalistic experiments on YouTube.
Proc Natl Acad Sci U S A. 2025 Feb 25;122(8):e2318127122. doi: 10.1073/pnas.2318127122. Epub 2025 Feb 18.
2. Exploring YouTube's Recommendation System in the Context of COVID-19 Vaccines: Computational and Comparative Analysis of Video Trajectories.
J Med Internet Res. 2023 Sep 15;25:e49061. doi: 10.2196/49061.
3. Nudging recommendation algorithms increases news consumption and diversity on YouTube.
PNAS Nexus. 2024 Nov 19;3(12):pgae518. doi: 10.1093/pnasnexus/pgae518. eCollection 2024 Dec.
4. Auditing YouTube's recommendation system for ideologically congenial, extreme, and problematic recommendations.
Proc Natl Acad Sci U S A. 2023 Dec 12;120(50):e2213020120. doi: 10.1073/pnas.2213020120. Epub 2023 Dec 5.
5. YouTube's recommendation algorithm is left-leaning in the United States.
PNAS Nexus. 2023 Aug 14;2(8):pgad264. doi: 10.1093/pnasnexus/pgad264. eCollection 2023 Aug.
6. Examining algorithmic biases in YouTube's recommendations of vaccine videos.
Int J Med Inform. 2020 Aug;140:104175. doi: 10.1016/j.ijmedinf.2020.104175. Epub 2020 May 19.
7. Causally estimating the effect of YouTube's recommender system using counterfactual bots.
Proc Natl Acad Sci U S A. 2024 Feb 20;121(8):e2313377121. doi: 10.1073/pnas.2313377121. Epub 2024 Feb 13.
8. Tubes and bubbles: topological confinement of YouTube recommendations.
PLoS One. 2020 Apr 21;15(4):e0231703. doi: 10.1371/journal.pone.0231703. eCollection 2020.
9. Influence of User Profile Attributes on e-Cigarette-Related Searches on YouTube: Machine Learning Clustering and Classification.
JMIR Infodemiology. 2023 Apr 12;3:e42218. doi: 10.2196/42218. eCollection 2023.
10. Should I Trust the Artificial Intelligence to Recruit? Recruiters' Perceptions and Behavior When Faced With Algorithm-Based Recommendation Systems During Resume Screening.
Front Psychol. 2022 Jul 6;13:895997. doi: 10.3389/fpsyg.2022.895997. eCollection 2022.
