YouTube's recommendation algorithm is left-leaning in the United States.

Authors

Hazem Ibrahim, Nouar AlDahoul, Sangjin Lee, Talal Rahwan, Yasir Zaki

Affiliation

Department of Computer Science, New York University Abu Dhabi, Abu Dhabi 129188, UAE.

Publication

PNAS Nexus. 2023 Aug 14;2(8):pgad264. doi: 10.1093/pnasnexus/pgad264. eCollection 2023 Aug.

Abstract

With over two billion monthly active users, YouTube currently shapes the landscape of online political video consumption, with 25% of adults in the United States regularly consuming political content via the platform. Considering that nearly three-quarters of the videos watched on YouTube are delivered via its recommendation algorithm, the propensity of this algorithm to create echo chambers and deliver extremist content has been an active area of research. However, it is unclear whether the algorithm may exhibit political leanings toward either the Left or Right. To fill this gap, we constructed archetypal users across six personas in the US political context, ranging from Far Left to Far Right. Utilizing these users, we performed a controlled experiment in which they consumed over eight months' worth of videos and were recommended over 120,000 unique videos. We find that while the algorithm pulls users away from political extremes, this pull is asymmetric: users are pulled away from Far Right content more strongly than from Far Left. Furthermore, we show that the recommendations made by the algorithm skew left even when the user has no watch history. Our results raise questions about whether the recommendation algorithms of social media platforms in general, and YouTube in particular, should exhibit political biases, and about the wide-reaching societal and political implications that such biases could entail.

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/39ad/10433241/fa2f89949d5a/pgad264f1.jpg
