Ibrahim Hazem, AlDahoul Nouar, Lee Sangjin, Rahwan Talal, Zaki Yasir
Department of Computer Science, New York University Abu Dhabi, Abu Dhabi 129188, UAE.
PNAS Nexus. 2023 Aug 14;2(8):pgad264. doi: 10.1093/pnasnexus/pgad264. eCollection 2023 Aug.
With over two billion monthly active users, YouTube currently shapes the landscape of online political video consumption, with 25% of adults in the United States regularly consuming political content via the platform. Considering that nearly three-quarters of the videos watched on YouTube are delivered via its recommendation algorithm, the propensity of this algorithm to create echo chambers and deliver extremist content has been an active area of research. However, it is unclear whether the algorithm may exhibit political leanings toward either the Left or the Right. To fill this gap, we constructed archetypal users across six personas in the US political context, ranging from Far Left to Far Right. Using these personas, we performed a controlled experiment in which they consumed over eight months' worth of videos and were recommended over 120,000 unique videos. We find that while the algorithm pulls users away from political extremes, this pull is asymmetric: users are pulled away from Far Right content more strongly than from Far Left content. Furthermore, we show that the recommendations made by the algorithm skew left even when the user has no watch history. Our results raise questions about whether the recommendation algorithms of social media platforms in general, and YouTube in particular, should exhibit political biases, and about the wide-reaching societal and political implications that such biases could entail.