UN Global Pulse, New York, NY, United States.
Department of Journalism & Institute of Communications Research, University of Illinois at Urbana-Champaign, Champaign, IL, United States.
J Med Internet Res. 2023 Sep 15;25:e49061. doi: 10.2196/49061.
Throughout the COVID-19 pandemic, there has been concern that social media may contribute to vaccine hesitancy, given the wide availability of antivaccine content on such platforms. YouTube has stated its commitment to removing content that contains misinformation on vaccination. Nevertheless, such claims are difficult to audit, and more empirical research is needed to evaluate the actual prevalence of antivaccine sentiment on the internet.
This study examines recommendations made by YouTube's algorithms to investigate whether the platform may facilitate the spread of antivaccine sentiment on the internet. We assess the prevalence of antivaccine sentiment in recommended videos and evaluate how real-world users' experiences differ from the personalized recommendations obtained through synthetic data collection methods, which are commonly used to study YouTube's recommendation system.
We trace trajectories from a credible seed video posted by the World Health Organization to antivaccine videos, following only video links suggested by YouTube's recommendation system. First, we gamify the process by asking real-world participants to intentionally find an antivaccine video in as few clicks as possible. Having collected crowdsourced trajectory data from respondents recruited from (1) the World Health Organization and United Nations system (n=33) and (2) Amazon Mechanical Turk (n=80), we then compare the recommendations seen by these users with recommended videos obtained from (3) the YouTube application programming interface's RelatedToVideoID parameter (n=40) and (4) clean browsers without any identifying cookies (n=40), which serve as reference points. We develop machine learning methods to classify antivaccine content at scale, enabling us to automatically evaluate 27,074 video recommendations made by YouTube.
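As an illustration of the synthetic reference conditions, the sketch below shows how related-video recommendations might be requested from the YouTube Data API v3 `search.list` endpoint using the RelatedToVideoID parameter described above. This is not the authors' code: the `build_related_query` helper, the placeholder API key, and the seed video ID are assumptions for illustration, and the parameter has since been deprecated by YouTube.

```python
# Illustrative sketch (not the study's actual code) of querying the
# YouTube Data API v3 for videos related to a seed video, as one of the
# synthetic data collection conditions described above.

SEARCH_ENDPOINT = "https://www.googleapis.com/youtube/v3/search"


def build_related_query(video_id: str, api_key: str, max_results: int = 20) -> dict:
    """Assemble query parameters for a related-videos lookup.

    The API required type=video whenever relatedToVideoId was set.
    """
    return {
        "part": "snippet",
        "type": "video",
        "relatedToVideoId": video_id,
        "maxResults": max_results,
        "key": api_key,
    }


# SEED_VIDEO_ID and API_KEY_PLACEHOLDER are hypothetical placeholders;
# a real request would send these parameters to SEARCH_ENDPOINT.
params = build_related_query("SEED_VIDEO_ID", "API_KEY_PLACEHOLDER")
```

Each video ID returned by such a query could then be fed back in as the next seed, producing a recommendation trajectory without any watch history or cookies.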
We found no evidence that YouTube promotes antivaccine content; the average share of antivaccine videos remained well below 6% at all steps in users' recommendation trajectories. However, users' watch histories significantly affect video recommendations, suggesting that data from the application programming interface or from a clean browser do not offer an accurate picture of the recommendations that real users see. Real users saw slightly more provaccine content as they advanced through their recommendation trajectories, whereas synthetic users were drawn toward irrelevant recommendations. Rather than antivaccine content, the videos recommended by YouTube are more likely to contain health-related content that is not specifically related to vaccination; these videos tend to be longer and more popular.
Our findings suggest that the common perception that YouTube's recommendation system acts as a "rabbit hole" may be inaccurate and that YouTube may instead be following a "blockbuster" strategy that attempts to engage users by promoting other content that has been reliably successful across the platform.