Faculty of Communication, Culture and Society, Università della Svizzera italiana, Lugano, Switzerland.
Department of Communication and Media, Ewha Womans University, Seoul, Republic of Korea.
J Med Internet Res. 2023 Sep 18;25:e44656. doi: 10.2196/44656.
Mental health problems are recognized as a pressing public health issue, and an increasing number of individuals are turning to online mental health communities in search of information and support. Although these virtual platforms can provide emotional support and access to anecdotal experiences, they can also expose users to large amounts of potentially inaccurate information. Despite the importance of this issue, limited research has been conducted, especially on the differences that may emerge depending on the type of content moderation in these communities: peer-led or expert-led.
We aim to fill this gap by examining the prevalence, the communicative context, and the persistence of mental health misinformation in Facebook online communities dedicated to mental health, with a focus on understanding the mechanisms that enable effective correction of inaccurate information and the differences between expert-led and peer-led groups.
We conducted a content analysis of 1534 statements (from 144 threads) in 2 Italian-speaking Facebook groups.
The study found that an alarming proportion of comments (26.1%) contained medically inaccurate information. Furthermore, nearly 60% of the threads contained at least one misinformation statement that received no correction attempt. Moderators were more likely than members to correct misinformation; however, they were not immune to posting misinformation themselves, which was an unexpected finding. Discussions about aspects of treatment (including side effects or treatment interruption) significantly increased the probability of encountering misinformation. In addition, misinformation posted in the comments of a thread, rather than in the first post, had a lower probability of being corrected, particularly in peer-led communities.
The high prevalence of misinformation in online communities, particularly when it goes uncorrected, underscores the importance of further research to identify effective mechanisms to prevent its spread. This is especially important given the finding that misinformation tends to cluster around specific "loci" of discussion that, once identified, can serve as starting points for developing strategies to prevent and correct misinformation within them.