
Exploring Mental Health Content Moderation and Well-Being Tools on Social Media Platforms: Walkthrough Analysis.

Author Information

Haime Zoë, Biddle Lucy

Affiliations

Population Health Sciences, University of Bristol, Bristol, United Kingdom.

NIHR Bristol Biomedical Research Centre, Bristol, United Kingdom.

Publication Information

JMIR Hum Factors. 2025 May 29;12:e69817. doi: 10.2196/69817.

Abstract

BACKGROUND

Social networking site (SNS) users may experience mental health difficulties themselves or engage with mental health-related content on these platforms. While SNSs use moderation systems and user tools to limit harmful content availability, concerns persist regarding the implementation and effectiveness of these methods.

OBJECTIVE

This study aimed to use an ethnographic walkthrough method to critically evaluate 4 SNSs: Instagram, TikTok, Tumblr, and Tellmi.

METHODS

Walkthrough methods were used to identify and analyze the mental health content moderation practices and the safety and well-being resources of the SNS platforms. We completed a systematic checklist for each platform and then used thematic analysis to interpret the data.

RESULTS

Findings highlighted both successes and challenges in balancing user safety and content moderation across platforms. While the platforms offered varied mental health resources, several issues emerged, including redundant information, broken links, and a lack of non-US-centric resources. In addition, despite offering several self-moderation tools, platforms showed insufficient evidence of user education and testing around these features, potentially limiting their effectiveness. Platforms also struggled to address harmful mental health content because of unclear language about what was allowed or disallowed. This was especially evident in the management of mental health-related terminology: the emergence of "algospeak," in which users adopt alternative codewords or phrases to avoid having content removed or banned by moderation systems, showed how easily users can bypass platform censorship. Furthermore, platforms did not detail the support available to reporters or reportees of mental health-related content, leaving these users vulnerable.

CONCLUSIONS

Our study resulted in the production of preliminary recommendations for platforms regarding potential mental health content moderation and well-being procedures and tools. We also emphasized the need for more inclusive user-centered design, feedback, and research to improve SNS safety and moderation features.


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dcbb/12163353/3d98ada93ce2/humanfactors_v12i1e69817_fig1.jpg
