
Presentation in self-posted facial images can expose sexual orientation: Implications for research and privacy.

Affiliations

Department of Management and Organization.

Publication Information

J Pers Soc Psychol. 2022 May;122(5):806-824. doi: 10.1037/pspa0000294.

Abstract

Recent research has found that facial recognition algorithms can accurately classify people's sexual orientations using naturalistic facial images, highlighting a severe risk to privacy. This article tests whether people of different sexual orientations presented themselves distinctively in photographs, and whether these distinctions revealed their sexual orientation. I found significant differences in self-presentation. For example, gay individuals were on average more likely to wear glasses compared to heterosexual individuals in images uploaded to the dating website. Gay men also uploaded brighter images compared to heterosexual men. To further test how some of these differences drove the classification of sexual orientation, I employed image augmentation or modification techniques. To evaluate whether the image background contributed to classifications, I progressively masked images until only a thin border of image background remained in each facial image. I found that even these pixels classified sexual orientations at rates significantly higher than random chance. I also blurred images, and found that merely three numbers representing the brightness of each color channel classified sexual orientations. These findings contribute to psychological research on sexual orientation by highlighting how people chose to present themselves differently on the dating website according to their sexual orientations, and how these distinctions were used by the algorithm to classify sexual orientations. The findings also expose a privacy risk as they suggest that do-it-yourself data-protection strategies, such as masking and blurring, cannot effectively prevent leakage of sexual orientation information. As consumers are not equipped to protect themselves, the burden of privacy protection should be shifted to companies and governments. (PsycInfo Database Record (c) 2022 APA, all rights reserved).
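The abstract describes two image manipulations used to probe what drives the classifier: masking a photo until only a thin outer border of background pixels remains, and collapsing a photo to just three numbers (the mean brightness of each color channel, the limiting case of heavy blurring). The sketch below is an illustration of what such operations could look like, not the author's actual pipeline; it assumes NumPy and Pillow are available, and the function names and file names are hypothetical.

```python
# Illustrative sketch only -- not the paper's actual code.
# (1) keep_border_only: black out the image interior, retaining a thin
#     border of background pixels.
# (2) channel_brightness: reduce an image to three per-channel means.

import numpy as np
from PIL import Image


def keep_border_only(img: Image.Image, border: int = 10) -> Image.Image:
    """Return a copy where only a thin outer border of pixels survives."""
    arr = np.array(img.convert("RGB"))
    h, w, _ = arr.shape
    masked = np.zeros_like(arr)
    masked[:border, :, :] = arr[:border, :, :]          # top strip
    masked[h - border:, :, :] = arr[h - border:, :, :]  # bottom strip
    masked[:, :border, :] = arr[:, :border, :]          # left strip
    masked[:, w - border:, :] = arr[:, w - border:, :]  # right strip
    return Image.fromarray(masked)


def channel_brightness(img: Image.Image) -> np.ndarray:
    """Collapse an image to three numbers: mean R, G, and B brightness."""
    arr = np.array(img.convert("RGB"), dtype=np.float64)
    return arr.reshape(-1, 3).mean(axis=0)  # shape (3,)


if __name__ == "__main__":
    photo = Image.open("example_face.jpg")  # hypothetical input file
    keep_border_only(photo, border=10).save("example_border_only.jpg")
    print("Per-channel brightness:", channel_brightness(photo))
```

In the study's framing, features this coarse (a border of background pixels, or three channel means) still classified sexual orientation above chance, which is why the author argues that masking and blurring are not reliable do-it-yourself privacy protections.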

