Oxford Uehiro Centre for Practical Ethics, University of Oxford, Oxford, United Kingdom.
Monash Centre for Health Research and Implementation, Monash University, Clayton, Victoria, Australia.
BMJ Health Care Inform. 2020 Oct;27(3). doi: 10.1136/bmjhci-2020-100175.
Suicide poses a significant health burden worldwide. In many cases, people at risk of suicide do not engage with their doctor or community owing to concerns about stigmatisation and forced medical treatment; worse still, people with mental illness (who form a majority of those who die from suicide) may have poor insight into their mental state and not self-identify as being at risk. These issues are exacerbated by the fact that doctors have difficulty identifying those at risk of suicide when they do present to medical services. Advances in artificial intelligence (AI) present opportunities for the development of novel tools for predicting suicide. We searched Google Scholar and PubMed for articles relating to suicide prediction using artificial intelligence from 2017 onwards. This paper presents a qualitative narrative review of research focusing on two categories of suicide prediction tools: medical suicide prediction and social suicide prediction. Initial evidence is promising: AI-driven suicide prediction could improve our capacity to identify those at risk of suicide and, potentially, save lives. Medical suicide prediction may be relatively uncontroversial when it respects ethical and legal principles; however, further research is required to determine the validity of these tools in different contexts. Social suicide prediction offers an exciting opportunity to help identify suicide risk among those who do not engage with traditional health services. Yet efforts by private companies such as Facebook to use online data for suicide prediction should be the subject of independent review and oversight to confirm safety, effectiveness and ethical permissibility.