Automated assessment of the quality of depression websites.
Author information
Griffiths Kathleen M, Tang Thanh Tin, Hawking David, Christensen Helen
Affiliation
Depression & Anxiety Consumer Research Unit, Centre for Mental Health Research, The Australian National University, Canberra, Australia.
Publication information
J Med Internet Res. 2005 Dec 30;7(5):e59. doi: 10.2196/jmir.7.5.e59.
BACKGROUND
Since health information on the World Wide Web is of variable quality, methods are needed to assist consumers to identify health websites containing evidence-based information. Manual assessment tools may assist consumers to evaluate the quality of sites. However, these tools are poorly validated and often impractical. There is a need to develop better consumer tools, and in particular to explore the potential of automated procedures for evaluating the quality of health information on the web.
OBJECTIVE
This study (1) describes the development of an automated quality assessment procedure (AQA) designed to automatically rank depression websites according to their evidence-based quality; (2) evaluates the validity of the AQA relative to human rated evidence-based quality scores; and (3) compares the validity of Google PageRank and the AQA as indicators of evidence-based quality.
METHOD
The AQA was developed using a quality feedback technique and a set of training websites previously rated manually according to their concordance with statements in the Oxford University Centre for Evidence-Based Mental Health's guidelines for treating depression. The validation phase involved 30 websites compiled from the DMOZ, Yahoo! and LookSmart Depression Directories by randomly selecting six sites from each of the Google PageRank bands of 0, 1-2, 3-4, 5-6 and 7-8. Evidence-based ratings from two independent raters (based on concordance with the Oxford guidelines) were then compared with scores derived from the automated AQA and Google algorithms. There was no overlap in the websites used in the training and validation phases of the study.
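The validation sample above is a stratified random draw: six sites from each of five PageRank bands, giving 30 sites in all. A minimal sketch of that sampling scheme, using hypothetical placeholder site lists rather than the study's actual DMOZ/Yahoo!/LookSmart directory entries:

```python
import random

# Placeholder candidate pools per Google PageRank band (hypothetical names;
# the study drew from the DMOZ, Yahoo! and LookSmart Depression Directories).
PAGERANK_BANDS = {
    "0":   [f"band0_site{i}" for i in range(20)],
    "1-2": [f"band12_site{i}" for i in range(20)],
    "3-4": [f"band34_site{i}" for i in range(20)],
    "5-6": [f"band56_site{i}" for i in range(20)],
    "7-8": [f"band78_site{i}" for i in range(20)],
}

def stratified_sample(bands, per_band=6, seed=0):
    """Draw `per_band` sites without replacement from each PageRank band."""
    rng = random.Random(seed)
    return {band: rng.sample(sites, per_band) for band, sites in bands.items()}

validation_set = stratified_sample(PAGERANK_BANDS)
```

Sampling without replacement within each band keeps the validation set balanced across the PageRank spectrum, so low- and high-ranked sites are equally represented.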
RESULTS
The correlation between the AQA score and the evidence-based ratings was high and significant (r=0.85, P<.001). Addition of a quadratic component improved the fit, the combined linear and quadratic model explaining 82% of the variance. The correlation between Google PageRank and the evidence-based score was lower than that for the AQA. When sites with zero PageRanks were included, the association was weak and non-significant (r=0.23, P=.22). When sites with zero PageRanks were excluded, the correlation was moderate (r=0.61, P=.002).
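The analysis above rests on two standard statistics: a Pearson correlation between automated scores and human evidence-based ratings, and the R² of a combined linear-plus-quadratic regression. A self-contained sketch of both computations (the function names and data here are illustrative, not the study's):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two score vectors."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.corrcoef(x, y)[0, 1])

def quadratic_r2(x, y):
    """R^2 of a least-squares fit y ~ b0 + b1*x + b2*x^2
    (the 'combined linear and quadratic model')."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    X = np.column_stack([np.ones_like(x), x, x * x])  # design matrix
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta
    ss_res = float(residuals @ residuals)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot
```

With the study's data, `pearson_r(aqa_scores, evidence_ratings)` would correspond to the reported r=0.85, and `quadratic_r2` to the 82% of variance explained; comparing `pearson_r(pagerank, evidence_ratings)` with and without the zero-PageRank sites reproduces the contrast between the weak and moderate correlations.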
CONCLUSIONS
Depression websites of different evidence-based quality can be differentiated using an automated system. If replicable, generalizable to other health conditions and deployed in a consumer-friendly form, the automated procedure described here could represent an important advance for consumers of Internet medical information.