Sports Surgery Clinic, Dublin, Ireland.
Royal College of Surgeons in Ireland.
Ann R Coll Surg Engl. 2023 May;105(5):394-399. doi: 10.1308/rcsann.2022.0024. Epub 2022 May 31.
The purpose of this study was to evaluate the quality and readability of information online for patients searching terms related to arthroscopic Bankart repair (ABR).
Google, Yahoo and Bing were searched with terms related to ABR. The quality of information was assessed using the Journal of the American Medical Association (JAMA) Benchmark criteria, the DISCERN score, and the Flesch-Kincaid reading ease and grade level. The presence of the HONcode marker was noted. Additionally, we used a scoring system specific to content relating to ABR (AB score), a 1-20 Likert scale. Websites were also categorised according to their source: academic institution, physician, allied healthcare, commercial, media or social media. Statistical analysis was performed using GraphPad Prism.
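The study does not publish its scoring pipeline, and the authors most likely used dedicated readability software. As an illustrative sketch only, the two Flesch-Kincaid metrics named above can be computed from the standard published formulas; the syllable counter here is a naive vowel-group heuristic, not the dictionary-based approach real tools use:

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count groups of consecutive vowels, dropping a
    # trailing silent "e". Real readability tools use pronunciation
    # dictionaries, so counts can differ for irregular words.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_scores(text: str) -> tuple[float, float]:
    """Return (reading ease, grade level) for a passage of text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)   # average words per sentence
    spw = syllables / len(words)        # average syllables per word
    # Standard Flesch Reading Ease and Flesch-Kincaid Grade Level formulas
    reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade_level = 0.39 * wps + 11.8 * spw - 15.59
    return reading_ease, grade_level
```

Lower reading-ease scores and higher grade levels indicate harder text; the averages reported below (reading ease 50.9, grade level 8) correspond to "fairly difficult" prose at roughly an eighth-grade level, above the sixth-grade level commonly recommended for patient materials.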
Ninety-six unique websites were evaluated; the most common categories were physician websites (52) and academic institution websites (24). Nine websites carried the HONcode marker. The average JAMA Benchmark score was 1.95 (range 1-4) and the average DISCERN score was 48.8 (range 20-78), with an average Flesch-Kincaid reading ease of 50.9 (range 11-96) and grade level of 8 (range 1-18). The average AB score was 5.9 (range 0-18); it correlated strongly with the DISCERN score (r = 0.57) but not with the JAMA score (r = 0.18) or Flesch-Kincaid grade level (r = 0.16). Websites with the HONcode marker did not score higher on any criterion than those without it (p > 0.05). The quality of information on physician websites was higher than on non-physician websites, although the difference was not statistically significant; however, the readability of physician websites was significantly poorer.
There was wide variability in the quality and readability of online information on ABR, and the AB scoring system was shown to correlate strongly with higher-quality content.