

Evaluating the Readability, Quality, and Reliability of Online Patient Education Materials on Spinal Cord Stimulation.

Affiliations

Faculty of Medicine and University Hospital, Department of Stereotactic and Functional Neurosurgery, Cologne, Germany.

Publication Information

Turk Neurosurg. 2024;34(4):588-599. doi: 10.5137/1019-5149.JTN.42973-22.3.

DOI: 10.5137/1019-5149.JTN.42973-22.3
PMID: 38874237
Abstract

AIM

Internet use to obtain health-related information is increasing rapidly. However, there are concerns about the comprehensibility and reliability of health-related information accessed on the internet. The aim of this research was to investigate the reliability, quality, and readability of patient education materials (PEMs) about spinal cord stimulation (SCS) on the internet.

MATERIAL AND METHODS

A total of 114 websites suitable for the study were identified after a search on Google for the term "spinal cord stimulation." Gunning Fog (GFOG), Flesch-Kincaid Grade Level (FKGL), Flesch Reading Ease Score (FRES), and Simple Measure of Gobbledygook (SMOG) were used to determine the readability of sites. The credibility of the websites was assessed using the Journal of the American Medical Association (JAMA) score. Quality was assessed using the global quality score (GQS), the DISCERN score, and the Health on the Net Foundation code of conduct (HONcode).
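The four readability indexes used in the study are simple formulas over word, sentence, and syllable counts. A minimal sketch of how they can be computed is shown below; note that the naive vowel-group syllable counter here is an assumption for illustration, not the dictionary-based counting that published readability tools use, so scores will differ slightly from theirs.

```python
import math
import re

def _count_syllables(word: str) -> int:
    # Naive heuristic: count groups of consecutive vowels (incl. y).
    # Real tools use pronunciation dictionaries; this is an approximation.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> dict:
    # Sentence and word counts from simple regexes.
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    n_words = max(1, len(words))
    syllables = sum(_count_syllables(w) for w in words)
    # "Complex" / polysyllabic words: 3+ syllables (used by SMOG and GFOG).
    polysyllables = sum(1 for w in words if _count_syllables(w) >= 3)
    wps = n_words / sentences   # average words per sentence
    spw = syllables / n_words   # average syllables per word
    return {
        # Flesch Reading Ease Score: higher = easier (90-100 very easy).
        "FRES": 206.835 - 1.015 * wps - 84.6 * spw,
        # Flesch-Kincaid Grade Level: U.S. school grade needed to read the text.
        "FKGL": 0.39 * wps + 11.8 * spw - 15.59,
        # Simple Measure of Gobbledygook, normalized to a 30-sentence sample.
        "SMOG": 1.0430 * math.sqrt(polysyllables * (30 / sentences)) + 3.1291,
        # Gunning Fog index.
        "GFOG": 0.4 * (wps + 100 * polysyllables / n_words),
    }
```

For example, a short, simple sentence such as "The cat sat on the mat." yields a high FRES and a grade level well below 6, while dense medical prose scores in the "very difficult" range reported in the study.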

RESULTS

In the evaluated text sections, the mean SMOG and FKGL were 10.92 ± 1.61 and 11.62 ± 2.11 grade levels, respectively, and the mean FRES and GFOG were 45.32 ± 10.71 and 14.62 ± 2.24 (both "very difficult"), respectively. Of all the websites, 10.5% were of high quality, 13.2% were of high reliability, and only 6.1% displayed a HONcode. A significant difference was found between website typologies in reliability and quality scores (p < 0.05).

CONCLUSION

The internet-based PEMs about SCS were found to have a readability level exceeding the Grade 6 level recommended by the National Institutes of Health; moreover, the materials demonstrated low reliability and poor quality. Websites addressing SCS, a specific neuromodulation technique among the various interventional strategies for chronic pain management, should maintain readability standards in line with established indexes and provide reliable content tailored to the general public's educational level.

Similar Articles

1
Evaluating the Readability, Quality, and Reliability of Online Patient Education Materials on Spinal Cord Stimulation.
Turk Neurosurg. 2024;34(4):588-599. doi: 10.5137/1019-5149.JTN.42973-22.3.
2
Assessing parental comprehension of online resources on childhood pain.
Medicine (Baltimore). 2024 Jun 21;103(25):e38569. doi: 10.1097/MD.0000000000038569.
3
Evaluating the readability, quality and reliability of online patient education materials on post-covid pain.
PeerJ. 2022 Jul 20;10:e13686. doi: 10.7717/peerj.13686. eCollection 2022.
4
Evaluating the readability, quality and reliability of online patient education materials on transcutaneuous electrical nerve stimulation (TENS).
Medicine (Baltimore). 2023 Apr 21;102(16):e33529. doi: 10.1097/MD.0000000000033529.
5
Evaluating the readability, quality and reliability of online patient education materials on chronic low back pain.
Natl Med J India. 2024 May-Jun;37(3):124-130. doi: 10.25259/NMJI_327_2022.
6
Evaluating the readability, quality and reliability of online information on Behçet's disease.
Reumatismo. 2022 Sep 13;74(2). doi: 10.4081/reumatismo.2022.1495.
7
How readable and quality are online patient education materials about Helicobacter pylori?: Assessment of the readability, quality and reliability.
Medicine (Baltimore). 2023 Oct 27;102(43):e35543. doi: 10.1097/MD.0000000000035543.
8
Assessment of online patient education materials from major ophthalmologic associations.
JAMA Ophthalmol. 2015 Apr;133(4):449-54. doi: 10.1001/jamaophthalmol.2014.6104.
9
Readability assessment of online tracheostomy care resources.
Otolaryngol Head Neck Surg. 2015 Feb;152(2):272-8. doi: 10.1177/0194599814560338. Epub 2014 Dec 1.
10
Quality, Reliability, Readability, and Accountability of Online Information on Leukocoria.
J Pediatr Ophthalmol Strabismus. 2024 Sep-Oct;61(5):332-338. doi: 10.3928/01913913-20240425-02. Epub 2024 May 30.

Cited By

1
Evaluating the readability, quality, and reliability of responses generated by ChatGPT, Gemini, and Perplexity on the most commonly asked questions about Ankylosing spondylitis.
PLoS One. 2025 Jun 18;20(6):e0326351. doi: 10.1371/journal.pone.0326351. eCollection 2025.
2
How Successful Is AI in Developing Postsurgical Wound Care Education Material?
Wound Repair Regen. 2025 May-Jun;33(3):e70041. doi: 10.1111/wrr.70041.
3
Assessing the readability, quality and reliability of responses produced by ChatGPT, Gemini, and Perplexity regarding most frequently asked keywords about low back pain.
PeerJ. 2025 Jan 22;13:e18847. doi: 10.7717/peerj.18847. eCollection 2025.