

A comparison of answer retrieval through four evidence-based textbooks (ACP PIER, Essential Evidence Plus, First Consult, and UpToDate): a randomized controlled trial.

Affiliation

Tehran University of Medical Sciences, Iran.

Publication

Med Teach. 2011;33(9):724-30. doi: 10.3109/0142159X.2010.531155.

DOI: 10.3109/0142159X.2010.531155
PMID: 21854150
Abstract

BACKGROUND

The efficacy of bedside information products has not been properly evaluated, particularly in developing countries.

AIM

To compare four evidence-based textbooks in terms of the efficacy of their use by clinical residents, measured by the proportion of questions for which relevant answers could be obtained within 20 min, the time to reach the answer, and user satisfaction.

METHODS

One hundred and twelve residents were taught the basics of information mastery and were randomly allocated to four groups to use: (1) ACP PIER, (2) Essential Evidence Plus (formerly InfoRetriever), (3) First Consult, or (4) UpToDate. Each participant received 3 questions, randomly drawn from a pool of 24, and retrieved the answers from the assigned textbook. Retrieved answers and times-to-answer were recorded by specially designed software, and the researchers determined whether each recorded answer was relevant.

RESULTS

The rate of answer retrieval was 86% in UpToDate, 69% in First Consult, 49% in ACP PIER, and 45% in Essential Evidence Plus (p < 0.001). The mean time-to-answer was 14.6 min using UpToDate, 15.9 min using First Consult, 16.3 min using Essential Evidence Plus, and 17.3 min using ACP PIER (p < 0.001).
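The reported difference in retrieval rates can be sanity-checked with a quick chi-square computation. Note that everything beyond the published percentages is an assumption: the per-arm question count (28 residents × 3 questions under equal allocation) and the rounded answer counts are reconstructions for illustration, not data from the paper.

```python
# Hedged sketch: the paper reports only percentages and p < 0.001.
# Assumption: equal allocation (112 residents / 4 arms = 28 residents,
# x 3 questions each = 84 questions per arm); counts are rounded back
# from the reported rates. Pearson's chi-square is computed by hand.

questions_per_arm = 84  # assumed: 28 residents x 3 questions
rates = {"UpToDate": 0.86, "First Consult": 0.69,
         "ACP PIER": 0.49, "Essential Evidence Plus": 0.45}

# Observed 2x4 table: answered vs. not answered, per textbook (approximate).
answered = [round(r * questions_per_arm) for r in rates.values()]
missed = [questions_per_arm - a for a in answered]

def chi_square(rows):
    """Pearson chi-square statistic for a contingency table."""
    col_totals = [sum(col) for col in zip(*rows)]
    row_totals = [sum(row) for row in rows]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(rows):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    return stat

stat = chi_square([answered, missed])
# The critical value for df = 3 at alpha = 0.001 is about 16.27; the
# statistic far exceeds it, consistent with the reported p < 0.001.
print(round(stat, 1))
```

Under these assumptions the statistic lands near 38, well past the 0.001 critical value, so the reported significance is plausible even with rounding error in the reconstructed counts.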

CONCLUSION

UpToDate seems more comprehensive in content and also faster than the other three evidence-based textbooks. Thus, it may be considered one of the best sources for answering clinicians' questions at the point of care.


Similar Articles

1. A comparison of answer retrieval through four evidence-based textbooks (ACP PIER, Essential Evidence Plus, First Consult, and UpToDate): a randomized controlled trial.
   Med Teach. 2011;33(9):724-30. doi: 10.3109/0142159X.2010.531155.
2. Answering questions at the point of care: do residents practice EBM or manage information sources?
   Acad Med. 2007 Mar;82(3):298-303. doi: 10.1097/ACM.0b013e3180307fed.
3. Evaluation of e-textbooks. DynaMed, MD Consult and UpToDate.
   Aust Fam Physician. 2008 Oct;37(10):878-82.
4. To compare PubMed Clinical Queries and UpToDate in teaching information mastery to clinical residents: a crossover randomized controlled trial.
   PLoS One. 2011;6(8):e23487. doi: 10.1371/journal.pone.0023487. Epub 2011 Aug 12.
5. Speed, accuracy, and confidence in Google, Ovid, PubMed, and UpToDate: results of a randomised trial.
   Postgrad Med J. 2010 Aug;86(1018):459-65. doi: 10.1136/pgmj.2010.098053.
6. PIER--evidence-based medicine from ACP.
   Med Ref Serv Q. 2004 Fall;23(3):39-48. doi: 10.1300/J115v23n03_05.
7. Randomized trial for answers to clinical questions: evaluating a pre-appraised versus a MEDLINE search protocol.
   J Med Libr Assoc. 2006 Oct;94(4):382-7.
8. Using the World Wide Web to answer clinical questions: how efficient are different methods of information retrieval?
   J Fam Pract. 1999 Jul;48(7):520-4.
9. Utility of the electronic information resource UpToDate for clinical decision-making at bedside rounds.
   Singapore Med J. 2012 Feb;53(2):116-20.
10. An evaluation of five bedside information products using a user-centered, task-oriented approach.
   J Med Libr Assoc. 2006 Oct;94(4):435-41, e206-7.

Cited By

1. Accuracy and Safety of ChatGPT-3.5 in Assessing Over-the-Counter Medication Use During Pregnancy: A Descriptive Comparative Study.
   Pharmacy (Basel). 2025 Jul 30;13(4):104. doi: 10.3390/pharmacy13040104.
2. ChatGPT vs UpToDate: comparative study of usefulness and reliability of Chatbot in common clinical presentations of otorhinolaryngology-head and neck surgery.
   Eur Arch Otorhinolaryngol. 2024 Apr;281(4):2145-2151. doi: 10.1007/s00405-023-08423-w. Epub 2024 Jan 13.
3. Clinical questions in primary care: Where to find the answers - a cross-sectional study.
   PLoS One. 2022 Nov 11;17(11):e0277462. doi: 10.1371/journal.pone.0277462. eCollection 2022.
4. Utilization and uptake of the UpToDate clinical decision support tool at the Makerere University College of Health Sciences (MakCHS), Uganda.
   Afr Health Sci. 2021 Jun;21(2):904-911. doi: 10.4314/ahs.v21i2.52.
5. Comparison of the Impact of Wikipedia, UpToDate, and a Digital Textbook on Short-Term Knowledge Acquisition Among Medical Students: Randomized Controlled Trial of Three Web-Based Resources.
   JMIR Med Educ. 2017 Oct 31;3(2):e20. doi: 10.2196/mededu.8188.
6. Breadth of Coverage, Ease of Use, and Quality of Mobile Point-of-Care Tool Information Summaries: An Evaluation.
   JMIR Mhealth Uhealth. 2016 Oct 12;4(4):e117. doi: 10.2196/mhealth.6189.
7. Meeting physicians' needs: a bottom-up approach for improving the implementation of medical knowledge into practice.
   Health Res Policy Syst. 2016 Jul 18;14(1):49. doi: 10.1186/s12961-016-0120-5.
8. Evaluating the appropriateness of electronic information resources for learning.
   J Med Libr Assoc. 2016 Jan;104(1):24-32. doi: 10.3163/1536-5050.104.1.004.
9. Analysis of PubMed User Sessions Using a Full-Day PubMed Query Log: A Comparison of Experienced and Nonexperienced PubMed Users.
   JMIR Med Inform. 2015 Jul 2;3(3):e25. doi: 10.2196/medinform.3740.
10. Speed and accuracy of a point of care web-based knowledge resource for clinicians: a controlled crossover trial.
   Interact J Med Res. 2014 Feb 21;3(1):e7. doi: 10.2196/ijmr.2811.