

Abstract analysis method facilitates filtering low-methodological quality and high-bias risk systematic reviews on psoriasis interventions.

Author Information

Department of Dermatology, Reina Sofía University Hospital, Menendez Pidal Ave, Córdoba, 14005, Spain.

IMIBIC/Reina Sofía University Hospital/University of Córdoba, Menendez Pidal Ave, Córdoba, 14005, Spain.

Publication Information

BMC Med Res Methodol. 2017 Dec 29;17(1):180. doi: 10.1186/s12874-017-0460-z.

Abstract

BACKGROUND

Article summaries' information and structure may influence researchers/clinicians' decisions to conduct deeper full-text analyses. Specifically, abstracts of systematic reviews (SRs) and meta-analyses (MA) should provide structured summaries for quick assessment. This study explored a method for determining the methodological quality and bias risk of full-text reviews using abstract information alone.

METHODS

Systematic literature searches for SRs and/or MA about psoriasis were undertaken on MEDLINE, EMBASE, and the Cochrane database. For each review, quality, abstract-reporting completeness, full-text methodological quality, and bias risk were evaluated using Preferred Reporting Items for Systematic Reviews and Meta-analyses for abstracts (PRISMA-A), Assessing the Methodological Quality of Systematic Reviews (AMSTAR), and ROBIS tools, respectively. Article-, author-, and journal-derived metadata were systematically extracted from eligible studies using a piloted template, and explanatory variables concerning abstract-reporting quality were assessed using univariate and multivariate regression models. Two classification models concerning SRs' methodological quality and bias risk were developed based on per-item and total PRISMA-A scores and decision-tree algorithms. This work was supported, in part, by project ICI1400136 (JR). No funding was received from any pharmaceutical company.

RESULTS

This study analysed 139 SRs on psoriasis interventions. On average, they reported 56.7% of PRISMA-A items. The mean total PRISMA-A score was significantly higher for high-methodological-quality SRs than for moderate- and low-methodological-quality reviews. SRs with low bias risk showed higher total PRISMA-A values than reviews with high bias risk. In the final model, only 'authors per review > 6' (OR: 1.098; 95% CI: 1.012-1.194), 'academic source of funding' (OR: 3.630; 95% CI: 1.788-7.542), and 'PRISMA-endorsed journal' (OR: 4.370; 95% CI: 1.785-10.98) predicted PRISMA-A variability. Reviews with a total PRISMA-A score < 6, lacking identification as an SR or MA in the title, and lacking an explanation of the bias risk assessment methods were classified as low methodological quality. Abstracts with a total PRISMA-A score ≥ 9 that reported main outcome results and explained the bias risk assessment method were classified as having low bias risk.
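The two decision rules reported above can be expressed directly as threshold checks. The sketch below is illustrative only, not the authors' code: the function and parameter names are hypothetical, and combining the conditions with a logical AND is an interpretation of how the abstract phrases each rule (decision-tree paths are conjunctions of splits).

```python
# Illustrative sketch of the two classification rules stated in the Results.
# Field names are hypothetical; the AND-combination of conditions is assumed.

def classify_methodological_quality(total_prisma_a: int,
                                    sr_ma_in_title: bool,
                                    bias_methods_explained: bool) -> str:
    """Flag a review as low methodological quality per the reported rule:
    total PRISMA-A score < 6, no SR/MA label in the title, and no
    explanation of the bias risk assessment methods."""
    if total_prisma_a < 6 and not sr_ma_in_title and not bias_methods_explained:
        return "low"
    return "not classified as low"

def classify_bias_risk(total_prisma_a: int,
                       main_outcomes_reported: bool,
                       bias_methods_explained: bool) -> str:
    """Flag a review as low bias risk per the reported rule:
    total PRISMA-A score >= 9, main outcome results reported, and
    the bias risk assessment method explained."""
    if total_prisma_a >= 9 and main_outcomes_reported and bias_methods_explained:
        return "low"
    return "not classified as low"

# Example: an abstract scoring 5 PRISMA-A points, with no SR/MA label in the
# title and no bias-assessment explanation, is flagged as low quality.
print(classify_methodological_quality(5, False, False))  # -> low
print(classify_bias_risk(10, True, True))                # -> low
```

Note that the two rules are not complements: an abstract scoring between 6 and 8 points satisfies neither rule and is left unclassified by this sketch.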

CONCLUSIONS

The methodological quality and bias risk of SRs may be estimated by analysing the quality and completeness of their abstracts. Our proposal aims to make evaluating synthesized evidence easier for clinical professionals without methodological training. External validation is necessary.


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3606/5747101/8458ced3e565/12874_2017_460_Fig1_HTML.jpg
