Gaies Michael, Olive Mary K, Owens Gabe E, Charpie John R, Zhang Wenying, Pasquali Sara K, Klugman Darren, Costello John M, Schwartz Steven M, Banerjee Mousumi
Heart Institute, Cincinnati Children's Hospital Medical Center, OH (M.G.).
Department of Pediatrics, University of Michigan Medical School, Ann Arbor, MI (M.K.O., G.E.O., J.R.C., S.K.P.).
Circ Cardiovasc Qual Outcomes. 2023 Feb;16(2):e009277. doi: 10.1161/CIRCOUTCOMES.122.009277. Epub 2023 Feb 2.
Hospitals increasingly implement clinical informatics tools to improve quality of care, necessitating rigorous approaches to evaluating their effectiveness. We leveraged a multi-institutional data repository and applied causal inference methods to assess the implementation of commercial data visualization software in our pediatric cardiac intensive care unit.
We conducted a natural experiment in the University of Michigan (UM) Cardiac Intensive Care Unit pre- and postimplementation of data visualization software, analyzed within the Pediatric Cardiac Critical Care Consortium clinical registry; we identified N=21 control hospitals that contributed contemporaneous registry data during the study period. We used the platform during multiple daily rounds to visualize clinical data trends. We evaluated the outcomes most likely to be affected by this change: case-mix-adjusted postoperative mortality, cardiac arrest and unplanned readmission rates, and postoperative length of stay. There were no quality improvement initiatives focused specifically on these outcomes, nor any organizational changes at UM, in either era. We performed a difference-in-differences analysis to compare changes in UM outcomes with those at control hospitals across the pre- versus postimplementation eras.
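The difference-in-differences design described above compares the pre-to-post change at the treated site with the change at controls over the same period, netting out shared secular trends. A minimal sketch of the estimator follows; the event rates below are hypothetical placeholders for illustration only, not the study's data, and the full analysis would additionally involve case-mix adjustment and significance testing.

```python
# Minimal sketch of a difference-in-differences (DiD) estimator.
# All rates below are hypothetical placeholders, not study data.

def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """DiD = (treated post-minus-pre change) minus (control post-minus-pre change)."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical event rates per 100 admissions
um_pre, um_post = 5.0, 4.0        # treated unit, before/after implementation
ctrl_pre, ctrl_post = 5.0, 4.8    # control hospitals, same eras

effect = did_estimate(um_pre, um_post, ctrl_pre, ctrl_post)
print(effect)  # -0.8: improvement at the treated site beyond the control trend
```

A negative estimate here means the outcome fell at the treated site by more than the contemporaneous change at controls, which is the pattern the study tests for each outcome.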
We compared 1436 pre- versus 779 postimplementation admissions at UM with 19 854 (pre) versus 14 160 (post) at controls. Admission characteristics were similar between eras. Postimplementation at UM, we observed relative reductions in cardiac arrests among medical admissions, unplanned readmissions, and postoperative length of stay of 14%, 41%, and 18%, respectively. The difference-in-differences estimate for each outcome was statistically significant (P<0.05), indicating that the pre- versus postimplementation change in outcomes at UM differed significantly from that at control hospitals over the same period.
Clinical registries provide an opportunity to rigorously evaluate the implementation of new informatics tools at single institutions. By borrowing strength from multi-institutional data and drawing on causal inference methods, our analysis strengthened confidence in the effectiveness of this software across our institution.