Boehm Thomas, Handgraetinger Oliver, Link Juergen, Ploner Ricardo, Voellmy Daniel R, Marincek Borut, Wildermuth Simon
Department of Medical Radiology, Institute of Diagnostic Radiology, University Hospital Zurich, Raemistrasse 100, 8091 Zurich, Switzerland.
Eur Radiol. 2004 May;14(5):908-14. doi: 10.1007/s00330-003-2205-0. Epub 2004 Feb 4.
The methodology and outcome of a hands-on workshop for the evaluation of PACS (picture archiving and communication system) software for a multihospital PACS project are described. The following radiological workstations and web-browser-based image distribution software clients were evaluated as part of a multistep evaluation of PACS vendors in March 2001: Impax DS 3000 V 4.1/Impax Web1000 (Agfa-Gevaert, Mortsel, Belgium); PathSpeed V 8.0/PathSpeed Web (GE Medical Systems, Milwaukee, Wis., USA); ID Report/ID Web (Image Devices, Idstein, Germany); EasyVision DX/EasyWeb (Philips Medical Systems, Eindhoven, Netherlands); and MagicView 1000 VB33a/MagicWeb (Siemens Medical Systems, Erlangen, Germany). A set of anonymized DICOM test data was provided to enable direct image comparison. Radiologists (n=44) evaluated the radiological workstations and nonradiologists (n=53) evaluated the image distribution software clients using different questionnaires. One vendor was not able to import the provided DICOM data set. Another vendor had problems displaying imported cross-sectional studies in the correct stack order. Three vendors (Agfa-Gevaert, GE, Philips) presented server-client solutions with web access; two (Siemens, Image Devices) presented stand-alone solutions. The highest scores in the class of radiological workstations were achieved by ID Report from Image Devices (p<0.005). In the class of image distribution clients, the differences were not statistically significant. Questionnaire-based evaluation was shown to be useful for guaranteeing systematic assessment. The workshop was a great success in raising interest in the PACS project among a large group of future clinical users. The methodology used in the present study may be useful for other hospitals evaluating PACS.