Department of Radiology, Washington University School of Medicine, St. Louis, MO 63110, USA.
Neuroimage. 2013 Oct 15;80:202-19. doi: 10.1016/j.neuroimage.2013.05.077. Epub 2013 May 24.
The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study.