Tritt Andrew J, Rübel Oliver, Dichter Benjamin, Ly Ryan, Kang Donghe, Chang Edward F, Frank Loren M, Bouchard Kristofer
Computational Research Division, Lawrence Berkeley National Laboratory, Berkeley, CA, USA.
Biological Systems and Engineering, Lawrence Berkeley National Laboratory, Berkeley, CA, USA.
Proc IEEE Int Conf Big Data. 2019 Dec;2019:165-179. doi: 10.1109/bigdata47090.2019.9005648. Epub 2020 Feb 24.
A ubiquitous problem in aggregating data across different experimental and observational data sources is the lack of software infrastructure that enables flexible and extensible standardization of data and metadata. To address this challenge, we developed HDMF, a hierarchical data modeling framework for modern science data standards. With HDMF, we separate the process of data standardization into three main components: (1) data modeling and specification, (2) data I/O and storage, and (3) data interaction and data APIs. To enable standards to support the complex requirements and varying use cases throughout the data life cycle, HDMF provides object mapping infrastructure to insulate and integrate these various components. This approach supports the flexible development of data standards and extensions, optimized storage backends, and data APIs, while allowing the other components of the data standards ecosystem to remain stable. To meet the demands of modern, large-scale science data, HDMF provides advanced data I/O functionality for iterative data write, lazy data load, and parallel I/O. It also enables optimization of data storage through chunking, compression, linking, and modular data storage. We demonstrate the application of HDMF in practice in the design of NWB 2.0 [13], a modern data standard for collaborative science across the neurophysiology community.
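As a sketch of how the first component (data modeling and specification) looks in practice, the example below uses the specification classes of the hdmf Python package (GroupSpec, DatasetSpec, AttributeSpec, NamespaceBuilder) to define a new data type and bundle it into a versioned namespace. The type and namespace names ("LabProbe", "demo-lab") are illustrative, not part of HDMF or NWB.

from hdmf.spec import AttributeSpec, DatasetSpec, GroupSpec, NamespaceBuilder

# Define a group type holding one float dataset with a unit attribute.
probe_spec = GroupSpec(
    doc='Metadata about a recording probe',
    data_type_def='LabProbe',  # illustrative type name
    datasets=[
        DatasetSpec(
            doc='Impedance of each channel',
            name='impedance',
            dtype='float64',
            shape=[None],
            attributes=[
                AttributeSpec(name='unit', doc='Measurement unit', dtype='text'),
            ],
        ),
    ],
)

# Package the type into a versioned namespace, which data APIs and
# storage backends can resolve independently of one another.
ns_builder = NamespaceBuilder(doc='Example lab extension', name='demo-lab', version='0.1.0')
ns_builder.add_spec('demo-lab.extensions.yaml', probe_spec)
ns_builder.export('demo-lab.namespace.yaml')

Because the namespace carries its own name, version, and documentation, an extension defined this way can evolve without requiring changes to the I/O backends or data APIs that consume it.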
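The storage-optimization and lazy-load features can likewise be sketched with hdmf's common data types and its HDF5 backend. This sketch assumes the H5DataIO wrapper for per-dataset chunking and compression options and get_manager() from hdmf.common; file and column names are illustrative. Iterative write is handled analogously, by wrapping a data generator in hdmf's DataChunkIterator before handing it to a container.

import numpy as np

from hdmf.backends.hdf5 import H5DataIO, HDF5IO
from hdmf.common import DynamicTable, VectorData, get_manager

# Wrap the raw array to request chunked, gzip-compressed HDF5 storage;
# the container and data API are unchanged by these backend options.
voltage = H5DataIO(data=np.random.rand(10000), compression='gzip', chunks=True)

table = DynamicTable(
    name='recordings',
    description='example table with a compressed column',
    columns=[VectorData(name='voltage', description='sampled voltage', data=voltage)],
)

# Write the container through the HDF5 backend.
with HDF5IO('demo.h5', manager=get_manager(), mode='w') as io:
    io.write(table)

# Lazy read: io.read() returns the container hierarchy, but the column
# data stays on disk until it is sliced.
io = HDF5IO('demo.h5', manager=get_manager(), mode='r')
table_in = io.read()
first_rows = table_in['voltage'].data[:100]  # loads only these 100 values
io.close()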