Pawlowski Gabriela M, Ghosh-Hajra Sujoy, Fickling Shaun D, Liu Careesa C, Song Xiaowei, Robinovitch Stephen, Doesburg Sam M, D'Arcy Ryan C N
NeuroTech Laboratory, Faculty of Applied Sciences, Simon Fraser University, Burnaby, BC, Canada.
Biomedical Physiology and Kinesiology, Faculty of Science, Simon Fraser University, Burnaby, BC, Canada.
Front Neurosci. 2019 Jan 18;12:968. doi: 10.3389/fnins.2018.00968. eCollection 2018.
The critical need for rapid, objective, physiological evaluation of brain function at point-of-care has led to the emergence of brain vital signs: a framework encompassing portable electroencephalography (EEG) and an automated, quick test protocol. This framework enables access to well-established event-related potential (ERP) markers, which are specific to sensory, attention, and cognitive functions in both healthy and patient populations. However, all our applications to date have used auditory stimulation, which has highlighted application challenges in persons with hearing impairments (e.g., in older adults and persons with dementia). Consequently, it has become important to translate brain vital signs into a visual sensory modality. Therefore, the objectives of this study were to: 1) demonstrate the feasibility of visual brain vital signs; and 2) compare and normalize results from visual and auditory brain vital signs. Data were collected from 34 healthy adults (33 ± 13 years) using a 64-channel EEG system. Visual and auditory sequences were kept as comparable as possible to elicit the N100, P300, and N400 responses. Visual brain vital signs were elicited successfully for all three responses across the group (N100: F = 29.8380, p < 0.001; P300: F = 138.8442, p < 0.0001; N400: F = 6.8476, p = 0.01). Initial auditory-visual comparisons across the three components showed that attention processing (P300) was the most transferable across modalities, with no group-level differences and correlated peak amplitudes across individuals (rho = 0.7, p = 0.0001). Auditory P300 latencies were shorter than visual (p < 0.0001), but normalization and correlation (rho = 0.5, p = 0.0033) implied a potential systematic difference across modalities. Reduced auditory N400 amplitudes compared to visual (p = 0.0061), paired with normalization and correlation across individuals (rho = 0.6, p = 0.0012), also revealed potential systematic modality differences between reading- and listening-based language comprehension. This study provides an initial understanding of the relationship between the visual and auditory sequences, while importantly establishing a visual sequence within the brain vital signs framework. With both auditory and visual stimulation capabilities available, it is possible to broaden applications across the lifespan.
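The abstract describes normalizing ERP measures and correlating peak amplitudes across modalities (e.g., rho = 0.7 for the P300 comparison). As a minimal illustrative sketch only, assuming peak amplitudes have already been extracted per participant (the paper's actual analysis pipeline is not given in the abstract, and the variable names and synthetic data below are placeholders), the cross-modality statistics could be computed in Python with SciPy:

```python
import numpy as np
from scipy import stats

# Synthetic per-participant P300 peak amplitudes (µV), one value per modality;
# placeholders only -- the abstract reports the statistics, not the pipeline.
rng = np.random.default_rng(seed=0)
auditory_p300 = rng.normal(loc=5.0, scale=1.5, size=34)       # 34 participants
visual_p300 = 0.8 * auditory_p300 + rng.normal(0.0, 1.0, 34)  # correlated noise

# Rank-based (Spearman) correlation of peak amplitudes across individuals,
# the kind of rho statistic quoted in the abstract.
rho, p_value = stats.spearmanr(auditory_p300, visual_p300)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")

# Z-scoring within each modality is one plausible way to place responses on a
# common scale before auditory-visual comparison.
auditory_z = stats.zscore(auditory_p300)
visual_z = stats.zscore(visual_p300)
```

Note that Spearman's rho is rank-based and therefore unchanged by z-scoring; normalization matters for amplitude- and latency-scale comparisons between modalities, not for the correlation itself.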