Ouchi Tomohiro, Scholl Leo R, Rajeswaran Pavithra, Canfield Ryan A, Smith Lydia I, Orsborn Amy L
University of Washington, Electrical and Computer Engineering, Seattle, 98115, USA.
University of Washington, Bioengineering, Seattle, 98115, USA.
bioRxiv. 2024 Aug 16:2024.08.13.607846. doi: 10.1101/2024.08.13.607846.
Goal-directed reaches give rise to dynamic neural activity across the brain as we move our eyes and arms and process outcomes. High spatiotemporal resolution mapping of multiple cortical areas will improve our understanding of how these neural computations are spatially and temporally distributed across the brain. In this study, we used micro-electrocorticography (μECoG) recordings in two male monkeys performing visually guided reaches to map information related to eye movements, arm movements, and receiving rewards over a 1.37 cm² area of frontal motor cortices (primary motor cortex, premotor cortex, frontal eye field, and dorsolateral prefrontal cortex). Time-frequency and decoding analyses revealed that eye and arm movement information shifts across brain regions during a reach, likely reflecting the transition from planning to execution. We then used phase-based analyses to reveal potential overlaps of eye and arm information. We found that arm movement decoding performance was impacted by task-irrelevant eye movements, consistent with the presence of intermixed eye and arm information across much of motor cortices. Phase-based analyses also identified reward-related activity primarily around the principal sulcus in the prefrontal cortex as well as near the arcuate sulcus in the premotor cortex. Our results demonstrate μECoG's strengths for functional mapping and provide further detail on the spatial distribution of eye, arm, and reward information processing across frontal cortices during reaching. These insights advance our understanding of the overlapping neural computations underlying coordinated movements and reveal opportunities to leverage these signals to enhance future brain-computer interfaces.
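For readers unfamiliar with the kinds of analyses named in the abstract, the following is a minimal, self-contained sketch of a generic time-frequency plus linear-decoding workflow on synthetic single-channel data. It illustrates the general technique only and is not the authors' pipeline; the sampling rate, high-gamma band, synthetic signal, and choice of decoder are all illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): time-frequency features from a
# simulated µECoG channel, then cross-validated linear decoding of a reach label.
# All signal parameters and the synthetic data below are illustrative assumptions.
import numpy as np
from scipy.signal import spectrogram
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 1000                                # assumed sampling rate (Hz)
n_trials, n_samples = 200, 1000
labels = rng.integers(0, 2, n_trials)    # e.g. left vs. right reach target

# Synthetic single-channel trials: broadband noise plus a label-dependent
# high-gamma (~80 Hz) burst, standing in for movement-related activity.
t = np.arange(n_samples) / fs
trials = rng.standard_normal((n_trials, n_samples))
trials += 0.5 * labels[:, None] * np.sin(2 * np.pi * 80 * t)

# Time-frequency decomposition per trial; average high-gamma power over
# frequencies to obtain a feature vector across time bins.
features = []
for x in trials:
    f, tt, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=128)
    hg = Sxx[(f >= 70) & (f <= 150)].mean(axis=0)  # high-gamma band power
    features.append(hg)
features = np.array(features)

# Cross-validated linear decoding of the reach label from the power features.
scores = cross_val_score(LogisticRegression(max_iter=1000), features, labels, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

In a real multi-channel μECoG mapping analysis, this per-channel decoding would typically be repeated across the array to localize where eye, arm, or reward information is strongest; the sketch above only shows the core feature-extraction and decoding step.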