Laboratory of Cognitive Neuroscience, Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology, Geneva 1202, Switzerland, and
Center for Neuroprosthetics, Faculty of Life Sciences, Swiss Federal Institute of Technology, Geneva 1202, Switzerland.
J Neurosci. 2018 Jan 10;38(2):263-277. doi: 10.1523/JNEUROSCI.0322-17.2017. Epub 2017 Sep 15.
Human metacognition, or the capacity to introspect on one's own mental states, has been characterized mostly through confidence reports in visual tasks. A pressing question is to what extent results from visual studies generalize to other domains. Answering this question would determine whether metacognition operates through shared, supramodal mechanisms or through idiosyncratic, modality-specific mechanisms. Here, we report three new lines of evidence for decisional and postdecisional mechanisms that argue for the supramodality of metacognition. First, metacognitive efficiency correlated across auditory, tactile, visual, and audiovisual tasks. Second, confidence in an audiovisual task was best modeled using supramodal formats based on integrated representations of auditory and visual signals. Third, for both visual and audiovisual tasks, confidence in correct responses involved similar electrophysiological markers associated with motor preparation preceding the perceptual judgment. We conclude that the supramodality of metacognition relies on supramodal confidence estimates and on decisional signals that are shared across sensory modalities.

SIGNIFICANCE STATEMENT Metacognitive monitoring is the capacity to access, report, and regulate one's own mental states. In perception, it allows us to rate our confidence in what we have seen, heard, or touched. Although metacognitive monitoring can operate on different cognitive domains, it remains unknown whether it involves a single supramodal mechanism common to multiple cognitive domains or modality-specific mechanisms idiosyncratic to each domain. Here, we provide evidence in favor of the supramodality hypothesis by showing that participants with high metacognitive performance in one modality are likely to perform well in other modalities. Based on computational modeling and electrophysiology, we propose that supramodality can be explained by the existence of supramodal confidence estimates and by the influence of decisional cues on confidence estimates.
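For context, metacognitive efficiency is conventionally quantified in this literature as the ratio of metacognitive sensitivity to perceptual sensitivity within the meta-d' framework of signal detection theory (Maniscalco and Lau); that the cross-task correlation reported above rests on exactly this ratio is an assumption based on common practice rather than something stated in the abstract:

\[ \text{metacognitive efficiency} = \frac{\text{meta-}d'}{d'}, \]

where $d'$ is the type 1 sensitivity estimated from the perceptual discrimination responses and meta-$d'$ is the type 1 sensitivity an ideal observer would need in order to produce the observed confidence ratings. A ratio near 1 means confidence distinguishes correct from incorrect responses as well as the perceptual evidence allows; ratios below 1 indicate metacognitive inefficiency. Likewise, an integrated representation of auditory and visual signals can be sketched with standard reliability-weighted cue combination (one common formalization, not necessarily the authors' exact model): for independent Gaussian estimates $s_A$ and $s_V$ with variances $\sigma_A^2$ and $\sigma_V^2$,

\[ \hat{s}_{AV} = \frac{\sigma_V^2\, s_A + \sigma_A^2\, s_V}{\sigma_A^2 + \sigma_V^2}, \qquad \sigma_{AV}^2 = \frac{\sigma_A^2\, \sigma_V^2}{\sigma_A^2 + \sigma_V^2}, \]

so a supramodal confidence estimate reads out a single combined reliability rather than the reliability of either modality alone.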