Researchers in the United States and France have published a report saying that the human visual system can use surrounding sounds to see more clearly, and that the auditory system can process external light to hear more distinctly.
The researchers experimented on monkeys, training them to locate flashes of light on a screen. When the light was bright, the monkeys could readily pinpoint the flashing area of the screen. When the light was dim, they needed considerably more time to complete the task. However, when the dim light was accompanied by a brief sound, the monkeys were able to quickly identify the flashing area.
The researchers studied 49 neurons responsible for early visual processing and found that these cells responded faster when they "heard" a sound, similar to their response when they "saw" bright light. According to the researchers, this indicates a direct connection between the brain's visual and auditory systems.
French researcher Pascal Barone said that the sensory cells of humans and some animals can selectively process different modalities. Blind people cannot use their visual system to see, but they can use it to "hear". This is why blind people often have better hearing, and deaf people often have better eyesight.
This study overturns conventional theory. Researchers previously believed that the visual system processes images and the auditory system processes sounds, with the two systems operating entirely independently of each other, and that the brain then combines images and sounds so that we experience the world in both sight and sound. (生物谷 Bioon.com)
Original source recommended by 生物谷 (Bioon.com):
BMC Neuroscience 2008, 9:79. doi:10.1186/1471-2202-9-79
Visuo-auditory interactions in the primary visual cortex of the behaving monkey. Electrophysiological evidence.
Ye Wang, Simona Celebrini, Yves Trotter and Pascal Barone
Background
Visual, tactile and auditory information is processed from the periphery to the cortical level through separate channels that target primary sensory cortices, from which it is further distributed to functionally specialized areas. Multisensory integration is classically assigned to higher hierarchical cortical areas, but there is growing electrophysiological evidence in man and monkey of multimodal interactions in areas thought to be unimodal, interactions that can occur at very short latencies. Such fast timing of multisensory interactions rules out the possibility of an origin in the polymodal areas mediated through back projections, but is rather in favor of heteromodal connections such as the direct projections observed in the monkey, from auditory areas directly to the primary visual cortex V1. Based on the existence of such AI to V1 projections, we looked for modulation of neuronal visual responses in V1 by an auditory stimulus in the awake behaving monkey.
Results
Behavioral or electrophysiological data were obtained from two behaving monkeys. One monkey was trained to maintain a passive central fixation while a peripheral visual (V) or visuo-auditory (AV) stimulus was presented. From a population of 45 V1 neurons, there was no difference in the mean latencies or strength of visual responses when comparing V and AV conditions. In a second active task, the monkey was required to orient his gaze toward the visual or visuo-auditory stimulus. From a population of 49 cells recorded during this saccadic task, we observed a significant reduction in response latencies in the visuo-auditory condition compared to the visual condition (mean 61.0 vs. 64.5 ms) only when the visual stimulus was at midlevel contrast. No effect was observed at high contrast.
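For readers who want to run an analogous latency comparison on their own recordings, the sketch below shows one way to test whether per-neuron response latencies differ between the visual-only and visuo-auditory conditions. The latency values are synthetic placeholders, not data from the study, and the choice of a paired Wilcoxon signed-rank test is only an assumption for illustration; the paper should be consulted for the statistics the authors actually used.

```python
# Minimal sketch: paired comparison of per-neuron response latencies
# between visual-only (V) and visuo-auditory (AV) conditions.
# NOTE: the latencies below are synthetic placeholders, NOT data from the
# paper, and the Wilcoxon signed-rank test is an assumed choice of statistic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_cells = 49                                         # population size reported in the abstract
v_latency = rng.normal(64.5, 5.0, n_cells)           # visual-only condition (ms)
av_latency = v_latency - rng.normal(3.5, 2.0, n_cells)  # visuo-auditory condition (ms)

# Paired, non-parametric comparison of latencies, cell by cell.
stat, p_value = stats.wilcoxon(v_latency, av_latency)

print(f"mean V latency : {v_latency.mean():.1f} ms")
print(f"mean AV latency: {av_latency.mean():.1f} ms")
print(f"Wilcoxon signed-rank: W={stat:.1f}, p={p_value:.3g}")
```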
Conclusion
Our data show that single neurons from a primary sensory cortex such as V1 can integrate sensory information of a different modality, a result that argues against a strict hierarchical model of multisensory integration. Multisensory integration in V1 is, in our experiment, expressed by a significant reduction in visual response latencies specifically in suboptimal conditions and depending on the task demand. This suggests that neuronal mechanisms of multisensory integration are specific and adapted to the perceptual features of behavior.