Visual and auditory motion information can be used together to provide complementary information about the movement of objects. To investigate the neural substrates of such cross-modal integration, functional magnetic resonance imaging was used to assess brain activation while subjects performed separate visual and auditory motion discrimination tasks. Areas of unimodal activation included the primary and/or early sensory cortex for each modality, plus additional sites extending toward parietal cortex. Areas conjointly activated by both tasks included lateral parietal cortex, lateral frontal cortex, anterior midline, and anterior insular cortex. The parietal site encompassed distinct, but partially overlapping, zones of activation in or near the intraparietal sulcus (IPS). A subsequent task requiring an explicit cross-modal speed comparison revealed several foci of enhanced activity relative to the unimodal tasks. These included the IPS, anterior midline, and anterior insula, but not frontal cortex. During the unimodal auditory motion task, portions of the dorsal visual motion system showed signals depressed below resting baseline. Thus, interactions between the two systems involved either enhancement or suppression, depending on the stimuli present and the nature of the perceptual task. Together, these results identify human cortical regions involved in polysensory integration and the attentional selection of cross-modal motion information.