Crossmodal compensation in deafness: from neuronal bases in monkeys to functional brain imaging in humans
Each sensory modality (vision, hearing, touch) is integrated at the cortical level in separate areas, each specialized in specific aspects of information processing (motion, spatial location…). However, in primates several regions of the frontal, parietal and temporal lobes are defined as polysensory because they can simultaneously integrate information from different modalities. Multisensory integration is expressed at the behavioral level by perceptual improvements, and at the neuronal level by a potentiation of neuronal activity when stimuli of different modalities are combined. While multisensory integration is classically attributed to higher hierarchical cortical areas, recent human imaging studies have surprisingly revealed that multisensory interactions can also occur in areas hitherto considered unimodal. First, to determine the cortical network involved in multisensory interactions, we performed multiple injections of different retrograde tracers into unimodal cortical areas of the monkey and observed several heteromodal connections directly linking unimodal sensory areas, such as a direct pathway from the primary auditory cortex (A1) to the primary visual cortex (V1). To investigate the functional role of this A1-to-V1 heteromodal connection, we searched for an influence of an auditory stimulus on the visual responses of V1 neurons in a behaving monkey. We show that during a guided saccade directed toward a visual or visuo-auditory cue, V1 neurons present significantly shorter response latencies in the bimodal than in the visual condition. Taken together, our results provide evidence for multisensory integration at low levels of information processing and argue against a strictly hierarchical model. Furthermore, such results have important consequences for understanding the crossmodal compensation mechanisms that occur following sensory deprivation. Finally, crossmodal plasticity was analyzed in cochlear-implanted deaf subjects.
Cochlear implants (CI) are neuroprostheses designed to restore speech perception in cases of profound hearing loss. We first confirm that profound hearing loss induces the acquisition of strong speech-reading abilities, but we present evidence that this skill remains unaffected by the recovery of auditory function provided by the neuroprosthesis. Consequently, we show that cochlear-implant users achieve higher visuo-auditory performance than normally hearing subjects presented with similar auditory stimuli. Our data show that cochlear-implanted deaf subjects develop specific speech-comprehension strategies, accompanied by a progressive reorganization of the cortical networks involved in audiovisual speech integration within a short period of a few months after implantation.