Putting together the puzzle of multisensory perception
We perceive the world around us via multiple sources of sensory information derived from several modalities, including vision, touch, and audition. A desirable goal for the perceptual system is to maximize the reliability of its perceptual estimates. From a statistical viewpoint, the optimal strategy for achieving this goal is to integrate all available sensory information and to combine it with prior knowledge about statistical regularities in the world. Hence, human perception can be modeled as an ideal observer using the Bayesian approach. In my talk, I will point out that such an optimal mechanism has to take into account not only the reliability of the sensory signals but also the certainty of the mapping between them. Thus, starting from maximum-likelihood estimation [1, 2, 3], we have now developed a Bayesian model of multisensory integration using a "Coupling Prior" that represents the mapping uncertainty between the sensory signals [4, 5]. It seems reasonable to assume that this mapping uncertainty is a function of the spatial and temporal coherence between the different sensory signals. In other words, integrating sensory signals is only reasonable when the sensory maps are in registration. When sensory maps are in conflict, however, adaptation occurs, bringing the maps back into registration. We recently showed that this adaptation process, too, can be described with a derivative of a Bayesian model, the Kalman filter. To this end, we investigated how different noise sources affect the rate of adaptation and found that human visuomotor adaptation performance was in good agreement with the Kalman filter's predictions. In conclusion, human perception seems to be largely optimized for a given task, given the statistical regularities encountered when interacting with the environment.
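Under the usual Gaussian assumptions, the maximum-likelihood integration scheme referenced above ([1, 2, 3]) reduces to reliability weighting: each unimodal estimate contributes in proportion to its inverse variance, and the fused estimate is more reliable than either cue alone. A minimal sketch with hypothetical numbers (a visual and a haptic size estimate):

```python
def mle_integrate(estimates, variances):
    """Reliability-weighted (inverse-variance) fusion of unimodal estimates."""
    weights = [1.0 / v for v in variances]   # reliability = 1 / variance
    total = sum(weights)
    fused = sum(w * x for w, x in zip(weights, estimates)) / total
    fused_var = 1.0 / total                  # reliabilities add under fusion
    return fused, fused_var

# Hypothetical example: precise visual cue (var 1.0), noisier haptic cue (var 4.0).
fused, fused_var = mle_integrate([10.0, 12.0], [1.0, 4.0])
# fused = (10/1 + 12/4) / (1 + 0.25) = 10.4; fused_var = 0.8,
# lower than either unimodal variance.
```

Note that the fused variance is always below the smallest unimodal variance, which is the statistical sense in which this integration is optimal.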
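The role of noise in the Kalman-filter account of adaptation can be illustrated with a one-dimensional filter tracking a constant perturbation (e.g. a visuomotor rotation). This is a generic textbook Kalman filter, not the specific model from the talk, and all parameter values are hypothetical; the point it shows is that the Kalman gain, set by the ratio of process noise to observation noise, directly determines the trial-by-trial adaptation rate:

```python
def kalman_adapt(observations, q, r, x0=0.0, p0=1.0):
    """1-D Kalman filter tracking a perturbation from noisy per-trial feedback.
    q: process-noise variance (how fast the world is assumed to drift),
    r: observation-noise variance (feedback uncertainty),
    x0, p0: initial state estimate and its variance.
    Larger r lowers the Kalman gain and thus slows adaptation."""
    x, p = x0, p0
    trace = []
    for z in observations:
        p = p + q               # predict: uncertainty grows by process noise
        k = p / (p + r)         # Kalman gain
        x = x + k * (z - x)     # update: move toward the observed perturbation
        p = (1.0 - k) * p       # posterior uncertainty shrinks
        trace.append(x)
    return trace

# Hypothetical example: a constant 30-unit perturbation, observed on each trial.
trace = kalman_adapt([30.0] * 50, q=0.01, r=1.0)
# The estimate converges toward 30; with larger r, convergence is slower.
```

Comparing runs with small versus large `r` reproduces the qualitative prediction mentioned in the abstract: noisier feedback yields a lower gain and a slower rate of adaptation.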