Two challenges for visual perception under uncertainty
Visual perception is often seen as an inference problem in which uncertainty comes from ambiguities in the world (e.g. the two 3D interpretations of the Necker cube), noise in the world (e.g. identifying a scene behind falling snowflakes), or noise in the visual system (e.g. synaptic noise). To deal with ambiguities, the visual system relies on prior knowledge, such as the assumption that light comes from above our heads. We have measured the characteristics of this assumption and found a systematic leftward bias in the preferred illumination direction. In addition, using fMRI combined with a model of behavioural performance, we found evidence that the above-left assumption about the light source position is encoded early in the visual system and processed in a bottom-up way. To deal with noise, the visual system relies on the integration of information over space and time. Integration over a well-chosen region usually increases the signal-to-noise ratio. However, there are situations where integration can be highly detrimental. We report conditions under which stereo-acuity drops by two orders of magnitude when the elements between which depth is compared belong to a single object. These results are well accounted for by a model in which the uncertainty about an object propagates to all of its parts. In summary, uncertainty is an unavoidable aspect of visual perception, and characterizing the solutions the visual system has found to cope with it is a fundamental part of the study of visual perception.
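The benefit of integration mentioned above can be illustrated with a minimal numerical sketch: averaging N independent noisy measurements of a constant signal shrinks the noise standard deviation by roughly a factor of sqrt(N). This is not the authors' model, only a generic demonstration of the signal-to-noise argument; all names and parameter values below are illustrative assumptions.

```python
# Sketch (illustrative, not the authors' model): averaging N independent
# noisy samples of a constant signal reduces the noise standard deviation
# by about sqrt(N), which is why integration over a well-chosen region
# raises the signal-to-noise ratio.
import random
import statistics

random.seed(0)  # for reproducibility of the demo

def noisy_sample(signal=1.0, noise_sd=1.0):
    """One noisy measurement of a constant signal (Gaussian noise)."""
    return signal + random.gauss(0.0, noise_sd)

def integrated_estimate(n, signal=1.0, noise_sd=1.0):
    """Average n independent samples: integration over a region of size n."""
    return statistics.fmean(noisy_sample(signal, noise_sd) for _ in range(n))

# Compare the spread of single samples vs. 100-sample averages.
single = [noisy_sample() for _ in range(2000)]
averaged = [integrated_estimate(100) for _ in range(2000)]
sd_single = statistics.stdev(single)
sd_avg = statistics.stdev(averaged)
print(f"single-sample sd  ~= {sd_single:.2f}")  # close to 1.0
print(f"100-sample avg sd ~= {sd_avg:.2f}")     # close to 0.1, i.e. ~sqrt(100) smaller
```

Note the caveat the abstract raises: this sqrt(N) gain presupposes that the integration region is well chosen; pooling over the wrong elements (as in the single-object stereo-acuity conditions) can instead degrade performance.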