Decoding the ‘listening brain’: Brain reading of auditory percepts in humans with functional neuroimaging
In recent years, non-invasive functional neuroimaging studies have accumulated relevant information on the representation of sounds in the human brain. These studies suggest a certain degree of functional specialization in the auditory cortical fields for elementary (e.g. frequency, amplitude, modulation) or higher-order (e.g. categories) sound attributes. However, only limited information is available on how our brain uses these attributes to form ‘perceptual’ representations of the incoming acoustic input, beyond its contingent sensory implementation. In this talk, I will present a series of fMRI studies investigating how our brain builds a perceptual description of the auditory world. The first study distinguishes ‘perceptual’ responses in early auditory areas by exploiting a well-known psychoacoustic phenomenon, the auditory continuity illusion, in which there is a discrepancy between the physical (sensory) input and the subject’s percept. The other studies use methods of statistical pattern recognition (‘brain reading’) to explore the role of the auditory areas in building ‘invariant’ representations of sounds from various categories, including speech and voice.
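To give a rough sense of what ‘brain reading’ involves here, the sketch below is a minimal, purely illustrative multivoxel decoding example: it classifies simulated fMRI voxel patterns evoked by two hypothetical sound categories with a simple nearest-centroid rule. All data and parameters (voxel counts, noise levels, the split-half scheme) are invented for illustration and are not taken from the studies described in the talk, which use their own pattern-recognition methods.

```python
import numpy as np

# Hypothetical illustration of multivoxel pattern analysis ("brain reading"):
# classify simulated voxel activation patterns evoked by two sound
# categories (e.g. speech vs. non-speech) using a nearest-centroid decoder.
# All numbers are synthetic assumptions, not data from the actual studies.

rng = np.random.default_rng(0)
n_voxels, n_trials = 50, 40          # voxels per pattern, trials per category

# Assume each category evokes a distinct mean activation pattern,
# observed through trial-to-trial noise.
pattern_a = rng.normal(0.0, 1.0, n_voxels)
pattern_b = rng.normal(0.0, 1.0, n_voxels)
trials_a = pattern_a + rng.normal(0.0, 1.5, (n_trials, n_voxels))
trials_b = pattern_b + rng.normal(0.0, 1.5, (n_trials, n_voxels))

X = np.vstack([trials_a, trials_b])
y = np.array([0] * n_trials + [1] * n_trials)

# Split-half cross-validation: fit centroids on even trials, test on odd.
train, test = slice(0, None, 2), slice(1, None, 2)
centroids = np.stack([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])

# Predict, for each held-out pattern, the category with the nearest centroid.
dists = np.linalg.norm(X[test][:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[test]).mean()
print(f"decoding accuracy: {accuracy:.2f}")
```

Above-chance accuracy on held-out trials is taken as evidence that the voxel patterns carry information about the stimulus category; invariance can then be probed by training on one stimulus set and testing on another.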