Face recognition: from vision to social neuroscience
Face perception involves highly developed visual skills that allow us to extract a range of information about other people, including age, gender, expression, attractiveness, and identity. Accordingly, data from neuropsychology and neuroimaging in humans, as well as research in non-human primates, have demonstrated that a widely distributed brain network is recruited during face processing. Some parts of this network also overlap with systems engaged by affective and social signals conveyed by non-facial stimuli. However, much remains unresolved concerning the exact role of the different brain areas that respond to faces and emotional expressions. This presentation will review recent work from our group and others investigating (by means of fMRI, DTI, and EEG) the function and structural interconnection of the visual and limbic areas involved in processing faces, facial expressions, and other facial features. It will also illustrate new approaches based on multivoxel pattern analysis of brain activations, which allow us to decode distinct information contents from cortical areas activated in fMRI. The latter approach suggests that some areas in the temporal and frontal lobes, usually thought to mediate facial expression processing, may play a more general role in the supramodal representation of expressed emotions and mental states.