How do monkeys and humans see real-world objects? Exploring representational similarity across species
Single-cell recordings have shown that primate inferior temporal (IT) neurons respond selectively to visual features occurring in natural images as parts of objects. Neuroimaging research has demonstrated that information about objects' membership in conventional human categories is present in focal activations as well as in widely distributed response patterns. Neuroimaging, however, has focused on analyzing category-average responses, raising the question of whether an inherent categorical structure emerges from the representations themselves and, if so, what the inherent categories are. A recent single-cell recording study demonstrated that monkey-IT response patterns do in fact cluster according to natural categories (Kiani et al., 2007, J Neurophysiol). Here we study response patterns elicited by the same 92 object images in human and monkey IT (measured with fMRI and single-cell recording, respectively). We analyze human and monkey representations by means of “representational similarity analysis”, which allows us (1) to combine evidence across brain space and experimental conditions to sensitively detect neuronal pattern information and (2) to relate results (a) between different modalities of brain-activity measurement (fMRI, cell recording), (b) between different species, and (c) between brain-activity data and computational models of brain information processing. We find a close match between the human and monkey IT response-pattern dissimilarity matrices. Primate IT response patterns appear to cluster into natural categories, with the animate–inanimate distinction explaining most of the variance and faces forming a subcluster within the animates. Early visual responses show no such categorical clustering, nor do the internal representations of a range of computational models when exposed to our stimuli. Our results suggest that determining membership in certain behaviorally crucial categories constitutes a fundamental function of primate IT across species.
The close match also provides hope that data from single-cell recording and fMRI, for all their differences, may similarly reveal neuronal representations when subjected to massively multivariate analyses of response-pattern information.
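The comparison described above rests on representational dissimilarity matrices (RDMs): for each pair of the 92 stimuli, the dissimilarity (1 minus the Pearson correlation) between the two response patterns is computed, and the resulting matrices from different measurement modalities are then correlated with each other. The sketch below illustrates this logic on simulated data; the data, dimensionalities, and the use of Spearman correlation for RDM comparison are illustrative assumptions, not a description of the study's exact analysis pipeline.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical data: 92 stimuli x measurement channels. 'monkey' stands in
# for single-cell firing rates and 'human' for fMRI voxel responses; both
# are simulated here as a shared signal plus independent measurement noise.
n_stimuli = 92
signal = rng.standard_normal((n_stimuli, 50))
monkey = np.hstack([signal, rng.standard_normal((n_stimuli, 50))])
human = np.hstack([signal, rng.standard_normal((n_stimuli, 200))])

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson r between the
    response patterns of every pair of stimuli (rows of `patterns`)."""
    return squareform(pdist(patterns, metric="correlation"))

rdm_monkey = rdm(monkey)
rdm_human = rdm(human)

# Compare the two RDMs on their upper triangles (each stimulus pair
# counted once, diagonal excluded), using rank correlation.
iu = np.triu_indices(n_stimuli, k=1)
rho, p = spearmanr(rdm_monkey[iu], rdm_human[iu])
print(f"RDM shape: {rdm_monkey.shape}, Spearman rho = {rho:.2f}")
```

Because an RDM abstracts away from the particular measurement channels (neurons vs. voxels), this second-order comparison is what makes it possible to relate single-cell and fMRI data, and monkey and human brains, within one framework.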