Visual coding of dynamic facial expressions
Nicholas Furl is a Senior Investigator Scientist at the MRC Cognition and Brain Sciences Unit in Cambridge, where he leads an ESRC-funded project studying the perception of dynamic faces using brain imaging, in collaboration with Andy Calder, Rik Henson and Karl Friston.

Studies of high-level vision typically rely on static images. Dynamic stimuli, however, are not only more ecologically valid but also better address the computational challenges a visual system must face when producing invariant perception across complex stimulus changes. Moreover, social conspecific actions such as facial expressions are defined by movements, and yet the mechanisms of motion representation involved in action perception are poorly understood. There may be important, but as yet unknown, differences between the visual codes for dynamic and static faces in the brain. Motion-sensitive areas, such as the hMT+/V5 complex and the posterior superior temporal sulcus in humans, may also play more prominent roles in face perception than previously believed. Nicholas will describe fMRI data from the monkey showing a role for motion-sensitive areas in the representation of both dynamic and static facial expressions. He will also present results from connectivity modelling of human fMRI data showing that amygdala feedback can alter how fearful expressions are visually coded, depending on whether the faces are dynamic or static. These experiments demonstrate that motion representations are important for face perception and suggest systematic differences in how dynamic and static facial expressions are visually coded.