Prediction in Multisensory Emotional Speech
Social interactions rely on verbal and non-verbal information sources and their interplay. Crucially, such communicative interactions allow us to obtain information about the current emotional state of others. However, emotion expressions are not always clear-cut and may be shaped by the specific situational context or by learned knowledge. In our work on the temporal and neural correlates of multimodal emotion expressions, we address a number of questions by means of ERPs, fMRI, and lesion studies. Within a prediction framework, I will focus on the following aspects in my talk: (1) How do we integrate different verbal and non-verbal emotion expressions? (2) How do cognitive demands impact the processing of multimodal emotion expressions? (3) How do we resolve interference between verbal and non-verbal emotion expressions?