The reconstruction of facial expressions and body postures in embodied systems: new approaches to an old problem
In human-machine interaction, one aim is to integrate a computational model of emotional processes into the architecture of embodied systems (interactive virtual characters). Although the implementation of emotion processes in such systems is highly sophisticated, none of these projects offers a convincing solution to the action expression problem (i.e. mapping behavioural output onto emotions). One can assume that facial expressions and body postures are the core emotional information a user receives from such a system. However, implementing such a system is far from trivial, because it requires in-depth theoretical and methodological considerations. Here I present a new approach to constructing control architectures for facial expression and body posture simulation, based on the assumption that emotions are discrete, and compare it with a simple circumplex model of emotion.
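The contrast between the two emotion representations mentioned above can be illustrated with a minimal sketch. All names, prototype coordinates, and expression labels below are illustrative assumptions, not the architecture described in this work: a discrete model maps emotion categories directly to expressions, while a circumplex model places an emotional state as a point in a two-dimensional valence-arousal space and selects the nearest expression prototype.

```python
import math

# Discrete model (assumed example): each basic emotion category
# maps directly to one expression label.
DISCRETE_EXPRESSIONS = {
    "joy": "smile",
    "anger": "frown",
    "fear": "widened_eyes",
    "sadness": "lowered_gaze",
}

def discrete_to_expression(emotion: str) -> str:
    """Look up the expression for a discrete emotion category."""
    return DISCRETE_EXPRESSIONS[emotion]

# Circumplex model (assumed example): an emotional state is a point
# (valence, arousal) in [-1, 1]^2; each expression has a prototype
# location in that space.
CIRCUMPLEX_PROTOTYPES = {
    "smile": (0.8, 0.5),          # positive valence, moderate arousal
    "frown": (-0.7, 0.6),         # negative valence, high arousal
    "widened_eyes": (-0.5, 0.9),  # negative valence, very high arousal
    "lowered_gaze": (-0.6, -0.5), # negative valence, low arousal
}

def circumplex_to_expression(valence: float, arousal: float) -> str:
    """Pick the expression whose prototype is nearest in valence-arousal space."""
    return min(
        CIRCUMPLEX_PROTOTYPES,
        key=lambda e: math.dist((valence, arousal), CIRCUMPLEX_PROTOTYPES[e]),
    )
```

The discrete lookup is trivially interpretable but cannot express blends or intensity, whereas the circumplex version degrades gracefully for intermediate states at the cost of choosing prototype coordinates, which is one way to frame the comparison the abstract announces.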