Observers predict actions from facial emotional expressions during real-time social interactions


In face-to-face social interactions, emotional expressions provide insights into the mental state of an interaction partner. This information can be crucial for inferring action intentions and reacting adaptively to another person’s actions. Here we investigate how facial emotional expressions affect subjective experience as well as physiological and behavioral responses to social actions during real-time interactions. Thirty-two participants interacted with virtual agents while fully immersed in Virtual Reality. Agents displayed an angry or happy facial expression before directing an appetitive (fist bump) or aversive (punch) social action towards the participant. Participants responded to these actions either by reciprocating the fist bump or by defending against the punch. For all interactions, subjective experience was measured using ratings. In addition, physiological responses (electrodermal activity, electrocardiogram) and participants’ response times were recorded. Aversive actions were judged to be more arousing and less pleasant than appetitive actions. In addition, angry expressions increased heart rate relative to happy expressions. Crucially, interaction effects between facial emotion and action were observed. Angry expressions reduced pleasantness more strongly for appetitive than for aversive actions. Furthermore, skin conductance responses to aversive actions were larger for happy than for angry expressions, and reaction times to aversive actions were faster than to appetitive actions when agents showed an angry expression. These data indicate that observers used facial emotional expressions to generate expectations about particular actions. Consequently, the present study demonstrates that observers integrate information from facial emotional expressions with actions during social interactions.