Interpretation of Emotional Body Language Displayed by a Humanoid Robot: A Case Study with Children

Ariel Beck, Lola Cañamero, Antoine Hiolle, Luisa Damiano, Piero Cosi, Fabio Tesser, Giacomo Sommavilla

Research output: Contribution to journal › Article › peer-review

44 Citations (Scopus)


The work reported in this paper focuses on giving humanoid robots the capacity to express emotions with their body. Previous results show that adults are able to interpret different key poses displayed by a humanoid robot, and also that changing the head position affects the expressiveness of the key poses in a consistent way: moving the head down decreases both arousal (the level of energy) and valence (how positive or negative the emotion is), whereas moving the head up increases them. Hence, changing the head position during an interaction should send intuitive signals. The study reported in this paper tested children's ability to recognize the emotional body language displayed by a humanoid robot. The results suggest that body postures and head position can be used to convey emotions during child-robot interaction.
Original language: English
Pages (from-to): 325-334
Number of pages: 10
Journal: International Journal of Social Robotics
Issue number: 3
Publication status: Published - Aug 2013


Keywords: emotion, social robotics, emotional body language, perception


