University of Hertfordshire

Authors

  • Ariel Beck
  • Lola Cañamero
  • Antoine Hiolle
  • Luisa Damiano
  • Piero Cosi
  • Fabio Tesser
  • Giacomo Sommavilla
Original language: English
Number of pages: 10
Pages (from-to): 325-334
Journal: International Journal of Social Robotics
Volume: 5
Issue: 3
Publication status: Published - Aug 2013

Abstract

The work reported in this paper focuses on giving humanoid robots the capacity to express emotions with their body. Previous results show that adults are able to interpret different key poses displayed by a humanoid robot, and also that changing the head position affects the expressiveness of the key poses in a consistent way: moving the head down decreases both arousal (the level of energy) and valence (how positive or negative the emotion is), whereas moving the head up increases them. Hence, changing the head position during an interaction should send intuitive signals. The study reported in this paper tested children's ability to recognize the emotional body language displayed by a humanoid robot. The results suggest that body postures and head position can be used to convey emotions during child-robot interaction.

Notes

This work is funded by the EU FP7 ALIZ-E project (grant number 248116).
