University of Hertfordshire

Towards an Affect Space for robots to display emotional body language

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Original language: English
Title of host publication: Procs of the 19th IEEE Int Symposium on Robot and Human Interactive Communication, RO-MAN
Publisher: IEEE
Pages: 464-469
ISBN (Print): 978-1-4244-7990-0
Publication status: Published - 2010
Event: 19th IEEE Int Symposium on Robot and Human Interactive Communication - Viareggio, Italy
Duration: 12 Sep 2010 – 15 Sep 2010

Conference

Conference: 19th IEEE Int Symposium on Robot and Human Interactive Communication
Country: Italy
City: Viareggio
Period: 12/09/10 – 15/09/10

Abstract

In order for robots to be socially accepted and to generate empathy, it is necessary that they display rich emotions. For robots such as Nao, body language is the best medium available, given their inability to convey facial expressions. Displaying emotional body language that can be interpreted whilst interacting with the robot should significantly improve its sociability. This research investigates the creation of an Affect Space for the generation of emotional body language to be displayed by robots. To create an Affect Space for body language, one has to establish the contribution of the different positions of the joints to the emotional expression. The experiment reported in this paper investigated the effect of varying a robot's head position on the interpretation, Valence, Arousal and Stance of emotional key poses. It was found that participants were better than chance level at interpreting the key poses. This finding confirms that body language is an appropriate medium for robots to express emotions. Moreover, the results of this study support the conclusion that Head Position is an important body posture variable. Head Position up increased correct identification for some emotion displays (pride, happiness, and excitement), whereas Head Position down increased correct identification for other displays (anger, sadness). Fear, however, was identified well regardless of Head Position. Head up was always evaluated as more highly Aroused than Head straight or down. Evaluations of Valence (degree of negativity to positivity) and Stance (degree to which the robot was aversive to approaching), however, depended on both Head Position and the emotion displayed. The effects of varying this single body posture variable were complex.
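The idea of an Affect Space for body language can be illustrated with a minimal sketch: blending between emotional key poses along an affect dimension. This is not the authors' implementation; the joint name (`HeadPitch`), the pose values, and the single-dimension parameterisation are hypothetical examples, loosely inspired by the paper's finding that a raised head reads as more highly Aroused.

```python
def blend_poses(pose_low: dict, pose_high: dict, arousal: float) -> dict:
    """Linearly interpolate each joint angle between two key poses.

    arousal = -1.0 returns pose_low (e.g. head down),
    arousal = +1.0 returns pose_high (e.g. head up).
    """
    t = (arousal + 1.0) / 2.0  # map [-1, 1] -> [0, 1]
    return {joint: (1 - t) * pose_low[joint] + t * pose_high[joint]
            for joint in pose_low}

# Hypothetical key poses: head pitch in radians (positive = down here).
sad_pose = {"HeadPitch": 0.5}       # head down, low Arousal
excited_pose = {"HeadPitch": -0.5}  # head up, high Arousal

neutral = blend_poses(sad_pose, excited_pose, 0.0)
print(neutral["HeadPitch"])  # 0.0, head straight
```

A full Affect Space would extend this to several dimensions (Valence, Arousal, Stance) and to all posture joints, with each joint's contribution to the expression established experimentally, as the paper sets out to do for Head Position.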

Notes

Original article can be found at: http://ieeexplore.ieee.org. “This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.” “Copyright IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.”

ID: 429718