Interpretation of emotional body language displayed by robots

A. Beck, A. Hiolle, A. Mazel, Lola Cañamero

    Research output: Conference contribution (Chapter in Book/Report/Conference proceeding)

    40 Citations (Scopus)

    Abstract

    In order for robots to be socially accepted and to generate empathy, they must display emotions. For robots such as Nao, body language is the best medium available, as they do not have the ability to display facial expressions. Displaying emotional body language that can be interpreted whilst interacting with the robot should greatly improve its acceptance. This research investigates the creation of an "Affect Space" [1] for the generation of emotional body language that could be displayed by robots. An Affect Space is generated by "blending" (i.e. interpolating between) different emotional expressions to create new ones. An Affect Space for body language based on the Circumplex Model of emotions [2] has been created. The experiment reported in this paper investigated the perception of specific key poses from the Affect Space. The results suggest that this Affect Space for body expressions can be used to improve the expressiveness of humanoid robots. In addition, early results of a pilot study are described. It revealed that context helps human subjects improve their recognition rate during a human-robot imitation game, and that this recognition in turn leads to better outcomes of the interactions.
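    The abstract describes generating new expressions by "blending" (interpolating between) emotional key poses, but gives no implementation details. The following is a minimal, hypothetical sketch of that idea, assuming key poses are represented as dictionaries of joint angles; the joint names and angle values are illustrative and not taken from the paper.

    ```python
    # Hypothetical sketch of blending two emotional key poses by linear
    # interpolation, in the spirit of the Affect Space described in the
    # abstract. Joint names and angles below are illustrative only.

    def blend_poses(pose_a, pose_b, t):
        """Interpolate between two key poses (dicts of joint -> angle, radians).

        t = 0.0 returns pose_a, t = 1.0 returns pose_b; intermediate values
        of t yield blended expressions between the two.
        """
        assert pose_a.keys() == pose_b.keys(), "poses must share the same joints"
        return {joint: (1 - t) * pose_a[joint] + t * pose_b[joint]
                for joint in pose_a}

    # Illustrative key poses for two joints (values are made up).
    sad = {"shoulder_pitch": 1.2, "head_pitch": 0.5}
    happy = {"shoulder_pitch": -0.8, "head_pitch": -0.3}

    # A pose halfway between the two expressions.
    midway = blend_poses(sad, happy, 0.5)
    ```

    In the paper's framing, each key pose would sit at a point in the Circumplex Model's valence-arousal space, and blending weights would follow from the target point's position relative to the key poses.
    
    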
    Original language: English
    Title of host publication: Procs of the 3rd ACM Workshop on Affective Interaction in Natural Environments, Co-located with ACM Multimedia 2010
    Publisher: Institute of Electrical and Electronics Engineers (IEEE)
    Pages: 37-42
    DOIs
    Publication status: Published - 2010

    Keywords

    • Robotics
    • Artificial intelligence
    • Computing methodologies

