University of Hertfordshire

Original language: English
Article number: 6602436
Pages (from-to): 90-97
Number of pages: 8
Journal: IEEE Symposium on Artificial Life (ALIFE)
Volume: 2013-January
Issue: January
DOIs
Publication status: Published - 1 Jan 2013
Event: 4th IEEE International Symposium on Artificial Life, IEEE ALIFE 2013 - Singapore, Singapore
Duration: 16 Apr 2013 - 19 Apr 2013

Abstract

This paper presents a study of the readability of dog-inspired visual communication signals in a human-robot interaction scenario. The study was motivated by specially trained hearing dogs, which assist their deaf owners by using visual communication signals to lead them to a sound source. In our human-robot interaction scenario, a robot took the place of a hearing dog and led participants to two different sound sources. The robot was preprogrammed with dog-inspired behaviors and controlled by a wizard, who directly applied the dog behavioral strategy on the robot during each trial. By using dog-inspired visual communication signals as a means of communication, the robot was able to lead participants to the sound sources (the microwave and the front door). Findings indicate that untrained participants could correctly interpret the robot's intentions, and that head movements and gaze direction were important for communicating the robot's intention through visual communication signals.

ID: 18675035