Abstract
The role of gaze in interaction has been an area of increasing interest in the field of human-robot interaction. Mutual gaze, the pattern of behavior that arises when humans look directly at each other's faces, sends important social cues that communicate attention and personality traits and help regulate conversational turn-taking. In preparation for learning a computational model of mutual gaze that can serve as a controller for a robot, data from human-human pairs in a conversational task were collected using a gaze-tracking system and a face-tracking algorithm. The overall amount of mutual gaze observed between pairs agreed with predictions from the psychology literature; however, the duration of mutual gaze was shorter than predicted, and the amount of direct eye contact detected was, surprisingly, almost nonexistent. These results show the potential of this automated method to capture detailed information about human gaze behavior, and future applications for interaction-based robot language learning are discussed. Analyzing human-human mutual gaze with automated tracking allows past results that relied on hand-coding to be tested and extended, and can provide both a method of data collection and input for controlling interactive robots.
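The two quantities the abstract reports, the overall amount of mutual gaze and the duration of individual mutual-gaze episodes, can be computed from two per-frame boolean streams indicating whether each participant is fixating the partner's face. The sketch below is a hypothetical illustration of that computation (the function names, toy data, and frame rate are assumptions, not the authors' implementation):

```python
# Hypothetical sketch: estimating mutual gaze from two per-frame
# "looking at partner's face" streams, one per participant, as a
# gaze/face tracker might output. Not the paper's actual code.

def mutual_gaze_episodes(gaze_a, gaze_b):
    """Return (start_frame, length) for each run of frames in which
    both participants look at the other's face simultaneously."""
    episodes = []
    run = 0
    for i, (a, b) in enumerate(zip(gaze_a, gaze_b)):
        if a and b:
            run += 1            # mutual gaze continues this frame
        elif run:
            episodes.append((i - run, run))  # episode just ended
            run = 0
    if run:                      # episode running at end of recording
        episodes.append((len(gaze_a) - run, run))
    return episodes

def summarize(gaze_a, gaze_b, fps=30.0):
    """Proportion of frames in mutual gaze and mean episode duration (s)."""
    eps = mutual_gaze_episodes(gaze_a, gaze_b)
    total = sum(n for _, n in eps)
    return {
        "proportion": total / len(gaze_a),
        "mean_duration_s": (total / len(eps)) / fps if eps else 0.0,
    }

# Toy streams: True = participant fixating partner's face that frame.
a = [True, True, True, False, True, True, False, False]
b = [True, True, False, False, True, True, True, False]
print(summarize(a, b, fps=2.0))  # → {'proportion': 0.5, 'mean_duration_s': 1.0}
```

The same two summary statistics are what a Markov-style gaze controller would need as targets when reproducing human-like mutual gaze on a robot.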
Original language | English |
---|---|
Title of host publication | ACHI 2011 - 4th International Conference on Advances in Computer-Human Interactions |
Pages | 222-227 |
Number of pages | 6 |
Publication status | Published - 1 Dec 2011 |
Event | 4th International Conference on Advances in Computer-Human Interactions, ACHI 2011 - Gosier, Guadeloupe, France (23 Feb 2011 → 28 Feb 2011) |
Conference
Conference | 4th International Conference on Advances in Computer-Human Interactions, ACHI 2011 |
---|---|
Country/Territory | France |
City | Gosier, Guadeloupe |
Period | 23/02/11 → 28/02/11 |
Keywords
- Human-robot interaction
- Markov model
- Mutual gaze
- Psychology