TY - GEN
T1 - Active affective facial analysis for human-robot interaction
AU - Ge, Shuzhi Sam
AU - Samani, Hooman Aghaebrahimi
AU - Ong, Yin Hao Janus
AU - Hang, Chang Chieh
PY - 2008
Y1 - 2008
N2 - In this paper, we present an active vision system for human-robot interaction that includes robust face detection, tracking, recognition, and facial expression analysis. The system searches for human faces in view, zooms in on the face of interest based on the face recognition database, tracks it, and finally analyzes the emotion parameters of the face. After detection using Haar-cascade classifiers, the variable parameters of the camera are adapted to track the subject's face with the CamShift algorithm and to extract the facial features used for face recognition and facial expression analysis. An Embedded Hidden Markov Model is used for face recognition, and a nonlinear facial mass-spring model is employed to describe the tension of the facial muscles. The resulting motion signatures are then classified with Multi-Layer Perceptrons for facial expression analysis. This system can serve as a comprehensive and robust vision package for a robot interacting with human beings.
AB - In this paper, we present an active vision system for human-robot interaction that includes robust face detection, tracking, recognition, and facial expression analysis. The system searches for human faces in view, zooms in on the face of interest based on the face recognition database, tracks it, and finally analyzes the emotion parameters of the face. After detection using Haar-cascade classifiers, the variable parameters of the camera are adapted to track the subject's face with the CamShift algorithm and to extract the facial features used for face recognition and facial expression analysis. An Embedded Hidden Markov Model is used for face recognition, and a nonlinear facial mass-spring model is employed to describe the tension of the facial muscles. The resulting motion signatures are then classified with Multi-Layer Perceptrons for facial expression analysis. This system can serve as a comprehensive and robust vision package for a robot interacting with human beings.
KW - Emotion recognition
KW - Facial expression recognition
KW - Facial feature analysis
KW - Human-robot interaction
KW - Social robot
UR - http://www.scopus.com/inward/record.url?scp=52949138721&partnerID=8YFLogxK
U2 - 10.1109/ROMAN.2008.4600647
DO - 10.1109/ROMAN.2008.4600647
M3 - Conference contribution
AN - SCOPUS:52949138721
SN - 9781424422135
T3 - Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
SP - 83
EP - 88
BT - Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
T2 - 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
Y2 - 1 August 2008 through 3 August 2008
ER -