TY - GEN
T1 - A preliminary framework for a social robot “sixth sense”
AU - Cominelli, Lorenzo
AU - Mazzei, Daniele
AU - Carbonaro, Nicola
AU - Garofalo, Roberto
AU - Zaraki, Abolfazl
AU - Tognetti, Alessandro
AU - de Rossi, Danilo
N1 - Funding Information:
This work was partially funded by the European Commission under the 7th Framework Program projects EASEL, Expressive Agents for Symbiotic Education and Learning, under Grant 611971-FP7-ICT-2013-10. Special thanks to Daniela Gasperini for her fundamental contribution to the organization of the experiments.
Publisher Copyright:
© Springer International Publishing Switzerland 2016.
PY - 2016
Y1 - 2016
N2 - Building a social robot that is able to interact naturally with people is a challenging task that becomes even more ambitious if the robot’s interlocutors are children involved in crowded scenarios such as a classroom or a museum. In such scenarios, the main concern is enabling the robot to track the subjects’ social and affective states, modulating its behaviour on the basis of the engagement and the emotional state of its interlocutors. To reach this goal, the robot needs to gather not only visual and auditory data but also physiological signals, which are fundamental for understanding the interlocutors’ psycho-physiological state. To this end, several Human-Robot Interaction (HRI) frameworks have been proposed in recent years, although most of them are based on the use of wearable sensors. However, wearable equipment is not the best technology for acquisition in crowded multi-party environments for obvious reasons (e.g., all the subjects must be prepared before the experiment by wearing the acquisition devices). Furthermore, wearable sensors, even when designed to be minimally intrusive, add an extra factor to HRI scenarios, introducing a bias in the measurements due to psychological stress. To overcome these limitations, in this work we present an unobtrusive method to acquire both visual and physiological signals from multiple subjects involved in HRI. The system is able to integrate the acquired data and associate them with unique subject IDs. The implemented system was tested with the FACE humanoid in order to assess the technical features of the integrated devices and algorithms. Preliminary tests demonstrated that the developed system can be used to extend FACE’s perception capabilities, giving it a sort of “sixth sense” that will improve the robot’s empathic and behavioural capabilities.
AB - Building a social robot that is able to interact naturally with people is a challenging task that becomes even more ambitious if the robot’s interlocutors are children involved in crowded scenarios such as a classroom or a museum. In such scenarios, the main concern is enabling the robot to track the subjects’ social and affective states, modulating its behaviour on the basis of the engagement and the emotional state of its interlocutors. To reach this goal, the robot needs to gather not only visual and auditory data but also physiological signals, which are fundamental for understanding the interlocutors’ psycho-physiological state. To this end, several Human-Robot Interaction (HRI) frameworks have been proposed in recent years, although most of them are based on the use of wearable sensors. However, wearable equipment is not the best technology for acquisition in crowded multi-party environments for obvious reasons (e.g., all the subjects must be prepared before the experiment by wearing the acquisition devices). Furthermore, wearable sensors, even when designed to be minimally intrusive, add an extra factor to HRI scenarios, introducing a bias in the measurements due to psychological stress. To overcome these limitations, in this work we present an unobtrusive method to acquire both visual and physiological signals from multiple subjects involved in HRI. The system is able to integrate the acquired data and associate them with unique subject IDs. The implemented system was tested with the FACE humanoid in order to assess the technical features of the integrated devices and algorithms. Preliminary tests demonstrated that the developed system can be used to extend FACE’s perception capabilities, giving it a sort of “sixth sense” that will improve the robot’s empathic and behavioural capabilities.
KW - Affective computing
KW - Behaviour monitoring
KW - Human-Robot interaction
KW - Social robotics
KW - Synthetic tutor
UR - http://www.scopus.com/inward/record.url?scp=84978901196&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-42417-0_6
DO - 10.1007/978-3-319-42417-0_6
M3 - Conference contribution
AN - SCOPUS:84978901196
SN - 9783319424163
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 58
EP - 70
BT - Biomimetic and Biohybrid Systems - 5th International Conference, Living Machines 2016, Proceedings
A2 - Lepora, Nathan F.
A2 - Mura, Anna
A2 - Desmulliez, Marc
A2 - Mangan, Michael
A2 - Verschure, Paul F.M.J.
A2 - Prescott, Tony J.
PB - Springer International Publishing
T2 - 5th International Conference on Biomimetic and Biohybrid Systems, Living Machines 2016
Y2 - 19 July 2016 through 22 July 2016
ER -