Advancing Emotionally Aware Child–Robot Interaction with Biophysical Data and Insight-Driven Affective Computing

Diego Resende Faria, Amie Louise Godkin, Pedro Paulo da Silva Ayrosa

Research output: Contribution to journal › Article › peer-review


Abstract

This paper investigates the integration of affective computing techniques using biophysical data to advance emotionally aware machines and enhance child–robot interaction (CRI). By leveraging interdisciplinary insights from neuroscience, psychology, and artificial intelligence, the study focuses on creating adaptive, emotion-aware systems capable of dynamically recognizing and responding to human emotional states. Through a real-world CRI pilot study involving the NAO robot, this research demonstrates how facial expression analysis and speech emotion recognition can be employed to detect and address negative emotions in real time, fostering positive emotional engagement. The emotion recognition system combines handcrafted and deep learning features for facial expressions, achieving an 85% classification accuracy during real-time CRI, while speech emotions are analyzed using acoustic features processed through machine learning models with an 83% accuracy rate. Offline evaluation of the combined emotion dataset using a Dynamic Bayesian Mixture Model (DBMM) achieved a 92% accuracy for facial expressions, and the multilingual speech dataset yielded 98% accuracy for speech emotions using the DBMM ensemble. Observations from psychological and technological aspects, coupled with statistical analysis, reveal the robot’s ability to transition negative emotions into neutral or positive states in most cases, contributing to emotional regulation in children. This work underscores the potential of emotion-aware robots to support therapeutic and educational interventions, particularly for pediatric populations, while setting a foundation for developing personalized and empathetic human–machine interactions. These findings demonstrate the transformative role of affective computing in bridging the gap between technological functionality and emotional intelligence across diverse domains.
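The DBMM ensemble described in the abstract fuses the class posteriors of several base classifiers, weighting each by its confidence and carrying the fused belief forward in time as a dynamic prior. As a rough, non-authoritative sketch of that idea (the function names, the entropy-based weighting, and the multiplicative temporal update are illustrative assumptions, not the authors' exact formulation):

```python
import numpy as np

def entropy_weights(posteriors):
    """Confidence-based mixing weights: classifiers whose posteriors have
    lower entropy (i.e., are more peaked/confident) receive more weight."""
    ent = np.array([-(p * np.log(p + 1e-12)).sum() for p in posteriors])
    conf = np.log(len(posteriors[0])) - ent   # max possible entropy minus entropy
    return conf / conf.sum()

def dbmm_fuse(posteriors, weights, prior):
    """One update step of a DBMM-style ensemble.

    posteriors: list of per-classifier class-probability vectors at time t
    weights:    per-classifier mixing weights (sum to 1)
    prior:      fused belief from time t-1, acting as the dynamic prior
    """
    mixture = sum(w * p for w, p in zip(weights, posteriors))
    belief = prior * mixture            # temporal prior times current evidence
    return belief / belief.sum()        # renormalize to a valid distribution

# Illustrative use: two classifiers over three emotion classes
p1 = np.array([0.7, 0.2, 0.1])
p2 = np.array([0.5, 0.3, 0.2])
w = entropy_weights([p1, p2])
belief = dbmm_fuse([p1, p2], w, prior=np.full(3, 1 / 3))
```

At each frame the previous fused belief replaces the uniform prior, so agreement across classifiers and consistency over time both sharpen the final emotion estimate.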
Original language: English
Article number: 1161
Pages (from-to): 1-28
Number of pages: 28
Journal: Sensors
Volume: 25
Issue number: 4
Early online date: 14 Feb 2025
Publication status: Published - 14 Feb 2025

Keywords

  • affective computing
  • child–robot interaction
  • emotion-aware technology
  • Artificial Intelligence
  • Humans
  • Facial Expression
  • Male
  • Emotions/physiology
  • Machine Learning
  • Speech/physiology
  • Bayes Theorem
  • Female
  • Child
  • Robotics/methods
