Roopkotha: A Companion Robot for Enhancing Interactive Storytelling with Natural Interaction

Kazi Mayesha Mehzabin, Md Zahidul Islam, Md Ashaduzzaman Nur, Mohammad Shidujaman, Hooman Samani, Haipeng Mi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Roopkotha is a storytelling robot that seamlessly combines traditional storytelling methods with modern technology to create a captivating robot storyteller. We present a prototype that tells stories in a way that is easy to understand and enjoy. Roopkotha combines voice recognition with Bangla language processing, emotion recognition, and human behavior detection, and aims to revolutionize the way stories are told while engaging users on a deep emotional level. Furthermore, Roopkotha is equipped with advanced facial expressions and emotion recognition technology; the emotion recognition feature helps the robot form a profound connection with its users.
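The abstract does not include implementation details, but as a rough, illustrative sketch of the kind of pipeline it describes (Bangla voice recognition feeding an emotion-aware storytelling loop), the Python example below uses the speech_recognition package with Google's free web speech API for Bangla ("bn-BD"); detect_emotion and tell_story are hypothetical stand-ins for Roopkotha's emotion-recognition and storytelling modules, not the authors' code.

    # Illustrative sketch only; not the authors' implementation.
    # Assumes the `speech_recognition` package is installed and a microphone is available.
    # `detect_emotion` and `tell_story` are hypothetical placeholders.

    import speech_recognition as sr


    def detect_emotion(utterance: str) -> str:
        """Hypothetical placeholder for the robot's emotion-recognition module."""
        return "neutral"


    def tell_story(prompt: str, emotion: str) -> str:
        """Hypothetical placeholder: adapt the story response to the listener's emotion."""
        return f"(story adapted to a {emotion} listener, prompted by: {prompt})"


    def interaction_loop() -> None:
        recognizer = sr.Recognizer()
        with sr.Microphone() as source:
            recognizer.adjust_for_ambient_noise(source)
            print("Listening...")
            audio = recognizer.listen(source)

        try:
            # "bn-BD" requests Bangla (Bangladesh) from Google's web speech API.
            text = recognizer.recognize_google(audio, language="bn-BD")
        except sr.UnknownValueError:
            print("Could not understand the audio.")
            return

        emotion = detect_emotion(text)
        print(tell_story(text, emotion))


    if __name__ == "__main__":
        interaction_loop()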

Original language: English
Title of host publication: 3rd International Conference on Image Processing and Robotics, ICIPRoB 2024 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
ISBN (Electronic): 9798350374766
DOIs
Publication status: Published - 2024
Event: 3rd International Conference on Image Processing and Robotics, ICIPRoB 2024 - Hybrid, Colombo, Sri Lanka
Duration: 9 Mar 2024 – 10 Mar 2024

Publication series

Name: 3rd International Conference on Image Processing and Robotics, ICIPRoB 2024 - Proceedings

Conference

Conference: 3rd International Conference on Image Processing and Robotics, ICIPRoB 2024
Country/Territory: Sri Lanka
City: Hybrid, Colombo
Period: 9/03/24 – 10/03/24

Keywords

  • Bangla Language
  • Emotion Recognition
  • Facial Expression
  • Human Behavior
  • Human-Robot Interaction
