University of Hertfordshire

A Multimodal User Interface for an Assistive Robotic Shopping Cart

Research output: Contribution to journal › Article › peer-review

  • Dmitry Ryumin
  • Ildar Kagirov
  • Alexandr Axyonov
  • Nikita Pavlyuk
  • Anton Saveliev
  • Irina Kipyatkova
  • Milos Zelezny
  • Iosif Mporas
  • Alexey Karpov
Original language: English
Journal: Electronics
Volume: 9
Issue: 12
DOIs
Publication status: Published - 8 Dec 2020

Abstract

This paper presents the research and development of a prototype assistive mobile information robot (AMIR). The main features of the prototype are voice- and gesture-based interfaces, with Russian speech and sign language recognition and synthesis techniques, and a high degree of robot autonomy. The AMIR prototype is designed to serve as a robotic cart for shopping in grocery stores and supermarkets. The main topics covered in this paper are the presentation of the interface (three modalities), the single-handed gesture recognition system (based on a collected database of Russian sign language elements), and the technical description of the robotic platform (architecture, navigation algorithm). The use of multimodal interfaces, namely the speech and gesture modalities, makes human-robot interaction natural and intuitive, while sign language recognition allows hearing-impaired people to use this robotic cart. The AMIR prototype has promising prospects for real-world use in supermarkets, both due to its assistive capabilities and its multimodal user interface.

ID: 23125457