University of Hertfordshire


Intuitive Robot Teleoperation through Multi-Sensor Informed Mixed Reality Visual Aids

Research output: Contribution to journal › Article › peer-review


  • Salvatore Livatino
  • Dario C. Guastella
  • Giovanni Muscato
  • Vincenzo Rinaldi
  • Luciano Cantelli
  • Carmelo D. Melita
  • Alessandro Caniglia
  • Riccardo Mazza
  • Gianluca Padula
Original language: English
Pages (from-to): 25795-25808
Number of pages: 14
Journal: IEEE Access
Publication status: Published - 8 Feb 2021


Mobile robotic systems have evolved to include sensors capable of describing robot status and the operating environment more accurately and reliably than ever before. Exploiting this sensor data effectively remains challenging, however, because the sheer volume of data and its time-dependency constraints impose a high cognitive load on the operator. This paper addresses this challenge in remote-vehicle teleoperation by proposing an intuitive way to present sensor data to users through mixed reality and visual aids within the user interface. We propose a method for organizing information presentation and a set of visual aids that facilitate visual communication of data in teleoperation control panels. The resulting sensor-information presentation is coherent and intuitive, making it easier for an operator to grasp and comprehend the meaning of the information. This increases situational awareness and speeds up decision-making. Our method is implemented on a real mobile robotic system operating outdoors, equipped with on-board internal and external sensors, GPS, and a reconstructed 3D graphical model provided by an assistant drone. Experimentation verified feasibility, and an assessment confirmed that the visual communication is intuitive and comprehensive, encouraging further development.
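The abstract describes condensing many raw sensor readings into visual aids so the operator can grasp status at a glance. As a minimal illustrative sketch only (not code from the paper; the sensor names, thresholds, and traffic-light cues are hypothetical assumptions), one simple form of such an aid maps each reading against warning and critical thresholds to a color-coded status cue:

```python
# Hypothetical sketch: map raw sensor readings to color-coded visual cues,
# so an operator scans status at a glance instead of parsing raw numbers.
# Sensor names and threshold values below are illustrative, not from the paper.

def cue_for(value, warn, crit):
    """Return a traffic-light cue for a reading against warn/crit thresholds."""
    if value >= crit:
        return "red"
    if value >= warn:
        return "amber"
    return "green"

def panel_cues(readings, limits):
    """Build a {sensor: cue} mapping for a teleoperation control panel."""
    return {name: cue_for(readings[name], *limits[name]) for name in readings}

readings = {"battery_temp_C": 71.0, "gps_hdop": 1.2, "tilt_deg": 4.0}
limits   = {"battery_temp_C": (60, 80), "gps_hdop": (2.0, 5.0), "tilt_deg": (10, 20)}
print(panel_cues(readings, limits))
# → {'battery_temp_C': 'amber', 'gps_hdop': 'green', 'tilt_deg': 'green'}
```

In a mixed-reality interface the resulting cues would drive overlay graphics rather than console output, but the underlying idea is the same: pre-digest the data so the visual channel carries meaning, not numbers.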


© 2021 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License.

ID: 19694561