Abstract
Mobile robotic systems have evolved to include sensors capable of describing robot status and the operating environment more accurately and reliably than ever before. Exploiting these sensor data effectively remains challenging, however, because the sheer volume of data and its time-dependency constraints impose a heavy cognitive load on the operator. This paper addresses this challenge in remote-vehicle teleoperation by proposing an intuitive way to present sensor data to users through mixed reality and visual aids within the user interface. We propose a method for organizing information presentation and a set of visual aids that facilitate visual communication of data in teleoperation control panels. The resulting sensor-information presentation is coherent and intuitive, making it easier for an operator to grasp and comprehend the meaning of the information; this increases situational awareness and speeds up decision-making. Our method is implemented on a real mobile robotic system operating outdoors, equipped with on-board internal and external sensors, GPS, and a reconstructed 3D graphical model provided by an assistant drone. Experiments verified feasibility, and an assessment confirmed that the visual communication is intuitive and comprehensible, which encourages further development.
| Original language | English |
|---|---|
| Pages (from-to) | 25795-25808 |
| Number of pages | 14 |
| Journal | IEEE Access |
| Volume | 9 |
| DOIs | |
| Publication status | Published - 8 Feb 2021 |
Keywords
- virtual reality and interfaces
- mixed reality
- human-robot interface
- head mounted display
- robot teleoperation
- stereo vision
- field robotics
- telerobotics
- human-robot interaction
- graphical user interfaces
- augmented reality
- user interfaces
- virtual reality