Augmented reality stereoscopic visualization for intuitive robot teleguide

S. Livatino, G. Muscato, D. De Tommaso, M. Macaluso

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This paper proposes a method to simultaneously and coherently present visual and laser sensor information through an augmented reality visualization interface, further enhanced by stereoscopic viewing. Graphical objects are used to represent proximity measurements, which are superimposed on the video stream and suitably aligned to it through image processing. This methodology enables an operator to quickly comprehend scene layout and dynamics, and to respond accurately and in a timely manner; human-robot interaction is therefore expected to be intuitive, accurate and fast. The use of graphical elements to assist teleoperation, occasionally discussed in the literature, is here proposed following a systematic approach and developed on the basis of the authors' previous work on stereoscopic teleoperation. The approach is tested on a real telerobotic system in which a user operates a robot located approximately 3,000 kilometers away. The results of a pilot test were very encouraging: they showed the simplicity and effectiveness of the proposed approach and provide a basis for further investigation.
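The abstract does not detail the overlay pipeline, so the sketch below is only an illustration of one plausible way to superimpose colour-coded laser proximity readings on a camera frame, assuming a calibrated pinhole camera and a known laser-to-camera transform. All calibration values, the laser mounting offset, and the helper names (laser_scan_to_points, overlay_proximity) are assumptions for this example, not elements of the paper.

    import numpy as np
    import cv2

    # Assumed camera intrinsics in pixels and negligible lens distortion.
    K = np.array([[525.0, 0.0, 320.0],
                  [0.0, 525.0, 240.0],
                  [0.0, 0.0, 1.0]])
    dist_coeffs = np.zeros(5)

    # Assumed rigid laser-to-camera transform: axes aligned, laser ~10 cm below the camera.
    rvec_laser_to_cam = np.zeros(3)
    tvec_laser_to_cam = np.array([0.0, 0.10, 0.0])

    def laser_scan_to_points(ranges, angle_min, angle_increment):
        """Convert planar laser ranges to 3D points in camera-style axes
        (x right, y down, z forward)."""
        angles = angle_min + angle_increment * np.arange(len(ranges))
        x = ranges * np.sin(angles)      # lateral offset
        y = np.zeros_like(ranges)        # scan plane at laser height
        z = ranges * np.cos(angles)      # depth along the optical axis
        return np.stack([x, y, z], axis=1).astype(np.float32)

    def overlay_proximity(frame, ranges, angle_min=-1.57, angle_increment=0.01,
                          max_range=5.0):
        """Draw one marker per laser reading; colour encodes proximity
        (red = close obstacle, green = far)."""
        ranges = np.asarray(ranges, dtype=np.float32)
        pts = laser_scan_to_points(ranges, angle_min, angle_increment)
        in_front = pts[:, 2] > 0.0       # ignore readings behind the camera
        pts, ranges = pts[in_front], ranges[in_front]
        img_pts, _ = cv2.projectPoints(pts, rvec_laser_to_cam, tvec_laser_to_cam,
                                       K, dist_coeffs)
        h, w = frame.shape[:2]
        for (u, v), r in zip(img_pts.reshape(-1, 2), ranges):
            if not (0 <= u < w and 0 <= v < h):
                continue                 # reading outside the field of view
            closeness = 1.0 - min(float(r) / max_range, 1.0)
            colour = (0, int(255 * (1.0 - closeness)), int(255 * closeness))  # BGR
            cv2.circle(frame, (int(u), int(v)), 4, colour, -1)
        return frame

In a teleoperation loop, each incoming video frame would be passed through overlay_proximity (once per eye for stereoscopic viewing), so that obstacles reported by the laser appear anchored to the corresponding objects in the image.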
Original language: English
Title of host publication: Proceedings of the IEEE International Symposium on Industrial Electronics, Art. No. 5636955
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 2828-2833
ISBN (Print): 978-1-4244-6390-9
DOIs
Publication status: Published - 2010
