Third point of view augmented reality for robot intentions visualization


Abstract

Lightweight head-up displays integrated into industrial helmets make it possible to provide contextual information in industrial scenarios such as maintenance. Moving from single-display, single-camera solutions to stereo perception and display opens new interaction possibilities. In particular, this paper addresses the case of information shared by a Baxter robot and displayed to a user overlooking the real scene. The system design and interaction ideas are presented.

Citation (APA)

Ruffaldi, E., Brizzi, F., Tecchia, F., & Bacinelli, S. (2016). Third point of view augmented reality for robot intentions visualization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9768, pp. 471–478). Springer Verlag. https://doi.org/10.1007/978-3-319-40621-3_35
