AR Point&Click: An Interface for Setting Robot Navigation Goals

Abstract

This paper considers the problem of designating navigation goal locations for interactive mobile robots. We investigate a point-and-click interface, implemented with an Augmented Reality (AR) headset. The cameras on the AR headset are used to detect natural pointing gestures performed by the user. The selected goal is visualized through the AR headset, allowing users to adjust the goal location if desired. We conduct a user study in which participants set consecutive navigation goals for the robot using three different interfaces: AR Point&Click, Person Following, and Tablet (bird's-eye map view). Results show that the proposed AR Point&Click interface improved perceived accuracy and efficiency and reduced mental load compared to the baseline tablet interface, and that it performed on par with the Person Following method. These results show that AR Point&Click is a feasible interaction model for setting navigation goals.
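The paper does not publish its gesture-processing code, but the core idea of turning a detected pointing gesture into a navigation goal can be sketched as a ray-to-ground-plane intersection: given two tracked body points (e.g., shoulder and hand, both assumptions here rather than the authors' exact choice), extend the ray through them until it meets the floor plane. A minimal sketch:

```python
import numpy as np

def pointing_goal(shoulder: np.ndarray, hand: np.ndarray, ground_z: float = 0.0):
    """Project the shoulder->hand pointing ray onto the ground plane z = ground_z.

    Positions are 3D points in a world frame with z pointing up (an assumed
    convention). Returns the (x, y) navigation goal on the floor, or None if
    the ray does not descend toward the ground.
    """
    direction = hand - shoulder
    if direction[2] >= -1e-6:  # pointing level or upward: no ground intersection
        return None
    # Solve shoulder_z + t * direction_z = ground_z for the ray parameter t
    t = (ground_z - shoulder[2]) / direction[2]
    goal = shoulder + t * direction
    return goal[:2]

# Example: shoulder at 1.4 m height, hand slightly lower and forward
goal = pointing_goal(np.array([0.0, 0.0, 1.4]), np.array([0.3, 0.0, 1.2]))
# goal ≈ [2.1, 0.0]
```

In a real system the resulting point would then be handed to the robot's navigation stack as a goal pose; the abstract's interactive adjustment step corresponds to re-running this projection as the user refines the gesture, with the current goal rendered through the headset.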

Citation (APA)

Gu, M., Croft, E., & Cosgun, A. (2022). AR Point&Click: An Interface for Setting Robot Navigation Goals. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13817 LNAI, pp. 38–49). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-24667-8_4
