BirdViewAR: Surroundings-aware Remote Drone Piloting Using an Augmented Third-person Perspective

Abstract

We propose BirdViewAR, a surroundings-aware remote drone-operation system that significantly enhances pilots' spatial awareness through an augmented third-person view (TPV) from an autopiloted secondary follower drone. The follower drone responds to the main drone's motions and heading using our optimization-based autopilot, allowing pilots to clearly observe the main drone and its imminent destination without extra input. To improve their understanding of the spatial relationships between the main drone and its surroundings, the TPV is visually augmented with AR-overlay graphics that highlight the main drone's spatial status: its heading, altitude, ground position, camera field-of-view (FOV), and proximity areas. We discuss BirdViewAR's design and implement a proof-of-concept prototype using programmable drones. Finally, we conduct a preliminary outdoor user study and find that BirdViewAR effectively improved spatial awareness and piloting performance.
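The abstract does not spell out how the follower drone's viewpoint is chosen; the paper describes an optimization-based autopilot, but its objective is not given here. The sketch below is therefore only a minimal geometric illustration of the third-person-view idea: place the follower behind and above the main drone along its heading so that both the drone and the space ahead of it fall inside the follower's camera frame. The function name `follower_target_pose` and the offset parameters are hypothetical, not from the paper.

```python
import math

def follower_target_pose(main_pos, main_heading_deg,
                         back_offset=6.0, up_offset=3.0):
    """Hypothetical TPV placement: position the follower drone behind and
    above the main drone along its heading, then aim the camera at it.
    main_pos is (x, y, z) in meters; heading is degrees from the +x axis."""
    h = math.radians(main_heading_deg)
    x, y, z = main_pos
    # Step back along the heading vector and raise the altitude.
    fx = x - back_offset * math.cos(h)
    fy = y - back_offset * math.sin(h)
    fz = z + up_offset
    # Point the follower's camera toward the main drone.
    yaw = math.degrees(math.atan2(y - fy, x - fx))
    pitch = -math.degrees(math.atan2(fz - z, back_offset))
    return (fx, fy, fz), yaw, pitch

# Example: main drone at (10, 5, 2) m, heading 90 degrees (flying along +y).
pose, yaw, pitch = follower_target_pose((10.0, 5.0, 2.0), 90.0)
print(pose, yaw, pitch)
```

This heuristic ignores the viewpoint-quality and smoothness trade-offs an optimization-based autopilot would handle; it only shows the basic follow-from-behind geometry that the augmented TPV builds on.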

Citation (APA)

Inoue, M., Takashima, K., Fujita, K., & Kitamura, Y. (2023). BirdViewAR: Surroundings-aware remote drone piloting using an augmented third-person perspective. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI '23). Association for Computing Machinery. https://doi.org/10.1145/3544548.3580681
