Self-localization of unmanned aerial vehicles based on optical flow in onboard camera images

Abstract

This paper proposes and evaluates the implementation of a self-localization system intended for use in Unmanned Aerial Vehicles (UAVs). Accurate localization is necessary for UAVs for efficient stabilization, navigation, and collision avoidance. Conventionally, this requirement is fulfilled using external hardware infrastructure, such as a Global Navigation Satellite System (GNSS) or a camera-based motion capture system (VICON-like [37]). These approaches are, however, not applicable in environments where deployment of cumbersome motion capture equipment is not feasible, nor in GNSS-denied environments. Systems based on Simultaneous Localization and Mapping (SLAM) require heavy and expensive onboard equipment and large amounts of data transmission for sharing maps between UAVs. A system without these drawbacks is crucial for deploying tight formations of multiple fully autonomous micro UAVs in both outdoor and indoor missions. The project was inspired by the widely used PX4FLOW Smart Camera sensor [12]. The aim was to develop a similar sensor without the drawbacks observed in its use, while making its operation more transparent and independent of specific hardware. Our proposed solution requires only a lightweight camera and a single-point range sensor. It is based on optical flow estimated from consecutive images of a downward-facing camera, coupled with a specialized RANSAC-inspired post-processing method that takes flight dynamics into account. This filtering makes the system more robust against imperfect lighting, homogeneous ground patches, random close objects, and spurious errors. These features make the approach suitable even for coordinated flights through demanding forest-like environments. The system is designed mainly for horizontal velocity estimation, but specialized modifications were also made for estimating vertical speed and yaw rotation rate. These methods were tested in a simulator and subsequently in real-world conditions. The tests showed that the sensor is sufficiently reliable and accurate to be usable in practice.
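The two central steps the abstract describes, scaling pixel-level optical flow to metric velocity using the height from the range sensor, and rejecting outlier flow vectors with a RANSAC-style consensus search, can be illustrated with a short sketch. The Python code below is not the authors' implementation: the pinhole-camera scaling formula is a standard assumption, and all names and parameter values (flow_to_velocity, ransac_consensus, focal_px, fps, height_m, the inlier threshold) are illustrative.

import numpy as np

def flow_to_velocity(flow_px, height_m, focal_px, fps):
    """Scale a pixel-displacement vector to metric velocity.

    For a pinhole camera at height h looking straight down, a ground
    displacement d maps to a pixel displacement p = f * d / h, so
    d = p * h / f, and velocity = d * fps.
    """
    return flow_px * height_m / focal_px * fps

def ransac_consensus(flows, threshold=0.3, iters=50, seed=None):
    """Pick the flow hypothesis with the largest inlier set.

    Each candidate hypothesis is a single flow vector; inliers are the
    vectors within `threshold` (px/frame) of it. Returns the mean of
    the winning inlier set as the consensus flow.
    """
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(iters):
        candidate = flows[rng.integers(len(flows))]
        dist = np.linalg.norm(flows - candidate, axis=1)
        inliers = flows[dist < threshold]
        if best_inliers is None or len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers.mean(axis=0)

# Usage: 20 flow vectors (px/frame), mostly consistent ground motion
# plus a few outliers (e.g. a close object crossing the camera view).
flows = np.vstack([
    np.random.default_rng(0).normal([3.0, -1.0], 0.05, (17, 2)),
    np.array([[12.0, 5.0], [-8.0, 2.0], [0.5, 9.0]]),
])
consensus = ransac_consensus(flows, seed=1)
vel = flow_to_velocity(consensus, height_m=2.5, focal_px=420.0, fps=30.0)
print("estimated horizontal velocity [m/s]:", vel)

Choosing the hypothesis with the largest inlier set, rather than averaging all vectors, is what discards flow from nearby objects that move inconsistently with the ground plane; the paper's method additionally constrains the accepted hypotheses using the flight dynamics of the UAV.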

Citation (APA)

Walter, V., Novák, T., & Saska, M. (2018). Self-localization of unmanned aerial vehicles based on optical flow in onboard camera images. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10756 LNCS, pp. 106–132). Springer Verlag. https://doi.org/10.1007/978-3-319-76072-8_8
