A hierarchical vision-based UAV localization for an open landing


Abstract

Localizing an unmanned aerial vehicle (UAV) for autonomous landing is challenging because the relative position of the landing target is largely inaccessible and the target typically has no communication link with the UAV. In this paper, a hierarchical vision-based localization framework for rotor UAVs is proposed for an open landing. The framework divides the landing into three phases: “Approaching”, “Adjustment”, and “Touchdown”. Object features at different scales are extracted from a designed Robust and Quick Response Landing Pattern (RQRLP), and corresponding detection and localization methods are introduced for each of the three phases. A customized federated Extended Kalman Filter (EKF) structure then uses the solutions of the three phases as independent measurements to estimate the pose of the vehicle. This structure integrates the vision solutions and keeps the estimation smooth and robust. Finally, several typical field experiments were carried out to verify the proposed hierarchical vision framework, showing that it extends the localization range while ensuring precision.
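The federated EKF described in the abstract fuses the three phase solutions as independent measurements. As a rough illustration only (not the paper's implementation), the core fusion step of a federated filter can be sketched as an information-weighted combination of the local estimates; the function name, state layout, and covariance values below are hypothetical.

```python
import numpy as np

def fuse_estimates(estimates):
    """Fuse independent local estimates (x_i, P_i) via information-weighted
    combination, the master-filter step of a federated Kalman filter.
    Illustrative sketch; the paper's filter also propagates full EKF
    dynamics for each landing phase."""
    info_sum = np.zeros_like(estimates[0][1])   # accumulated information matrix
    info_state = np.zeros_like(estimates[0][0])  # accumulated information vector
    for x, P in estimates:
        P_inv = np.linalg.inv(P)
        info_sum += P_inv
        info_state += P_inv @ x
    P_fused = np.linalg.inv(info_sum)
    x_fused = P_fused @ info_state
    return x_fused, P_fused

# Hypothetical position estimates (meters) from the three landing phases,
# with covariances reflecting their relative measurement scales.
approaching = (np.array([1.0, 2.0, 10.0]), np.eye(3) * 4.0)
adjustment  = (np.array([1.2, 1.8,  9.5]), np.eye(3) * 1.0)
touchdown   = (np.array([1.1, 1.9,  9.8]), np.eye(3) * 0.25)

x_fused, P_fused = fuse_estimates([approaching, adjustment, touchdown])
```

Because the combination is weighted by inverse covariance, the fused covariance is smaller than any individual one, and the most precise phase ("Touchdown" here) dominates the estimate.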

Citation (APA)

Yuan, H., Xiao, C., Xiu, S., Zhan, W., Ye, Z., Zhang, F., … Li, Q. (2018). A hierarchical vision-based UAV localization for an open landing. Electronics (Switzerland), 7(5). https://doi.org/10.3390/electronics7050068
