A low-level active vision framework for collaborative unmanned aircraft systems


Abstract

Micro unmanned aerial vehicles are becoming increasingly interesting for aiding and collaborating with human agents in a wide range of applications, and in particular for monitoring inaccessible or dangerous areas. In order to interact with and monitor humans, these systems need robust, real-time computer vision subsystems that allow them to detect and follow persons. In this work, we propose a low-level active vision framework to accomplish these challenging tasks. Based on the LinkQuad platform, we present a system study that implements the detection and tracking of people under fully autonomous flight conditions, keeping the vehicle within a certain distance of a person. The framework integrates state-of-the-art methods from visual detection and tracking, Bayesian filtering, and AI-based control. The results from our experiments clearly suggest that the proposed framework performs real-time detection and tracking of persons in complex scenarios.
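
To make the Bayesian filtering step concrete, the sketch below shows one common way such a component can be realized: a constant-velocity Kalman filter that fuses noisy person detections into a smoothed 2D position and velocity estimate, which a distance-keeping controller could then consume. This is an illustrative sketch only, not the authors' implementation; the state model, time step, and noise parameters are assumptions.

    import numpy as np

    dt = 0.1                                   # assumed frame interval [s]
    F = np.array([[1, 0, dt, 0],               # state transition for [x, y, vx, vy]
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    H = np.array([[1, 0, 0, 0],                # only position (x, y) is observed
                  [0, 1, 0, 0]], dtype=float)
    Q = 0.05 * np.eye(4)                       # assumed process noise covariance
    R = 0.5 * np.eye(2)                        # assumed detection noise covariance

    x = np.zeros(4)                            # initial state estimate
    P = np.eye(4)                              # initial state covariance

    def kalman_step(x, P, z):
        """One predict/update cycle given a person detection z = (x, y)."""
        # Predict
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # Update
        innovation = z - H @ x_pred
        S = H @ P_pred @ H.T + R               # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
        x_new = x_pred + K @ innovation
        P_new = (np.eye(4) - K @ H) @ P_pred
        return x_new, P_new

    # Example: feed a few noisy detections of a slowly moving person
    for z in [np.array([1.0, 2.0]), np.array([1.1, 2.1]), np.array([1.2, 2.2])]:
        x, P = kalman_step(x, P, z)
    print("estimated position:", x[:2], "estimated velocity:", x[2:])

In practice the measurement z would come from the visual person detector, and the filtered estimate would feed the control layer that keeps the vehicle at the desired distance.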

Cite

APA

Danelljan, M., Khan, F. S., Felsberg, M., Granström, K., Heintz, F., Rudol, P., … Doherty, P. (2015). A low-level active vision framework for collaborative unmanned aircraft systems. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8925, pp. 223–237). Springer Verlag. https://doi.org/10.1007/978-3-319-16178-5_15
