Vision and learning for deliberative monocular cluttered flight

Abstract

Cameras provide a rich source of information while being passive, cheap, and lightweight for small Unmanned Aerial Vehicles (UAVs). In this work we present the first implementation of receding horizon control, which is widely used in ground vehicles, with monocular vision as the only sensing mode for autonomous UAV flight in dense clutter. Two key contributions make this possible: a novel coupling of perception and control via relevant and diverse multiple interpretations of the scene around the robot, and the leveraging of recent advances in machine learning for anytime budgeted cost-sensitive feature selection and fast non-linear regression for monocular depth prediction. We empirically demonstrate the efficacy of our approach.
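The pipeline the abstract describes — predict depth from a single camera image, score candidate maneuvers against that prediction, commit to the best one, and re-plan at the next frame — can be sketched as a minimal receding-horizon loop. This is an illustrative toy under stated assumptions, not the authors' implementation: `predict_depth` stands in for the paper's learned non-linear depth regressor, and candidate trajectories are reduced to image columns; all names here are hypothetical.

```python
import random

def predict_depth(image):
    # Stand-in for a learned monocular depth regressor (hypothetical):
    # average each column of the image to get a per-direction depth estimate.
    n_rows, n_cols = len(image), len(image[0])
    return [sum(row[c] for row in image) / n_rows for c in range(n_cols)]

def receding_horizon_step(image, candidate_cols):
    # One receding-horizon iteration: predict depth, score each candidate
    # direction by its predicted clearance, and commit to the best.
    # In a real system only the first control action is executed before
    # the loop repeats on the next camera frame.
    depths = predict_depth(image)
    return max(candidate_cols, key=lambda c: depths[c])

# Toy scene: random clutter, with one artificially open (deep) column.
random.seed(0)
image = [[random.random() for _ in range(64)] for _ in range(48)]
for row in image:
    row[40] = 5.0  # deep = unobstructed in this toy depth convention

best = receding_horizon_step(image, [8, 24, 40, 56])
# best == 40: the planner steers toward the most open direction
```

The key design point, as in the paper's setting, is that planning never commits to a full trajectory: each frame's depth prediction triggers a fresh selection over the candidate maneuver library.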

Citation (APA)

Dey, D., Shankar, K. S., Zeng, S., Mehta, R., Agcayazi, M. T., Eriksen, C., … Bagnell, J. A. (2016). Vision and learning for deliberative monocular cluttered flight. In Springer Tracts in Advanced Robotics (Vol. 113, pp. 391–409). Springer Verlag. https://doi.org/10.1007/978-3-319-27702-8_26
