Human Detection and Motion Analysis from a Quadrotor UAV

Abstract

This work focuses on detecting humans and estimating their pose and trajectory from an unmanned aerial vehicle (UAV). In our framework, a human detection model is trained using a Region-based Convolutional Neural Network (R-CNN). Each video frame is corrected for perspective using a projective transformation. Using Histograms of Oriented Gradients (HOG) of the silhouettes as features, the detected human figures are then classified by pose. A dynamic classifier is developed to estimate forward-walking and turning gait sequences. The estimated poses are used to infer the shape of the trajectory traversed by the human subject. The detector achieves an average precision of 98%. Experiments on aerial videos confirm that our solution achieves accurate pose and trajectory estimation across different kinds of perspective-distorted video. For example, for a video recorded 40 m above ground, perspective correction improves pose and viewpoint estimation accuracy by 37.1% and 17.8%, respectively.
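The perspective correction mentioned in the abstract amounts to applying a planar projective transformation (homography) to image coordinates. The sketch below is a minimal, self-contained illustration of that idea, not code from the paper: the point correspondences are invented, and the homography is solved with a standard direct linear transform (DLT) in NumPy.

```python
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 projective transformation H to an Nx2 array of points."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # back to Cartesian

def homography_from_4pts(src, dst):
    """Solve for H (up to scale) from four correspondences via DLT/SVD."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.array(A))
    return Vt[-1].reshape(3, 3)   # null-space vector = flattened H

# Illustrative correspondences (our assumption): the trapezoid that the
# tilted UAV camera sees is mapped onto a rectangle on the ground plane.
src = np.array([[100, 400], [540, 400], [420, 120], [220, 120]], float)
dst = np.array([[0, 0], [440, 0], [440, 440], [0, 440]], float)

H = homography_from_4pts(src, dst)
corrected = warp_points(H, src)   # maps the trapezoid corners onto dst
```

In a full pipeline, the same `H` would be passed to an image-warping routine (e.g. OpenCV's `cv2.warpPerspective`) so that every frame is rectified before silhouette and HOG feature extraction.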

Citation (APA)

Perera, A. G., Al-Naji, A., Law, Y. W., & Chahl, J. (2018). Human Detection and Motion Analysis from a Quadrotor UAV. In IOP Conference Series: Materials Science and Engineering (Vol. 405). Institute of Physics Publishing. https://doi.org/10.1088/1757-899X/405/1/012003
