Towards Visible and Thermal Drone Monitoring with Convolutional Neural Networks

28 citations · 50 Mendeley readers

Abstract

This paper reports a visible and thermal drone monitoring system that integrates deep-learning-based detection and tracking modules. The biggest challenge in adopting deep learning methods for drone detection is the paucity of training drone images, especially thermal drone images. To address this issue, we develop two data augmentation techniques. One is a model-based drone augmentation technique that automatically generates visible drone images with a bounding-box label on the drone's location. The other exploits an adversarial data augmentation methodology to create thermal drone images. To track a small flying drone, we utilize the residual information between consecutive image frames. Finally, we present an integrated detection and tracking system that outperforms each individual module, whether detection-only or tracking-only. The experiments show that, even when trained on synthetic data, the proposed system performs well on real-world drone images with complex backgrounds. The USC drone detection and tracking dataset, with user-labeled bounding boxes, is available to the public.
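
The model-based augmentation described above composites drone imagery onto background scenes and records the bounding box automatically, so no manual labeling is needed. The following is a minimal sketch of that idea, assuming a pre-cut RGBA drone image (drone_cutout.png) and a background frame (background.jpg); the file names, scale range, and helper function are illustrative assumptions, not the authors' actual pipeline.

    import numpy as np
    from PIL import Image

    def composite_drone(background, drone_cutout, top_left):
        """Paste an RGBA drone cutout onto a background frame and return
        the composited image plus its (x, y, w, h) bounding-box label."""
        frame = background.copy()
        x, y = top_left
        # The cutout's alpha channel is used as the paste mask.
        frame.paste(drone_cutout, (x, y), mask=drone_cutout)
        w, h = drone_cutout.size
        return frame, (x, y, w, h)

    # Example: place a randomly scaled cutout at a random location.
    background = Image.open("background.jpg").convert("RGB")
    drone = Image.open("drone_cutout.png").convert("RGBA")

    scale = np.random.uniform(0.05, 0.2)  # drones appear small in the frame
    drone = drone.resize((max(1, int(drone.width * scale)),
                          max(1, int(drone.height * scale))))
    x = np.random.randint(0, background.width - drone.width)
    y = np.random.randint(0, background.height - drone.height)

    image, bbox = composite_drone(background, drone, (x, y))
    image.save("augmented.jpg")
    print("bounding box (x, y, w, h):", bbox)

Because the paste location and cutout size are known, the bounding-box label comes for free with each generated image, which is what makes this kind of synthetic augmentation attractive when real labeled drone images are scarce.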

Citation (APA)

Wang, Y., Chen, Y., Choi, J., & Kuo, C. C. J. (2019). Towards Visible and Thermal Drone Monitoring with Convolutional Neural Networks. APSIPA Transactions on Signal and Information Processing, 8. https://doi.org/10.1017/ATSIP.2018.30
