Deep learning locally trained wildlife sensing in real acoustic wetland environment

Abstract

We describe ‘Tidzam’, an application of deep learning that leverages a dense, multimodal sensor network installed across a large wetland restoration at Tidmarsh, a 600-acre former industrial-scale cranberry farm in Southern Massachusetts. Acoustic wildlife monitoring is a crucial metric for post-restoration evaluation, but it is also a challenge in such a noisy outdoor environment. This article presents the complete Tidzam system, designed to identify, in real time, the ambient sounds of weather conditions as well as sonic events such as insects, small animals, and local bird species from microphones deployed on the site. The experiment provides insight into the use of deep learning technology in a real deployment. The originality of this work lies in the system’s ability to construct its own database from local audio sampling under the supervision of human visitors and bird experts.
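To make the pipeline described above concrete, the sketch below shows one common way such a real-time acoustic classifier can be structured: each short chunk of microphone audio is converted to a log-mel spectrogram and passed through a small convolutional network that outputs a label such as "rain" or "bird". This is a minimal illustration only, not the authors' Tidzam implementation; the sample rate, chunk length, label set, and network architecture are all assumptions chosen for brevity.

```python
# Minimal sketch (not the authors' code) of per-chunk spectrogram
# classification, assuming 48 kHz mono input and an illustrative label set.
import torch
import torch.nn as nn
import torchaudio

SAMPLE_RATE = 48_000           # assumed microphone sample rate
CHUNK_SECONDS = 0.5            # length of each analysis window
CLASSES = ["rain", "wind", "insect", "bird", "quiet"]  # illustrative labels

# Log-mel spectrogram front end.
melspec = torchaudio.transforms.MelSpectrogram(
    sample_rate=SAMPLE_RATE, n_fft=1024, hop_length=256, n_mels=64
)
to_db = torchaudio.transforms.AmplitudeToDB()

class SoundClassifier(nn.Module):
    """Small CNN over a (1, n_mels, time) spectrogram image."""
    def __init__(self, n_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

model = SoundClassifier(len(CLASSES)).eval()

def classify_chunk(waveform: torch.Tensor) -> str:
    """Classify one mono audio chunk of shape (samples,)."""
    spec = to_db(melspec(waveform)).unsqueeze(0).unsqueeze(0)  # (1, 1, mels, time)
    with torch.no_grad():
        probs = model(spec).softmax(dim=-1).squeeze(0)
    return CLASSES[int(probs.argmax())]

# Stand-in for a chunk pulled from a streaming microphone buffer.
chunk = torch.randn(int(SAMPLE_RATE * CHUNK_SECONDS))
print(classify_chunk(chunk))
```

In a deployment such as the one described, the random stand-in chunk would be replaced by audio pulled continuously from the networked microphones, and the model would be trained on the locally collected, expert-labeled recordings rather than left at random initialization.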

Citation (APA)

Duhart, C., Dublon, G., Mayton, B., & Paradiso, J. (2019). Deep learning locally trained wildlife sensing in real acoustic wetland environment. In Communications in Computer and Information Science (Vol. 968, pp. 3–14). Springer Verlag. https://doi.org/10.1007/978-981-13-5758-9_1
