Through-Ice Acoustic Source Tracking Using Vision Transformers with Ordinal Classification


Abstract

Ice environments pose challenges for conventional underwater acoustic localization techniques due to their multipath and non-linear propagation characteristics. In this paper, we compare deep learning networks, including Transformers, Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and Vision Transformers (ViTs), for the passive localization and tracking of a single moving, on-ice acoustic source using two underwater acoustic vector sensors. We incorporate ordinal classification as a localization approach and compare the results with other standard methods. We conduct experiments in which the acoustic signature of an anthropogenic on-ice source is passively recorded, and we analyze these data. The results demonstrate that Vision Transformers are a strong contender for tracking moving acoustic sources on ice. Additionally, we show that classification as a localization technique can outperform regression for networks better suited to classification, such as CNNs and ViTs.
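The abstract does not give the paper's exact formulation, but the core idea of ordinal classification for localization can be sketched as follows: discretize source position into ordered bins and encode each bin as cumulative binary targets, so that predictions a few bins off are penalized less than those far off (unlike one-hot classification). The function names and bin scheme below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ordinal_encode(bin_index, num_bins):
    """Encode ordered class k (0..num_bins-1) as num_bins-1 cumulative
    binary targets: target j is 1 iff k > j. Hypothetical sketch, not
    the paper's exact encoding."""
    return (np.arange(num_bins - 1) < bin_index).astype(np.float32)

def ordinal_decode(probs, threshold=0.5):
    """Decode num_bins-1 sigmoid outputs back to a bin index by
    counting how many cumulative thresholds are exceeded."""
    return int((np.asarray(probs) > threshold).sum())

# Example: position bin 2 of 5 becomes [1, 1, 0, 0]; a network emitting
# sigmoid outputs [0.9, 0.8, 0.2, 0.1] decodes back to bin 2.
```

A network trained against these targets with a per-output binary cross-entropy loss respects the ordering of the position bins, which is the property that makes classification competitive with regression for tracking.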

Cite

APA

Whitaker, S., Barnard, A., Anderson, G. D., & Havens, T. C. (2022). Through-Ice Acoustic Source Tracking Using Vision Transformers with Ordinal Classification. Sensors, 22(13). https://doi.org/10.3390/s22134703
