Accurate Underwater ATR in Forward-Looking Sonar Imagery Using Deep Convolutional Neural Networks


Abstract

Underwater automatic target recognition (ATR) is a challenging task for marine robots because of the complexity of the underwater environment. Existing recognition methods largely rely on hand-crafted features and classifiers, which struggle to achieve satisfactory recognition accuracy. In this paper, we propose a novel method for accurate multiclass underwater ATR using a forward-looking Echoscope sonar and deep convolutional neural networks (DCNNs). A complete recognition pipeline, from data preprocessing through network training to image recognition, is realized. First, we built a dataset of real, measured Echoscope sonar images. Inspired by the human visual attention mechanism, suspected target regions are extracted during preprocessing via the graph-based manifold ranking method. Second, an end-to-end DCNN model, named EchoNet, is designed for Echoscope sonar image feature extraction and recognition. Finally, a training scheme based on transfer learning is developed to mitigate the shortage of training data, and mini-batch gradient descent is used for network optimization. Experimental results demonstrate that the method runs efficiently and reaches 97.3% recognition accuracy on a nine-class underwater ATR task, outperforming traditional feature-based methods. The proposed method is a promising technology for the intelligent perception of autonomous underwater vehicles.
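The abstract names graph-based manifold ranking as the mechanism for extracting suspected target regions. The paper's specific graph construction over sonar image regions is not reproduced here; as a hedged illustration of the core ranking step only, the closed-form score f* = (I − αS)⁻¹y, with S the symmetrically normalized affinity matrix and y the query indicator, can be sketched in NumPy (all names and the toy graph below are hypothetical):

```python
import numpy as np

def manifold_ranking(W, seed_idx, alpha=0.5):
    """Closed-form graph-based manifold ranking on an affinity matrix.

    W        : symmetric non-negative (n x n) affinity matrix, zero diagonal.
    seed_idx : indices of query nodes (e.g. suspected target seeds).
    alpha    : diffusion strength in (0, 1).

    Returns f* = (I - alpha * S)^{-1} y, where S = D^{-1/2} W D^{-1/2}
    is the normalized affinity and y is 1 on seeds, 0 elsewhere.
    """
    n = W.shape[0]
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))  # guard isolated nodes
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    y = np.zeros(n)
    y[np.asarray(seed_idx)] = 1.0
    # Solve (I - alpha * S) f = y instead of forming an explicit inverse.
    return np.linalg.solve(np.eye(n) - alpha * S, y)

# Toy example: a 5-node chain graph with node 0 as the query seed.
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1.0
scores = manifold_ranking(W, seed_idx=[0])
```

On the chain, scores decay monotonically with graph distance from the seed, which is the property the saliency-style preprocessing exploits: regions strongly connected to the query rank high, weakly connected background ranks low.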

Citation (APA)

Jin, L., Liang, H., & Yang, C. (2019). Accurate Underwater ATR in Forward-Looking Sonar Imagery Using Deep Convolutional Neural Networks. IEEE Access, 7, 125522–125531. https://doi.org/10.1109/ACCESS.2019.2939005
