Maritime Tracking with Georeferenced Multi-Camera Fusion

Abstract

Cameras form an essential part of any autonomous surface vehicle's sensor package, both for COLREGs compliance in detecting light signals and for identifying and tracking other vessels. Because cameras have limited fields of view compared to more traditional autonomy sensors such as lidars and radars, an autonomous surface vessel is typically equipped with multiple cameras, which can induce biases in tracking when a target appears in several image frames simultaneously. In this work, we propose a novel pipeline for camera-based maritime tracking that combines georeferencing with clustering-based multi-camera fusion, yielding bias-free camera measurements with target range estimates. On real-world datasets collected with the milliAmpere research platform, the performance of this pipeline exceeded a lidar benchmark across multiple performance measures, both in pure detection performance and as part of a JIPDA-based tracking system.
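The core idea in the abstract — georeference each camera detection onto the sea surface, then cluster detections from overlapping cameras into a single measurement so no target is counted twice — can be sketched as follows. This is an illustrative flat-sea sketch under assumed geometry, not the paper's implementation: the function names `georeference` and `fuse`, the depression-angle range model, and the gating distance are all hypothetical stand-ins for the authors' actual camera model and clustering method.

```python
import math

def georeference(bearing_deg, depression_deg, cam_height_m, cam_pos):
    """Project one camera detection onto the sea surface (z = 0).
    Assumes a flat sea: range follows from the camera height and the
    depression angle of the detection ray (illustrative model only)."""
    rng = cam_height_m / math.tan(math.radians(depression_deg))
    b = math.radians(bearing_deg)
    return (cam_pos[0] + rng * math.cos(b),
            cam_pos[1] + rng * math.sin(b))

def fuse(detections, gate_m=5.0):
    """Greedy single-linkage clustering of georeferenced detections from
    multiple cameras. Each cluster is averaged into one point measurement,
    so a target visible in two overlapping frames yields a single,
    unbiased measurement instead of two."""
    clusters = []
    for d in detections:
        for c in clusters:
            if any(math.dist(d, m) < gate_m for m in c):
                c.append(d)
                break
        else:
            clusters.append([d])
    return [(sum(x for x, _ in c) / len(c),
             sum(y for _, y in c) / len(c)) for c in clusters]
```

For example, two detections of the same vessel from adjacent cameras, georeferenced to nearly the same sea-plane position, fall inside the gate and are merged into one measurement, while a distant second vessel forms its own cluster.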

Citation (APA)
Helgesen, O. K., Stahl, A., & Brekke, E. F. (2023). Maritime Tracking with Georeferenced Multi-Camera Fusion. IEEE Access, 11, 30340–30359. https://doi.org/10.1109/ACCESS.2023.3261556
