Spatial correlation of multi-sensor features for autonomous victim identification

Abstract

Robots are used in Urban Search and Rescue to assist rescue workers. To find victims, the robots are equipped with a range of sensors, including thermal cameras, video cameras, time-of-flight depth cameras, and laser range-finders. We present a method that enables a robot to perform victim identification autonomously. Thermal features are detected using a dynamic temperature threshold. By aligning the thermal and time-of-flight camera images, the thermal features are projected into 3D space. Edge detection on the laser data locates holes in the environment, which are then spatially correlated with the thermal features. A decision tree uses the correlated features to direct the autonomous exploration policy and locate victims. The method was evaluated in the 2010 RoboCup Rescue Real Robots Competition.
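
The abstract outlines a pipeline: threshold the thermal image, project the warm regions into 3D via the time-of-flight camera, detect holes from the laser data, and spatially correlate the two feature sets. The sketch below is a rough illustration only, not the authors' implementation; the helper names (dynamic_threshold, correlate_features), the threshold factor k, and the distance cutoff max_dist are all assumed for the example.

```python
import numpy as np

def dynamic_threshold(thermal_image, k=2.0):
    # Hypothetical dynamic temperature threshold: flag pixels markedly
    # warmer than the ambient scene (mean + k standard deviations).
    ambient = thermal_image.mean()
    return thermal_image > ambient + k * thermal_image.std()

def correlate_features(thermal_points, hole_centres, max_dist=0.5):
    # Pair each 3D thermal feature with the nearest hole centre if it lies
    # within max_dist metres; features with no nearby hole are paired with None.
    pairs = []
    for p in thermal_points:
        d = np.linalg.norm(hole_centres - p, axis=1)
        i = int(d.argmin())
        pairs.append((p, hole_centres[i] if d[i] <= max_dist else None))
    return pairs

# Example usage with synthetic data.
thermal = np.random.normal(22.0, 1.0, (120, 160))
thermal[40:50, 60:70] = 35.0                 # a warm blob, e.g. exposed skin
mask = dynamic_threshold(thermal)

features = np.array([[1.2, 0.1, 0.4]])       # a 3D thermal feature (metres)
holes = np.array([[1.0, 0.0, 0.5], [4.0, 2.0, 0.0]])
print(correlate_features(features, holes))
```

In the paper, correlated feature pairs of this kind feed a decision tree that directs the robot's exploration; the nearest-neighbour pairing above is only one plausible way to realise the spatial correlation step.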

Citation (APA)

Wiley, T., McGill, M., Milstein, A., Salleh, R., & Sammut, C. (2012). Spatial correlation of multi-sensor features for autonomous victim identification. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 7416 LNCS, 538–549. https://doi.org/10.1007/978-3-642-32060-6_46
