Sliding window mapping for omnidirectional RGB-D sensors

2 citations · 11 readers (Mendeley)

Abstract

This paper presents an omnidirectional RGB-D (RGB + Distance fusion) sensor prototype built from an actuated LIDAR (Light Detection and Ranging) and an RGB camera. Besides the sensor, a novel mapping strategy is developed that accounts for the sensor's scanning characteristics. The sensor can gather RGB and 3D data from any direction by tilting a laser scanner 90 degrees and rotating it about its central axis. The mapping strategy is based on two environment maps: a local map for instantaneous perception and a global map for perception memory. The 2D local map represents the surface in front of the robot and may contain RGB data, enabling environment reconstruction and human detection; it acts like a sliding window that moves with the robot and stores surface data.
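The two-map idea from the abstract can be illustrated with a small sketch. This is a hypothetical simplification, not the authors' implementation: the local map is a fixed-size 2D grid that slides with the robot, and cells that leave the window are committed to a global map acting as perception memory (grid size, resolution, and the dictionary-based global map are all assumptions for illustration).

```python
import numpy as np

class SlidingWindowMap:
    """Illustrative sketch of a sliding-window local map with a global memory.

    Assumptions (not from the paper): a square local grid, a metric
    resolution, and a dict keyed by global cell index as the global map.
    """

    def __init__(self, size=5, resolution=1.0):
        self.size = size                      # local window is size x size cells
        self.res = resolution                 # metres per cell (assumed)
        self.origin = np.array([0, 0])        # window origin in global cell coords
        self.local = np.zeros((size, size))   # instantaneous perception
        self.global_map = {}                  # perception memory: cell -> value

    def insert_point(self, x, y, value=1.0):
        """Store a sensed surface point if it falls inside the window."""
        cx, cy = int(x // self.res), int(y // self.res)
        ix, iy = cx - self.origin[0], cy - self.origin[1]
        if 0 <= ix < self.size and 0 <= iy < self.size:
            self.local[ix, iy] = value

    def slide(self, dx_cells, dy_cells):
        """Move the window with the robot, committing cells to global memory."""
        # Commit current local cells to the global map before shifting.
        for ix in range(self.size):
            for iy in range(self.size):
                if self.local[ix, iy] != 0:
                    cell = (self.origin[0] + ix, self.origin[1] + iy)
                    self.global_map[cell] = self.local[ix, iy]
        # Shift the window and re-read any overlapping cells from memory.
        self.origin = self.origin + np.array([dx_cells, dy_cells])
        self.local.fill(0)
        for ix in range(self.size):
            for iy in range(self.size):
                cell = (self.origin[0] + ix, self.origin[1] + iy)
                if cell in self.global_map:
                    self.local[ix, iy] = self.global_map[cell]
```

In this sketch, sliding the window both persists what the robot has seen (global map) and restores previously mapped cells that re-enter the window, mirroring the instantaneous-perception / perception-memory split described in the abstract.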

Citation (APA)

Dalmedico, N., Teixeira, M. A. S., Santos, H. B., Nogueira, R. de C. M., de Arruda, L. V. R., Neves, F., … de Oliveira, A. S. (2019). Sliding window mapping for omnidirectional RGB-D sensors. Sensors (Switzerland), 19(23). https://doi.org/10.3390/s19235121
