Abstract
This article presents a novel framework for registering and fusing heterogeneous sensory data. Our approach geometrically registers sensory data onto a set of virtual parallel planes and then applies an occupancy grid to each layer. This framework is useful in surveillance applications in the presence of multi-modal sensors, especially in tracking and human behavior understanding. The multi-modal sensor set in this work comprises several cameras, inertial measurement units (IMU), laser range finders (LRF) and a binaural sensing system. An individual registration approach is proposed for each of these sensors. After the multi-modal sensory data are registered on the various geometrically parallel planes, a two-dimensional occupancy grid (as a layer) is applied to each plane. © 2010 Springer Berlin Heidelberg.
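The layered idea described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: incoming 3-D points (already registered from any of the sensors) are assigned to the nearest virtual horizontal plane, and each plane maintains its own 2-D occupancy grid updated with a standard log-odds rule. The plane heights, grid size, resolution and log-odds increment are illustrative assumptions.

```python
import math

class LayeredOccupancyGrid:
    """One 2-D log-odds occupancy grid per virtual parallel plane (layer)."""

    def __init__(self, plane_heights, size=(100, 100), resolution=0.1, l_occ=0.85):
        self.heights = list(plane_heights)   # z-coordinate of each virtual plane (m)
        self.resolution = resolution         # metres per grid cell
        self.l_occ = l_occ                   # log-odds increment for an occupied cell
        rows, cols = size
        # log-odds of 0 corresponds to an occupancy probability of 0.5 (unknown)
        self.grids = [[[0.0] * cols for _ in range(rows)] for _ in plane_heights]

    def update(self, points):
        """Assign each 3-D point (x, y, z in metres) to the nearest plane
        and add occupancy evidence to that layer's 2-D grid."""
        for x, y, z in points:
            layer = min(range(len(self.heights)),
                        key=lambda k: abs(self.heights[k] - z))
            i, j = int(x / self.resolution), int(y / self.resolution)
            if 0 <= i < len(self.grids[layer]) and 0 <= j < len(self.grids[layer][0]):
                self.grids[layer][i][j] += self.l_occ

    def probability(self, layer, i, j):
        """Recover a cell's occupancy probability from its log-odds value."""
        return 1.0 - 1.0 / (1.0 + math.exp(self.grids[layer][i][j]))
```

For example, with hypothetical planes at heights 0.0, 0.9 and 1.7 m, a registered point at (1.0, 2.0, 0.85) lands on the middle layer and raises that cell's occupancy probability above 0.5, while the other layers remain unknown.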
Citation
Aliakbarpour, H., Ferreira, J. F., Khoshhal, K., & Dias, J. (2010). A novel framework for data registration and data fusion in presence of multi-modal sensors. IFIP Advances in Information and Communication Technology, 314, 308–315. https://doi.org/10.1007/978-3-642-11628-5_33