In this paper, we discuss robot vision for perceiving humans and the environment around a mobile robot. We developed a tele-operated mobile robot with a pan-tilt mechanism carrying a camera and a laser range finder (LRF). The camera outputs color information, and the LRF outputs the distance from the robot to surrounding objects. In this paper, we propose a sensor fusion method that extracts a human from the measured data by integrating these outputs based on the concept of synthesis. Finally, we show experimental results of the proposed method. © Springer-Verlag Berlin Heidelberg 2007.
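The abstract does not detail the fusion algorithm itself. As a minimal illustrative sketch only (the function name, thresholds, and the skin-hue heuristic are assumptions, not the authors' method), one simple way to combine per-pixel camera color with LRF distance is to intersect a color mask with a depth-gate mask:

```python
import numpy as np

def fuse_camera_lrf(hue, depth, hue_range=(0.0, 0.1), depth_range=(0.5, 3.0)):
    """Toy camera/LRF fusion (hypothetical, not the paper's algorithm):
    keep pixels whose hue falls in a skin-like band AND whose measured
    distance lies in a plausible range for a nearby human."""
    color_mask = (hue >= hue_range[0]) & (hue <= hue_range[1])
    depth_mask = (depth >= depth_range[0]) & (depth <= depth_range[1])
    return color_mask & depth_mask

# 2x3 toy frame: hue values and LRF distances aligned to the camera image
hue = np.array([[0.05, 0.50, 0.05],
                [0.05, 0.05, 0.90]])
depth = np.array([[1.0, 1.0, 5.0],
                  [2.0, 0.2, 1.0]])
mask = fuse_camera_lrf(hue, depth)
# Only pixels passing both the color and the distance test remain
```

In this toy example only the two pixels that satisfy both cues survive, which is the essence of fusing the camera's color channel with the LRF's range channel.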
CITATION STYLE
Kubota, N., Satomi, M., Taniguchi, K., & Nogawa, Y. (2007). Human three-dimensional modeling based on intelligent sensor fusion for a tele-operated mobile robot. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4694 LNAI, pp. 98–106). Springer Verlag. https://doi.org/10.1007/978-3-540-74829-8_13