Interactive segmentation of textured and textureless objects

Abstract

This article describes interactive object segmentation for autonomous service robots acting in human living environments. The proposed system allows a robot to effectively segment textured and textureless objects in cluttered scenes by leveraging its manipulation capabilities. In this interactive perception approach, RGB and depth (RGB-D) camera-based features are tracked while the robot actively induces motion in the scene with its arm. The robot autonomously infers arm movements that effectively separate objects. The resulting feature trajectories are then assigned to their corresponding objects by clustering. In a final step, dense models of the objects are reconstructed from the previously clustered sparse RGB-D features. The approach is integrated with robotic grasping and is demonstrated on scenes containing various textured and textureless objects, showing the advantages of a tight integration of perception, cognition, and action.
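
The clustering step groups tracked features by how consistently they move together while the arm perturbs the scene. The sketch below is not the authors' implementation; it only illustrates one plausible realization of that idea under a rigid-body assumption: trajectories whose mutual 3-D distance stays nearly constant are taken to lie on the same object and are grouped by agglomerative clustering. The function names, the rigidity measure, and the threshold are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation): cluster sparse RGB-D
# feature trajectories by rigid-motion consistency. Each trajectory is
# assumed to be a (T, 3) array of 3-D feature positions tracked while the
# robot's arm pushes the scene. Names and thresholds are illustrative.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform


def rigid_consistency_distance(traj_a, traj_b):
    """Variance of the inter-feature distance over time.

    Features on the same rigid object keep a nearly constant mutual
    distance, so this value stays close to zero for same-object pairs
    and grows when the two features belong to objects that move apart.
    """
    d = np.linalg.norm(traj_a - traj_b, axis=1)
    return float(np.var(d))


def cluster_trajectories(trajectories, threshold=1e-4):
    """Group feature trajectories into per-object clusters."""
    n = len(trajectories)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dist[i, j] = dist[j, i] = rigid_consistency_distance(
                trajectories[i], trajectories[j])
    # Complete-linkage agglomerative clustering on the condensed matrix.
    labels = fcluster(linkage(squareform(dist), method="complete"),
                      t=threshold, criterion="distance")
    return labels


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two toy "objects": one static, one translating along x over 20 frames.
    static = [rng.normal(size=3) + np.zeros((20, 3)) for _ in range(5)]
    shift = np.outer(np.linspace(0.0, 0.3, 20), [1.0, 0.0, 0.0])
    moving = [rng.normal(size=3) + shift for _ in range(5)]
    print(cluster_trajectories(static + moving))  # two distinct labels
```

In this toy usage, features on the pushed object share one label and features on the static object share another; a dense model would then be reconstructed per cluster from the associated RGB-D data.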

Citation (APA)

Hausman, K., Pangercic, D., Márton, Z. C., Bálint-Benczédi, F., Bersch, C., Gupta, M., … Beetz, M. (2015). Interactive segmentation of textured and textureless objects. In Studies in Systems, Decision and Control (Vol. 42, pp. 237–262). Springer International Publishing. https://doi.org/10.1007/978-3-319-26327-4_10
