When a robot perceives its environment, it is important to know not only what kinds of objects are present, but also how they relate to each other. For example, in a cleanup task in a cluttered environment, a sensible strategy is to first pick the objects with the fewest contacts to other objects, minimizing the chance of unwanted movements unrelated to the current picking action. Estimating object contacts in cluttered scenes based only on passive observation is a complex problem. To tackle it, we present a deep neural network that learns physically stable object relations directly from geometric features. The learned relations are encoded as contact graphs between the objects. To facilitate training of the network, we generated a rich, publicly available dataset of more than 25,000 unique contact scenes using a physics simulation. Several deep architectures were evaluated, and the final architecture, which shows good results in reconstructing contact graphs, is evaluated quantitatively and qualitatively.
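The picking strategy described above can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: it assumes contacts are given as pairwise `(i, j)` tuples (e.g., from a physics engine or from the network's predicted contact graph) and encodes them as an adjacency structure, then selects the object with the fewest contacts.

```python
def contact_graph(num_objects, contact_pairs):
    """Build an undirected contact graph as adjacency sets
    from a list of (i, j) object-contact pairs."""
    graph = {i: set() for i in range(num_objects)}
    for i, j in contact_pairs:
        graph[i].add(j)
        graph[j].add(i)
    return graph

def pick_next(graph):
    """Return the object with the fewest contacts (lowest degree),
    i.e., the safest candidate to remove first."""
    return min(graph, key=lambda node: len(graph[node]))

# Example scene: object 2 rests on objects 0 and 1; object 3 stands free.
g = contact_graph(4, [(0, 2), (1, 2)])
print(pick_next(g))  # object 3 has zero contacts
```

In a full system, the contact pairs would come from the predicted contact graph rather than ground truth, and ties in degree would need a secondary criterion (e.g., graspability).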
Meier, M., Haschke, R., & Ritter, H. J. (2020). From Geometries to Contact Graphs. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12397 LNCS, pp. 546–555). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-61616-8_44