Robot self-calibration using actuated 3D sensors

Abstract

Both robot and hand-eye calibration have been subjects of research for decades. While current approaches manage to identify the parameters of a robot's kinematic model precisely and robustly, they still rely on external devices such as calibration objects, markers, and/or external sensors. Instead of fitting recorded measurements to a model of a known object, this paper treats robot calibration as an offline SLAM problem in which scanning poses are linked to a fixed point in space via a moving kinematic chain. As such, we enable robot calibration using nothing but an arbitrary eye-in-hand depth sensor. To the authors' best knowledge, the presented framework is the first solution to three-dimensional (3D) sensor-based robot calibration that requires neither external sensors nor reference objects. Our approach uses a modified version of the Iterative Corresponding Point (ICP) algorithm to run bundle adjustment on multiple 3D recordings, estimating the optimal parameters of the kinematic model. A detailed evaluation of the system is presented on a real robot with various attached 3D sensors. The results show that the system reaches a precision comparable to that of a dedicated external tracking system at a fraction of its cost.
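To make the core idea concrete, the Python sketch below shows ICP-style alignment driven by kinematic parameters under heavy simplifications: a hypothetical 2-DOF planar arm whose unknown link lengths play the role of the calibration parameters, and residuals computed against a known reference cloud for brevity. This is not the authors' implementation; the paper aligns multiple scans against each other in a bundle-adjustment fashion without any known reference, and all names, the planar model, and the point-to-point cost here are illustrative assumptions.

```python
# Sketch: map each eye-in-hand scan into a common frame through the arm's
# forward kinematics, then optimize the kinematic parameters so that the
# scans of a static scene align (one fixed ICP matching step for brevity;
# the paper alternates correspondence search and parameter updates).
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial import cKDTree

def forward_kinematics(params, joint_angles):
    """Sensor pose in the world frame for an assumed 2-DOF planar arm.

    params = [l1, l2]: link lengths, the calibration unknowns here.
    Returns a 2x2 rotation and a 2-vector translation.
    """
    l1, l2 = params
    q1, q2 = joint_angles
    c, s = np.cos(q1 + q2), np.sin(q1 + q2)
    R = np.array([[c, -s], [s, c]])
    t = np.array([l1 * np.cos(q1) + l2 * np.cos(q1 + q2),
                  l1 * np.sin(q1) + l2 * np.sin(q1 + q2)])
    return R, t

def residuals(params, scans, reference_cloud):
    """Point-to-point residuals of every scan against a reference cloud.

    Each scan is (joint_angles, points_in_sensor_frame). Correspondences
    are nearest neighbors in the reference cloud. In the paper no such
    reference exists; scans are aligned against each other instead.
    """
    tree = cKDTree(reference_cloud)
    errs = []
    for joint_angles, pts in scans:
        R, t = forward_kinematics(params, joint_angles)
        world_pts = pts @ R.T + t        # sensor frame -> world frame
        _, idx = tree.query(world_pts)   # nearest-neighbor matching
        errs.append((world_pts - reference_cloud[idx]).ravel())
    return np.concatenate(errs)

# Synthetic usage example: simulate scans of a static scene, then recover
# the link lengths from a perturbed initial model.
rng = np.random.default_rng(0)
true_params = np.array([0.50, 0.35])            # true link lengths (m)
scene = rng.uniform(-1.0, 1.0, size=(200, 2))   # static 2D "scene"

scans = []
for q in [(0.2, 0.4), (0.7, -0.3), (-0.5, 0.6)]:
    R, t = forward_kinematics(true_params, q)
    # What the sensor would observe: scene points in the sensor frame.
    scans.append((q, (scene - t) @ R))

guess = np.array([0.45, 0.40])                  # perturbed initial model
sol = least_squares(residuals, guess, args=(scans, scene))
print("estimated link lengths:", sol.x)         # should recover ~[0.50, 0.35]
```

For small initial model errors the nearest-neighbor matches are mostly correct and the optimizer converges to the true parameters; the full problem additionally estimates the hand-eye transform and all joint/link parameters of a real 3D kinematic chain.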

Citation (APA)

Peters, A., & Knoll, A. C. (2024). Robot self-calibration using actuated 3D sensors. Journal of Field Robotics, 41(2), 327–346. https://doi.org/10.1002/rob.22259
