PerFC: An Efficient 2D and 3D Perception Software-Hardware Framework for Mobile Cobot

Abstract

In this work, we present an end-to-end software-hardware framework that supports both conventional hardware and software components and integrates machine learning object detectors without requiring an additional dedicated graphics processing unit (GPU). We design our framework to achieve real-time performance on the robot system, to guarantee such performance across multiple computing devices, and to maximize code reusability. We then apply transfer learning strategies for 2D object detection and fuse the resulting detections with depth images for 3D depth estimation. Lastly, we test the proposed framework on the Baxter robot with two 7-DOF arms and a four-wheel mobility base. The results show that the robot achieves real-time performance while executing its tasks (map building, localization, navigation, object detection, arm movement, and grasping) using available hardware, such as Intel onboard GPUs, on distributed computers. In addition, we design and introduce an end-user application to comprehensively control, program, and monitor the robot system. The source code is available at https://github.com/tuantdang/perception_framework.
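As a rough illustration of the 2D-to-3D fusion step mentioned in the abstract, the sketch below lifts a single 2D detection into a 3D point by back-projecting the bounding-box center through a pinhole camera model, using the median depth inside the box from an aligned depth image. This is a minimal sketch of the general technique, not the framework's actual implementation (see the linked repository for that); the function name, the intrinsics fx, fy, cx, cy, and the median-depth heuristic are assumptions made for the example.

import numpy as np

def estimate_3d_position(bbox, depth_image, fx, fy, cx, cy):
    """Back-project a 2D detection into a 3D point in the camera frame.

    bbox        : (x_min, y_min, x_max, y_max) in pixels
    depth_image : HxW array of depths in meters, aligned to the color frame
    fx, fy      : pinhole focal lengths in pixels (assumed known from calibration)
    cx, cy      : principal point in pixels
    """
    x_min, y_min, x_max, y_max = [int(v) for v in bbox]

    # Median depth inside the box suppresses background pixels and sensor holes.
    roi = depth_image[y_min:y_max, x_min:x_max]
    valid = roi[roi > 0]
    if valid.size == 0:
        return None  # no usable depth in this region
    z = float(np.median(valid))

    # Pinhole back-projection of the box center:
    # X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy.
    u = (x_min + x_max) / 2.0
    v = (y_min + y_max) / 2.0
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

For example, a detection with box (200, 150, 260, 230) and hypothetical intrinsics fx = fy = 615, cx = 320, cy = 240 could be passed as estimate_3d_position((200, 150, 260, 230), depth, 615.0, 615.0, 320.0, 240.0), returning the object's (X, Y, Z) position in meters in the camera frame.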

Citation (APA)

Dang, T., Nguyen, K., & Huber, M. (2023). PerFC: An Efficient 2D and 3D Perception Software-Hardware Framework for Mobile Cobot. In Proceedings of the International Florida Artificial Intelligence Research Society Conference, FLAIRS (Vol. 36). Florida Online Journals, University of Florida. https://doi.org/10.32473/flairs.36.133316
