From 2D to 3D mixed reality human-robot interface in hazardous robotic interventions with the use of redundant mobile manipulator


Abstract

3D Mixed Reality (MR) Human-Robot Interfaces (HRI) show promise for enabling robotic operators to complete tasks more quickly, more safely, and with less training. The objective of this study is to assess the use of a 3D MR HRI environment, in comparison with a standard 2D Graphical User Interface (GUI), for controlling a redundant mobile manipulator. The experimental data were collected during operation of a 9 DOF manipulator mounted on a robotized train, the CERN Train Inspection Monorail (TIM), used for the Beam Loss Monitor robotic measurement task in a complex hazardous intervention scenario at CERN. The efficiency and workload of the operator were compared across both types of interfaces using the NASA TLX method. The use of heart rate and Galvanic Skin Response parameters for monitoring operator condition and stress was also tested. The results show that teleoperation with the 3D MR HRI mitigates cognitive fatigue and stress by improving the operator's understanding of both the robot's pose and the surrounding environment.
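For context on the workload comparison, the NASA TLX method referenced above scores six subscales (mental demand, physical demand, temporal demand, performance, effort, frustration) rated on a 0–100 scale, weighted by 15 pairwise comparisons. The sketch below is purely illustrative (not the authors' code), with hypothetical ratings and weights:

```python
# Illustrative sketch of the NASA TLX weighted workload computation.
# Each subscale gets a 0-100 rating; weights are tallies from the 15
# pairwise comparisons, so they must sum to 15.
# Weighted workload = sum(rating * weight) / 15.

SUBSCALES = ["mental", "physical", "temporal",
             "performance", "effort", "frustration"]

def nasa_tlx_weighted(ratings, weights):
    """Return the NASA TLX weighted workload score (0-100)."""
    if sum(weights.values()) != 15:
        raise ValueError("pairwise-comparison weights must sum to 15")
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Hypothetical ratings for a single operator run:
ratings = {"mental": 80, "physical": 30, "temporal": 60,
           "performance": 40, "effort": 70, "frustration": 55}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}

print(round(nasa_tlx_weighted(ratings, weights), 2))  # prints 63.67
```

Comparing this score between the 2D GUI and 3D MR runs is the kind of analysis the study's workload comparison relies on.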


APA

Szczurek, K. A., Prades, R. M., Matheson, E., Perier, H., Buonocore, L. R., & Castro, M. D. (2021). From 2D to 3D mixed reality human-robot interface in hazardous robotic interventions with the use of redundant mobile manipulator. In Proceedings of the 18th International Conference on Informatics in Control, Automation and Robotics, ICINCO 2021 (pp. 388–395). SciTePress. https://doi.org/10.5220/0010528503880395
