We present a method for interactively generating virtual fixtures for shared teleoperation in unstructured remote environments. The proposed method allows a human operator to intuitively assign various types of virtual fixtures on the fly, providing virtual guidance forces that help the operator accomplish a given task while minimizing cognitive workload. The method augments the visual feedback image from the slave robot's video camera with automatically extracted geometric features (shapes, surfaces, etc.) computed from a combined depth and color sensor mounted near the slave robot's base. The human operator selects a feature on the computer screen, which is then automatically associated with a virtual haptic fixture. The performance of the proposed method was evaluated with a peg-in-hole task, and the experiment showed improvements in teleoperation performance.
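To illustrate the kind of virtual guidance force such a fixture can produce, the following is a minimal sketch (not the authors' implementation): a line-type virtual fixture modeled as a spring that pulls the teleoperated tool tip toward a line the operator has selected. The names `tool_pos`, `line_point`, `line_dir`, and `stiffness` are illustrative assumptions.

```python
def guidance_force(tool_pos, line_point, line_dir, stiffness=50.0):
    """Return a spring force pulling tool_pos toward a line fixture.

    The fixture is a line through line_point with direction line_dir;
    the force is proportional to the perpendicular distance from the
    tool tip to that line (a simple spring model).
    """
    # Vector from a point on the line to the tool tip.
    d = [t - p for t, p in zip(tool_pos, line_point)]
    # Normalize the line direction.
    norm = sum(c * c for c in line_dir) ** 0.5
    u = [c / norm for c in line_dir]
    # Component of d along the line.
    along = sum(di * ui for di, ui in zip(d, u))
    # Perpendicular displacement from the line to the tool tip.
    perp = [di - along * ui for di, ui in zip(d, u)]
    # Spring force pushing the tool back onto the line fixture.
    return [-stiffness * c for c in perp]
```

For example, a tool tip displaced perpendicular to the fixture line receives a restoring force toward the line, while motion along the line is left unconstrained, which is the usual guidance behavior of such fixtures.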
Pruks, V., Farkhatdinov, I., & Ryu, J. H. (2018). Preliminary Study on Real-Time Interactive Virtual Fixture Generation Method for Shared Teleoperation in Unstructured Environments. In Lecture Notes in Computer Science (Vol. 10894, pp. 648–659). Springer. https://doi.org/10.1007/978-3-319-93399-3_55