This paper presents a framework for multimodal human-robot interaction. The proposed framework is intended to advance the development of human-robot interaction by facilitating intuitive programming and by enabling robot tasks to be adapted easily without requiring skilled personnel. The key elements of the system are speech and hand-gesture recognition, text programming, and interaction capabilities that allow the user to take over control of the robot at any time. Furthermore, our approach is focused on robot tasks. Users can express a preference for one or more interaction modalities, so that the selected modalities fit their personal needs.
CITATION STYLE
Mocan, B., Fulea, M., & Brad, S. (2016). Designing a multimodal human-robot interaction interface for an industrial robot. In Advances in Intelligent Systems and Computing (Vol. 371, pp. 255–263). Springer Verlag. https://doi.org/10.1007/978-3-319-21290-6_26