There is increasing interest in using robots in simulation to understand and improve human-robot interaction (HRI). At the same time, using simulated settings to gather training data promises to ease a major data bottleneck that limits how well robots can take advantage of powerful machine learning approaches. In this paper, we describe a prototype system that combines the Robot Operating System (ROS), the Gazebo simulator, and the Unity game engine to create human-robot interaction scenarios. A person engages with a scenario through a monitor wall, allowing simultaneous collection of realistic sensor data and traces of human actions.
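As a rough illustration of the kind of data-collection pipeline such a system implies, the sketch below shows a ROS node that records simulated sensor frames alongside human action traces. The topic names /camera/image_raw and /human/action, and the InteractionRecorder class, are assumptions for illustration only, not the actual interfaces of the described system.

```python
# Hypothetical sketch: pair simulated sensor data with human action traces,
# assuming Gazebo publishes camera frames on /camera/image_raw and the Unity
# front end republishes user actions as std_msgs/String on /human/action
# (both topic names are assumptions).
import rospy
from sensor_msgs.msg import Image
from std_msgs.msg import String

class InteractionRecorder:
    def __init__(self):
        self.samples = []  # (timestamp, kind, payload) tuples
        rospy.Subscriber("/camera/image_raw", Image, self.on_image)
        rospy.Subscriber("/human/action", String, self.on_action)

    def on_image(self, msg):
        # Record only the stamp and frame size here; raw pixels would
        # normally be written to a rosbag instead of kept in memory.
        self.samples.append((msg.header.stamp.to_sec(), "sensor",
                             (msg.width, msg.height)))

    def on_action(self, msg):
        # Human action traces arrive as plain strings in this sketch.
        self.samples.append((rospy.get_time(), "action", msg.data))
        rospy.loginfo("recorded action: %s", msg.data)

if __name__ == "__main__":
    rospy.init_node("interaction_recorder")
    recorder = InteractionRecorder()
    rospy.spin()  # collect paired sensor/action data until shutdown
```

Timestamping both streams in a single node makes it straightforward to align sensor observations with the human actions that accompanied them when assembling training examples.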
Murnane, M., Breitmeyer, M., Ferraro, F., Matuszek, C., & Engel, D. (2019). Learning from human-robot interactions in modeled scenes. In ACM SIGGRAPH 2019 Posters, SIGGRAPH 2019. Association for Computing Machinery, Inc. https://doi.org/10.1145/3306214.3338546