Human mobile robot interaction in the retail environment

Abstract

As technology advances, Human-Robot Interaction (HRI) is boosting overall system efficiency and productivity. However, allowing robots to operate in close proximity to humans inevitably places higher demands on precise human motion tracking and prediction. Datasets that capture both humans and robots operating in a shared space are receiving growing attention, as they can facilitate a variety of robotics and human-systems research. Yet datasets that track HRI during daily activities with rich information beyond video images remain rare. In this paper, we introduce a novel dataset that focuses on social navigation between humans and robots in a future-oriented Wholesale and Retail Trade (WRT) environment (https://uf-retail-cobot-dataset.github.io/). Eight participants performed tasks commonly undertaken by consumers and retail workers. More than 260 minutes of data were collected, including robot and human trajectories, human full-body motion capture, eye gaze directions, and other contextual information. Comprehensive descriptions of each category of data stream, as well as potential use cases, are included. Furthermore, analyses combining multiple data sources and future directions are discussed.
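The abstract enumerates several synchronized data streams (robot and human trajectories, full-body motion capture, eye gaze). The sketch below shows one way such multimodal recordings could be loaded and aligned by timestamp; the file names, column layout, and CSV format are assumptions made for illustration and are not the dataset's documented structure, which is described on the project page.

```python
import pandas as pd

# Hypothetical file names and columns -- the actual dataset layout
# (https://uf-retail-cobot-dataset.github.io/) may differ.
robot_traj = pd.read_csv("robot_trajectory.csv")  # assumed columns: time, x, y, theta
human_traj = pd.read_csv("human_trajectory.csv")  # assumed columns: time, x, y
gaze = pd.read_csv("eye_gaze.csv")                # assumed columns: time, gaze_x, gaze_y, gaze_z

# Align the streams on the nearest timestamp so each sample carries
# the robot pose, human position, and gaze direction together.
merged = pd.merge_asof(
    human_traj.sort_values("time"),
    robot_traj.sort_values("time"),
    on="time",
    suffixes=("_human", "_robot"),
    direction="nearest",
)
merged = pd.merge_asof(
    merged.sort_values("time"),
    gaze.sort_values("time"),
    on="time",
    direction="nearest",
)

# Example downstream use: human-robot distance over time,
# a basic quantity in social-navigation analysis.
merged["hr_distance"] = (
    (merged["x_human"] - merged["x_robot"]) ** 2
    + (merged["y_human"] - merged["y_robot"]) ** 2
) ** 0.5
print(merged[["time", "hr_distance"]].head())
```

Nearest-timestamp joining is only one alignment strategy; if the streams are recorded at very different rates, resampling to a common clock before merging may be preferable.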

Citation (APA)

Chen, Y., Luo, Y., Yang, C., Yerebakan, M. O., Hao, S., Grimaldi, N., … Hu, B. (2022). Human mobile robot interaction in the retail environment. Scientific Data, 9(1). https://doi.org/10.1038/s41597-022-01802-8
