Mobile robots have been shown to be helpful in guiding users in complex indoor spaces. While these robots can assist all types of users, current implementations often rely on users visually rendezvousing with the robot, which may be a challenge for people with visual impairments. This paper describes a proof of concept for a robotic system that addresses this kind of short-range rendezvous for users with visual impairments. We propose to use a lattice graph-based Anytime Repairing A* (ARA*) planner as the global planner to discourage the robot from turning in place at its goal position, making its path more human-like and safer. We also interviewed an Orientation & Mobility (O&M) Specialist for their thoughts on our planner. They observed that our planner produces trajectories that are less obtrusive to the user than those of the default ROS global planner, and recommended that our system allow the robot to approach the person from the side rather than from the front, as it currently does. In the future, we plan to test our system with users in person to better validate our assumptions and find additional pain points.
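The abstract names Anytime Repairing A* (ARA*) as the global planner but gives no implementation details, so the following is only a minimal sketch of the generic ARA* search loop (after Likhachev et al.'s formulation), not the authors' code. The names `ara_star`, `successors`, and `heuristic` are placeholders introduced here, and the formal suboptimality-bound update on epsilon is omitted for brevity.

```python
import heapq

def ara_star(start, goal, successors, heuristic, eps_start=3.0, eps_step=0.5):
    """Anytime Repairing A* over a generic graph (sketch).

    successors(s) -> iterable of (next_state, edge_cost)
    heuristic(s)  -> admissible cost-to-goal estimate (0 at the goal)

    Yields (eps, path, cost) once per search iteration; each yielded path
    costs at most eps times the optimum, and eps shrinks toward 1.0.
    Assumes at least one path from start to goal exists.
    """
    INF = float("inf")
    g = {start: 0.0, goal: INF}     # best cost-to-come found so far
    parent = {start: None}
    eps = eps_start

    open_g = {start: 0.0}                          # states currently in OPEN
    open_heap = [(eps * heuristic(start), start)]  # lazy heap; may hold stale entries
    incons = {}                     # states improved after being expanded (CLOSED)

    def improve_path():
        closed = set()
        while open_heap:
            f, s = heapq.heappop(open_heap)
            if s not in open_g or f > g[s] + eps * heuristic(s) + 1e-9:
                continue            # stale heap entry, skip it
            if g[goal] <= f:
                break               # current solution is already eps-suboptimal
            del open_g[s]
            closed.add(s)
            for s2, cost in successors(s):
                new_g = g[s] + cost
                if new_g < g.get(s2, INF):
                    g[s2] = new_g
                    parent[s2] = s
                    if s2 in closed:
                        incons[s2] = new_g   # revisit in the next iteration
                    else:
                        open_g[s2] = new_g
                        heapq.heappush(open_heap, (new_g + eps * heuristic(s2), s2))

    def extract_path():
        path, s = [], goal
        while s is not None:
            path.append(s)
            s = parent[s]
        return list(reversed(path))

    improve_path()
    yield eps, extract_path(), g[goal]
    while eps > 1.0:
        eps = max(1.0, eps - eps_step)
        open_g.update(incons)       # move INCONS back into OPEN
        incons.clear()
        # rebuild the heap with priorities based on the new, smaller eps
        open_heap = [(g[s] + eps * heuristic(s), s) for s in open_g]
        heapq.heapify(open_heap)
        improve_path()
        yield eps, extract_path(), g[goal]
```

For the lattice-graph setting described in the abstract, `successors` would presumably enumerate kinematically feasible motion primitives over (x, y, heading) states; assigning a higher cost to in-place rotation primitives, particularly near the goal, is one plausible way to obtain the "no turning in place at the goal" behavior the paper describes, though the authors' exact cost design is not given in this abstract.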
CITATION
Limprayoon, J. “fern,” Pareek, P., Tan, X. Z., & Steinfeld, A. (2021). Robot Trajectories When Approaching a User with a Visual Impairment. In ASSETS 2021 - 23rd International ACM SIGACCESS Conference on Computers and Accessibility. Association for Computing Machinery, Inc. https://doi.org/10.1145/3441852.3476538