Disaster response robots are expected to perform complicated tasks such as traveling over unstable terrain, climbing slippery steps, and removing heavy debris. To complete such tasks safely, the robots must obtain not only visual-perceptual information (VPI), such as surface shape, but also haptic-perceptual information (HPI), such as the surface friction of objects in the environment. VPI can be obtained from laser sensors and cameras. In contrast, HPI can essentially be obtained only from the results of physical interaction with the environment, e.g., reaction force and deformation. However, current robots lack a function for estimating HPI. In this study, we propose a framework to estimate such physically interactive parameters (PIPs), including hardness, friction, and weight, which are vital parameters for safe robot-environment interaction. For effective estimation, we define a ground groping mode (GGM) and an object groping mode (OGM). The endpoint of the robot arm, which is equipped with a force sensor, actively touches, pushes, rubs, and lifts objects in the environment under hybrid position/force control, and three kinds of PIPs are estimated from the measured reaction force and the displacement of the arm endpoint. The robot finally judges the accident risk based on the estimated PIPs, e.g., safe, attentional, or dangerous. We prepared environments that had the same surface shape but different hardness, friction, and weight. The experimental results indicated that the proposed framework could estimate PIPs adequately and was useful for judging risk and safely planning tasks.
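The estimation step described above (recovering hardness, friction, and weight from measured reaction force and endpoint displacement) can be sketched in a few lines. The formulas below are a minimal, hedged interpretation — linear stiffness for hardness, a Coulomb model for friction, quasi-static lift force for weight — and the risk thresholds are illustrative assumptions, not the values used in the paper:

```python
# Minimal sketch of PIP (physically interactive parameter) estimation.
# Formulas and thresholds are illustrative assumptions, not the authors'
# actual implementation.

def estimate_hardness(delta_force_n: float, delta_disp_m: float) -> float:
    """Stiffness-like hardness estimate k = dF/dx, in N/m,
    from the pushing motion's force and displacement changes."""
    return delta_force_n / delta_disp_m

def estimate_friction(tangential_force_n: float, normal_force_n: float) -> float:
    """Coulomb friction coefficient mu = F_t / F_n,
    from the rubbing motion's tangential and normal forces."""
    return tangential_force_n / normal_force_n

def estimate_weight(lift_force_n: float, g: float = 9.81) -> float:
    """Mass in kg from the vertical force during a quasi-static lift."""
    return lift_force_n / g

def judge_risk(mu: float, mu_safe: float = 0.6, mu_attention: float = 0.3) -> str:
    """Three-level risk label from estimated friction.
    Threshold values here are purely hypothetical."""
    if mu >= mu_safe:
        return "safe"
    if mu >= mu_attention:
        return "attentional"
    return "dangerous"
```

For example, a 10 N force increase over a 2 mm indentation would give a stiffness of 5000 N/m, and a tangential-to-normal force ratio of 0.3 would land at the boundary of the "attentional" band under the assumed thresholds.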
Citation: Kamezaki, M., Uehara, Y., Azuma, K., & Sugano, S. (2021). A framework of physically interactive parameter estimation based on active environmental groping for safe disaster response work. ROBOMECH Journal, 8(1). https://doi.org/10.1186/s40648-021-00209-1