With age, changes to the human nervous system reduce the accuracy with which a person controls their extremities. In particular, diminished control over the fingers severely affects a person's quality of life and self-reliance. The ability to accurately control the amount of force applied with one's fingers can be recovered through training with the Pressing Evaluation Training System (PETS). However, when training with the PETS, users have to focus on guidance presented on a monitor and lose sight of their fingers. This may increase mental workload and reduce training efficiency. In this paper, we explore whether presenting the guidance closer to the user's fingers provides better support and thereby improves performance. Specifically, we use a video see-through head-mounted display to present guidance next to the user's fingers through augmented reality (AR), and a haptic device to replicate the tasks of PETS training. We evaluated our implementation with 18 university students. Although the results of our study indicate that presenting information closer to the interaction area does not improve performance, several participants preferred the guidance presented in AR.
CITATION STYLE
Plopski, A., Mori, R., Taketomi, T., Sandor, C., & Kato, H. (2018). AR-PETS: Development of an augmented reality supported pressing evaluation training system. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10927 LNCS, pp. 113–126). Springer Verlag. https://doi.org/10.1007/978-3-319-92037-5_10