ReflecTouch: Detecting Grasp Posture of Smartphone Using Corneal Reflection Images


Abstract

By sensing how a user is holding a smartphone, adaptive user interfaces become possible, such as interfaces that automatically switch the displayed content and the position of graphical user interface (GUI) components according to the grasp. We propose ReflecTouch, a novel method for detecting how a smartphone is being held by capturing images of the smartphone screen reflected on the cornea with the built-in front camera. In these images, the areas where the user places their fingers on the screen appear as shadows, which makes it possible to estimate the grasp posture. Since most smartphones have a front camera, this method can be used regardless of the device model, and no additional sensor or hardware is required. We conducted data collection experiments to verify the classification accuracy of the proposed method for six different grasp postures, and the accuracy was 85%.
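The pipeline the abstract describes — crop the corneal reflection of the screen from a front-camera frame, treat finger-occluded areas as shadows, and classify the shadow pattern into one of six grasp postures — can be sketched as below. All names, the shadow threshold, and the nearest-centroid classifier are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical posture labels; the paper classifies six grasp postures,
# but their exact definitions are not given in the abstract.
GRASP_POSTURES = [
    "left-thumb", "right-thumb", "both-thumbs",
    "left-index", "right-index", "two-hands-landscape",
]

def shadow_mask(cornea_img: np.ndarray, threshold: float = 0.3) -> np.ndarray:
    """Pixels darker than `threshold` are assumed to be finger shadows
    occluding the screen's corneal reflection."""
    return (cornea_img < threshold).astype(np.float32)

def classify_grasp(cornea_img: np.ndarray, centroids: list) -> str:
    """Nearest-centroid classification of the flattened shadow mask."""
    feat = shadow_mask(cornea_img).ravel()
    dists = [np.linalg.norm(feat - c) for c in centroids]
    return GRASP_POSTURES[int(np.argmin(dists))]

# Toy demo: an 8x8 "corneal reflection" with a dark band on the right edge,
# compared against random per-posture centroids (stand-ins for trained ones).
rng = np.random.default_rng(0)
centroids = [rng.random(64) for _ in GRASP_POSTURES]
img = np.ones((8, 8))
img[:, 6:] = 0.1  # simulated finger shadow on the right side of the screen
print(classify_grasp(img, centroids))
```

In practice the classifier would be trained on labeled corneal-reflection images collected from users, and eye detection and perspective correction would precede the shadow extraction step.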

Citation (APA)
Zhang, X., Ikematsu, K., Kato, K., & Sugiura, Y. (2022). ReflecTouch: Detecting Grasp Posture of Smartphone Using Corneal Reflection Images. In Conference on Human Factors in Computing Systems - Proceedings. Association for Computing Machinery. https://doi.org/10.1145/3491102.3517440
