In Human-Computer Interaction (HCI), many gesture-based interaction schemes have been adopted to reduce the dependence on bulky devices such as physical keyboards and joysticks. As a typical HCI technology, text input has attracted much attention, and many virtual or wearable keyboards have been proposed. To remove the keyboard entirely and allow people to type in a device-free way, we propose AirTyping, a mid-air typing scheme based on Leap Motion. During typing, the Leap Motion Controller captures the typing gestures with its cameras and provides the coordinates of the finger joints. AirTyping then detects candidate keystrokes, infers the typed words with a Bayesian method, and outputs the resulting word sequence. Experimental results show that our system detects keystrokes and infers the typed text effectively: the true positive rate of keystroke detection is 92.2%, and the accuracy of the top-1 inferred word being the typed word reaches 90.2%.
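The abstract does not detail the Bayesian word inference, so the following is only a minimal sketch of one plausible formulation: candidate dictionary words are ranked by a language-model prior combined with per-keystroke likelihoods of the detected fingertip positions given each letter's key location. The key layout, the Gaussian likelihood model, and all names (KEY_POS, rank_words, etc.) are illustrative assumptions, not the authors' implementation.

```python
import math

# Hypothetical QWERTY layout: key centers on a unit grid (illustrative only).
KEY_POS = {c: (x + 0.5 * r, float(r))
           for r, row in enumerate(["qwertyuiop", "asdfghjkl", "zxcvbnm"])
           for x, c in enumerate(row)}

def keystroke_log_likelihood(touch_xy, letter, sigma=0.6):
    """Log-likelihood of a fingertip landing at touch_xy when aiming at `letter`
    (isotropic Gaussian around the key center; an assumed model)."""
    kx, ky = KEY_POS[letter]
    dx, dy = touch_xy[0] - kx, touch_xy[1] - ky
    return -(dx * dx + dy * dy) / (2.0 * sigma * sigma)

def rank_words(touch_points, vocabulary, word_log_prior):
    """Score same-length dictionary words by log P(word) + sum_i log P(touch_i | letter_i)."""
    scored = []
    for word in vocabulary:
        if len(word) != len(touch_points):
            continue
        score = word_log_prior.get(word, math.log(1e-9))
        for touch, letter in zip(touch_points, word):
            score += keystroke_log_likelihood(touch, letter)
        scored.append((score, word))
    return [w for _, w in sorted(scored, reverse=True)]

# Example: three detected keystrokes landing near the keys 't', 'h', 'e'.
touches = [(4.0, 0.0), (6.0, 1.0), (2.0, 0.0)]
vocab = ["the", "tie", "toe", "she"]
prior = {w: math.log(p) for w, p in [("the", 0.6), ("tie", 0.1), ("toe", 0.1), ("she", 0.2)]}
print(rank_words(touches, vocab, prior)[:1])  # top-1 candidate: ['the']
```

Under this reading, the reported 90.2% top-1 accuracy would correspond to how often the highest-scoring word returned by such a ranking matches the word the user actually typed.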