Wearable computers have seen a recent resurgence in interest and popularity, and augmented reality (AR) smartglasses are poised to influence the way we complete our work and daily tasks. Current industrial applications of these smartglasses focus on interior design, remote collaboration, and e-commerce. Under the five key constraints of AR smartglasses, namely a miniature touch interface, limited screen real estate, user mobility, limited computational resources, and short battery life, existing user interaction paradigms designed for desktop computers and smartphones become obsolete and incompatible with smartglasses usage scenarios. Cumbersome and difficult interaction with AR smartglasses is thus a hurdle to their wider industrial adoption, and the demand for interaction techniques designed specifically for AR smartglasses remains critically unmet. In this chapter, we present three interaction techniques, namely TiPoint, HIBEY, and TOFI, to enhance object manipulation and text entry in the constrained environment of AR smartglasses. These techniques are devised to leverage advantageous features of the human body and everyday experience, such as fingertip dexterity, the lexicographical order ingrained in our memory, proprioception, and the opposable thumb. We thoroughly address the key constraints of AR smartglasses and explore different modalities with various hardware and peripherals.
Lee, L. H., Braud, T., & Hui, P. (2023). Embodied Interaction on Constrained Interfaces for Augmented Reality. In Springer Handbooks (pp. 239–271). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-67822-7_10