This paper presents a method to estimate the visual focus of attention from body posture in a system consisting of a head-mounted display and an arm-mounted smartphone. The approach aims at fast and robust detection without additional hardware, even while the user is walking. Knowledge about the visual focus of attention can be used to adapt user interfaces. Eye tracking, for example, can yield precise measurements for stationary systems, but it is not always feasible on mobile devices because of body-movement dynamics. A practical solution is achieved by combining orientation information with known anatomical limits. The approach was parameterized and evaluated, reaching mean detection rates of over 97 %. Generalized parameters allow usage without individual configuration. Used as a screen-unlocking mechanism for smartphones, it provides faster access than manual unlocking.
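The abstract does not spell out the detection rule, but the core idea (comparing head and device orientation against anatomical limits) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the angle representation, the function name `looking_at_device`, and the tolerance values are assumptions introduced here for illustration.

```python
import math

def looking_at_device(head_yaw, head_pitch, dev_yaw, dev_pitch,
                      yaw_tol=30.0, pitch_tol=25.0):
    """Decide whether the head orientation plausibly targets the device.

    All angles are in degrees, as might be reported by the IMUs of a
    head-mounted display and an arm-mounted smartphone. The tolerance
    values are hypothetical stand-ins for the anatomical limits the
    paper parameterizes.
    """
    # Wrap the yaw difference into [-180, 180) before taking its magnitude.
    dyaw = abs((head_yaw - dev_yaw + 180.0) % 360.0 - 180.0)
    dpitch = abs(head_pitch - dev_pitch)
    # The device is considered in focus only if both angular
    # differences fall within the assumed anatomical tolerances.
    return dyaw <= yaw_tol and dpitch <= pitch_tol
```

In a real system such a check would run per sensor sample, so a screen-unlock trigger would likely also require the condition to hold for a short dwell time to avoid spurious unlocks.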
Westhoven, M., Plegge, C., Henrich, T., & Alexander, T. (2016). Posture based recognition of the visual focus of attention for adaptive mobile information systems. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9743, pp. 416–427). Springer Verlag. https://doi.org/10.1007/978-3-319-39955-3_39