Intuitive User Interfaces for Real-Time Magnification in Augmented Reality


Abstract

There are various reasons why people want to magnify portions of their visually perceived surroundings, e.g., because objects are too far away or too small to see with the naked eye. Different technologies facilitate magnification, from telescopes to microscopes, using monocular or binocular designs. In particular, modern digital cameras capable of optical and/or digital zoom are very flexible, as their high-resolution imagery can be presented to users in real time through displays and interfaces that allow control over the magnification. In this paper, we present a novel design space of intuitive augmented reality (AR) magnifications in which an AR head-mounted display is used to present real-time magnified camera imagery. We present a user study evaluating and comparing different visual presentation methods and AR interaction techniques. Our results show distinct advantages for unimanual, bimanual, and situated AR magnification window interfaces, for near versus far vergence distances of the image presentation, and for five different user interfaces for specifying the scaling factor of the imagery.

Cite

CITATION STYLE

APA

Schubert, R., Bruder, G., & Welch, G. (2023). Intuitive User Interfaces for Real-Time Magnification in Augmented Reality. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST. Association for Computing Machinery. https://doi.org/10.1145/3611659.3615694
