Abstract
Handwriting recognition is improving in leaps and bounds, and this opens up new opportunities for stylus-based interactions. In particular, note-taking applications can become more intelligent user interfaces, incorporating new features like autocomplete and integrated search. In this work we ran a gesture elicitation study, asking 21 participants to imagine how they would interact with a hypothetical, intelligent note-taking application. Participants were prompted to produce gestures for common actions such as select and delete, as well as actions that are less common for gesture interaction, such as autocomplete accept/reject, 'hide', and search. We report agreement on the elicited gestures, finding that while existing interactions (like double taps and long presses) were prevalent, a number of more novel interactions (like dragging selected items to hotspots or using annotations) were also well represented. We discuss the mental models participants drew on when explaining their gestures and the kind of feedback users might need to move to more stylus-centric interactions.
Citation
Gero, K. I., Chilton, L., Melancon, C., & Cleron, M. (2022). Eliciting Gestures for Novel Note-taking Interactions. In DIS 2022 - Proceedings of the 2022 ACM Designing Interactive Systems Conference: Digital Wellbeing (pp. 966–975). Association for Computing Machinery, Inc. https://doi.org/10.1145/3532106.3533480