KeyTch: Combining the keyboard with a touchscreen for rapid command selection on toolbars

Abstract

Many of today's most used desktop applications (e.g. the Office or Adobe suites) rely on the use of simple or multi-level toolbars (also called ribbons, i.e. a set of toolbars placed on several tabs), usually located on one edge of the application window. However, such widgets occupy screen real estate and, more importantly, traditional interaction with toolbars requires users to move the mouse pointer from the working object (e.g. a text document in Word) to the toolbar area, and then come back to the working object to continue their initial task. This is referred to as the object-to-command transition [15] and takes valuable time, breaking the interaction flow [1, 17]. Keyboard shortcuts are an interesting alternative to toolbars but require users to memorize the shortcuts [30], leading users to employ them only for the most frequently used commands [23]. Contextual menus, such as Marking Menus [25], are not always visible and can only contain a subset of the toolbar items. Novel gestures on or around the keyboard (e.g. FingerChord [44], HotStrokes [12]) can facilitate access to commands but require users to learn and memorize the gestures. Finally, augmenting current input devices or designing new ones can enhance command selection (e.g. LensMouse [42], RPM [36], and TDK [5]), yet this requires users to upgrade their regular mouse-keyboard desktop environments. In this work, we propose a novel solution for selecting items on simple or multi-level toolbars, called KeyTch (pronounced 'Keetch'), which is based on the combination of the KEYboard with a TouCHscreen affixed to it (a smartphone in our implementation). With our solution, users select commands on a toolbar displayed on the touchscreen, through gestures combining key presses and screen touches. Our solution offers several advantages, as it 1) reduces the object-to-command transition, since the mouse pointer stays on the working object; 2) saves screen real estate by deporting the toolbar to the secondary touchscreen; and 3) employs an input device that is already at the user's disposal, i.e. a regular smartphone. In a preliminary study, we first validate the screen reachability of the gestures combining a key press and a touch on the touchscreen. Then, we compare different interaction techniques using KeyTch with a mouse-based baseline for command selection. In a first user study, we focus on one-level toolbars and explore whether interaction with the touchscreen should be based on direct or indirect pointing. A second study focuses on the performance of KeyTch interaction techniques for selecting commands on a two-level toolbar. In two follow-up studies we further investigate the interaction flow of our techniques as well as their performance without quasi-mode. Finally, we present design guidelines and an adaptation of the MS Word ribbon on KeyTch. Our contributions include: 1) the design of interaction techniques combining a smartphone and a keyboard for command selection; 2) a preliminary study demonstrating that 80% of a 5.8-inch smartphone screen can be reached comfortably while using the keyboard; 3) two user studies establishing the possibility to select up to 720 items with KeyTch more efficiently than with a mouse (the mouse requiring 30% more time); 4) two follow-up studies confirming the advantages of KeyTch techniques when considering the user's interaction flow.
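
To make the key-plus-touch combination concrete, the sketch below shows one way such a quasi-mode could be wired up in a web-based prototype: holding a keyboard key arms a toolbar tab, and a touch position forwarded from the phone is mapped to the command cell under the finger. The grid layout, command names, key bindings and the onPhoneTouch plumbing are illustrative assumptions, not the authors' implementation.

```typescript
// Minimal sketch (assumed design, not the paper's code): a quasi-mode where
// holding a keyboard key arms one toolbar tab on the phone, and a touch on
// the phone screen selects the command under the finger.

type Command = { id: string; label: string };

// Hypothetical two-level toolbar: each tab key maps to a grid of commands.
const TOOLBAR: Record<string, Command[][]> = {
  KeyF: [
    [{ id: "bold", label: "Bold" }, { id: "italic", label: "Italic" }],
    [{ id: "underline", label: "Underline" }, { id: "strike", label: "Strikethrough" }],
  ],
  // ...further tabs bound to other keys
};

let armedTab: string | null = null; // quasi-mode: set only while a tab key is held

window.addEventListener("keydown", (e) => {
  if (e.code in TOOLBAR) armedTab = e.code; // enter the quasi-mode
});
window.addEventListener("keyup", (e) => {
  if (e.code === armedTab) armedTab = null; // releasing the key leaves the mode
});

// Called with normalized touch coordinates (0..1) forwarded from the phone.
function onPhoneTouch(x: number, y: number): Command | null {
  if (!armedTab) return null;                // no key held: touch is ignored
  const grid = TOOLBAR[armedTab];
  const row = Math.min(grid.length - 1, Math.floor(y * grid.length));
  const col = Math.min(grid[row].length - 1, Math.floor(x * grid[row].length));
  return grid[row][col];                     // command under the finger
}

// Example: with KeyF held, a touch in the lower-right quadrant selects "Strikethrough".
```

This corresponds to a direct-pointing variant, where the touch location selects the cell under the finger; the indirect variant studied in the paper would instead drive a cursor over the toolbar by relative finger displacement before validating the selection.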

Citation (APA)

Keddisseh, E., Serrano, M., & Dubois, E. (2021). KeyTch: Combining the keyboard with a touchscreen for rapid command selection on toolbars. In Conference on Human Factors in Computing Systems - Proceedings. Association for Computing Machinery. https://doi.org/10.1145/3411764.3445288
