Tailoring user interfaces to include gesture-based interaction with gestUI

Citations: 1 · Readers (Mendeley): 6

Abstract

The development of custom gesture-based user interfaces requires software engineers to be skilled in the tools and languages needed to implement them. gestUI, a model-driven method, helps them by supporting the definition of custom gestures and the inclusion of gesture-based interaction in existing user interfaces. Until now, gestUI has used a single gesture catalogue for all users of a software system, with gestures that could not subsequently be redefined. In this paper, we extend gestUI by including a user profile in the metamodel that permits individual users to define custom gestures and to include gesture-based interaction in user interfaces. Using tailoring mechanisms, each user can redefine their custom gestures at software runtime. Although both features are supported by models, the gestUI tool hides this technical complexity from the users. We validated these gestUI features through a technical action research study in an industrial context. The results showed that the features were perceived as both useful and easy to use when defining/redefining custom gestures and including them in a user interface.
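The abstract's core idea — a per-user profile holding a gesture catalogue that can be redefined at runtime — can be illustrated with a minimal sketch. The class and method names below (`UserProfile`, `define`, `redefine`) are hypothetical illustrations, not the actual gestUI metamodel, which the paper defines in model-driven terms:

```python
from dataclasses import dataclass, field

@dataclass
class Gesture:
    name: str              # the action the gesture is bound to, e.g. "save"
    strokes: list          # point sequences describing the gesture's shape

@dataclass
class UserProfile:
    """Hypothetical per-user gesture catalogue, loosely mirroring the
    user-profile extension described in the abstract."""
    user_id: str
    catalogue: dict = field(default_factory=dict)  # gesture name -> Gesture

    def define(self, gesture: Gesture) -> None:
        # each user maintains their own catalogue of custom gestures
        self.catalogue[gesture.name] = gesture

    def redefine(self, name: str, strokes: list) -> None:
        # runtime tailoring: replace the stroke definition while
        # keeping the gesture's binding to its action
        self.catalogue[name].strokes = strokes

# One user defines a "save" gesture, then reshapes it at runtime
alice = UserProfile("alice")
alice.define(Gesture("save", [[(0, 0), (1, 1)]]))
alice.redefine("save", [[(0, 0), (0, 1), (1, 1)]])
```

The point of the sketch is that redefinition mutates only the user's own catalogue entry, so other users' catalogues (and the gesture-to-action bindings) are untouched — the tailoring behaviour the abstract attributes to the user-profile extension.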

Citation (APA)

Parra, O., España, S., & Pastor, O. (2016). Tailoring user interfaces to include gesture-based interaction with gestUI. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9974 LNCS, pp. 496–504). Springer Verlag. https://doi.org/10.1007/978-3-319-46397-1_38
