Megamodeling and Metamodel-Driven Engineering for Plastic User Interfaces: MEGA-UI

  • Sottet J.-S.
  • Calvary G.
  • Favre J.-M.
  • Coutaz J.

Abstract

Models are not new in Human-Computer Interaction (HCI). Consider all the Model-Based Interface Design Environments (MB-IDEs) that emerged in the 1990s for generating User Interfaces (UIs) from more abstract descriptions. Unfortunately, the resulting poor usability killed the approach, burying models in HCI for a long time until new requirements sprang up, pushed by ubiquitous computing (e.g., the need for device independence). These requirements, bolstered by the large effort expended on Model-Driven Engineering (MDE) by the Software Engineering (SE) community, have brought models back to life in HCI. This paper draws on both the know-how in HCI and recent advances in MDE to address the challenge of engineering plastic UIs, i.e., UIs capable of adapting to their context of use (user, platform, environment) while preserving usability. Although most work so far has concentrated on the functional aspect of adaptation, this chapter focuses on usability. The point is to acknowledge the value of keeping a trace of the UI's design rationale at runtime, so that the system can reason about its own design when the context of use changes. As design transformations link together different perspectives on the same UI (e.g., users' tasks and the workspaces that spatially group items together), the paper argues for embedding, at runtime, a graph that depicts a UI from different perspectives while explaining its design rationale. This matches the notion of Megamodel as promoted in MDE. The first Megamodel was used to make explicit the relations between the core concepts of MDE: System, Model, Metamodel, Mapping, and Transformation. When transposed to HCI, the Megamodel gives rise to the notion of a Mega-UI, which makes it possible for the user (designer and/or end-user) to browse and/or control the system at different levels of abstraction (e.g., users' tasks, workspaces, interactors, code) and different levels of genericity (e.g., model, metamodel, meta-metamodel). So far, a first prototype (a rapid prototyping tool) has been implemented using general MDE tools (e.g., EMF, ATL), and the effort has been directed at the subset of the graph that links together different perspectives on the same UI, including its mapping onto the platform. Via an Extra-UI, the designer controls the UI's molding and distribution based on a library of self-explanatory transformations. Extra-UIs were previously called Meta-UIs, but as "Meta" is easily confused with the Meta prefix in MDE, we prefer the prefix "Extra" to stress that there is no change in the level of genericity. By contrast, the Meta-UI manipulates upper levels of genericity (Meta levels in MDE), making it possible for the user (designer and/or end-user) to observe and/or define languages for specifying UIs and Meta-UIs. Meta-UIs are the next step in our research agenda. The Mega-UI is the overall UI that encompasses UIs, Extra-UIs, and Meta-UIs.
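The core idea of the runtime graph can be illustrated with a small sketch. The following Java code is hypothetical (the actual prototype is built on EMF and ATL, whose APIs differ); it shows a megamodel-style graph in which elements at different perspectives on the same UI (task, workspace, interactor) are linked by mappings that carry the design rationale they enforce, so an adaptation engine could inspect why the UI looks the way it does when the context of use changes:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal, hypothetical sketch of a runtime megamodel graph.
// The real prototype relies on EMF/ATL; all names here are illustrative only.
public class MegamodelSketch {

    // A model element at a given perspective (e.g., task, workspace, interactor).
    record Element(String perspective, String name) {}

    // A mapping links elements across perspectives and keeps its design
    // rationale, so the system can reason about its own design at runtime.
    record Mapping(Element source, Element target, String rationale) {}

    static final List<Mapping> graph = new ArrayList<>();

    public static void main(String[] args) {
        Element bookTask   = new Element("task", "BookRoom");
        Element bookSpace  = new Element("workspace", "BookingPanel");
        Element dateWidget = new Element("interactor", "DatePicker");

        graph.add(new Mapping(bookTask, bookSpace,
                "Usability: group the items of one task in one workspace"));
        graph.add(new Mapping(bookSpace, dateWidget,
                "Usability: direct manipulation of dates on large screens"));

        // When the context of use changes (e.g., switching to a small screen),
        // an adaptation engine can walk the graph and re-examine each rationale.
        for (Mapping m : graph) {
            System.out.printf("%s/%s -> %s/%s because: %s%n",
                    m.source().perspective(), m.source().name(),
                    m.target().perspective(), m.target().name(),
                    m.rationale());
        }
    }
}
```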

Citation (APA)

Sottet, J.-S., Calvary, G., Favre, J.-M., & Coutaz, J. (2009). Megamodeling and Metamodel-Driven Engineering for Plastic User Interfaces: MEGA-UI (pp. 173–200). https://doi.org/10.1007/978-1-84800-907-3_8
