Graphical user interfaces are usually first sketched by hand and must then be realized by software developers as prototypes or usable user interfaces. This motivates our proposal of a smart CASE tool that can understand hand-drawn sketches of graphical user interfaces, including forms and their navigation, and automatically transform such draft designs into real user interfaces for a prototype or an application. Using the ideas of modeling and model transformation from model-driven engineering, the authors also propose a mechanism to generate graphical user interfaces as forms targeting different platforms. Experimental results show that our sketch recognition of hand-drawn graphical user interfaces achieves accuracies of 97.86% in recognizing 7 common UI controls and 95% in recognizing arrows for navigation. Our model transformation engine can generate user interfaces as forms for applications on 3 mobile platforms: Windows Phone, Android, and iOS. This approach follows the trend toward a new generation of smart CASE tools that can understand and interpret conceptual software design models into concrete software elements and components, assisting the software development process in a natural way. © 2014 Springer International Publishing.
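The model-driven generation step described above can be sketched as a toy transformation: a single abstract form model is mapped by per-platform generators to platform-specific markup. This is an illustrative sketch only, assuming a minimal hypothetical UI model; the control names, generator functions, and emitted markup are not the authors' actual engine or templates.

```python
# Toy model-to-code transformation in the spirit of model-driven
# engineering: one abstract Form model, one generator per platform.
# All names and templates here are hypothetical illustrations.

from dataclasses import dataclass, field

@dataclass
class Control:
    kind: str   # e.g. "Button", "TextBox", "Label"
    name: str

@dataclass
class Form:
    title: str
    controls: list = field(default_factory=list)

def to_windows_phone(form: Form) -> str:
    # Map each abstract control to a XAML-like element for Windows Phone.
    body = "\n".join(f'  <{c.kind} x:Name="{c.name}" />' for c in form.controls)
    return (f'<phone:PhoneApplicationPage Title="{form.title}">\n'
            f'{body}\n</phone:PhoneApplicationPage>')

def to_android(form: Form) -> str:
    # The same abstract model targets Android layout XML; note the
    # platform-specific renaming of control kinds (TextBox -> EditText).
    tags = {"Button": "Button", "TextBox": "EditText", "Label": "TextView"}
    body = "\n".join(f'  <{tags[c.kind]} android:id="@+id/{c.name}" />'
                     for c in form.controls)
    return f'<LinearLayout android:label="{form.title}">\n{body}\n</LinearLayout>'

# One model, two platform renderings.
login = Form("Login", [Control("TextBox", "username"),
                       Control("Button", "submit")])
print(to_android(login))
print(to_windows_phone(login))
```

The point of the sketch is the separation the abstract relies on: the recognized sketch populates one platform-independent model, and each target platform only needs its own generator, so adding an iOS target would mean adding one more function, not re-recognizing the drawing.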
CITATION STYLE
Nguyen, V. T., Tran, M. T., & Duong, A. D. (2014). Picture-driven user interface development for applications on multi-platforms. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8510 LNCS, pp. 350–360). Springer Verlag. https://doi.org/10.1007/978-3-319-07233-3_33