UI designers look for inspirational examples among existing UI designs during the prototyping process. However, to edit the content of such an example or apply new styling, they must reconstruct it from scratch. Existing solutions attempt to turn UI screens into editable vector graphics using image segmentation techniques. In this research, we use deep learning and algorithms based on Gestalt laws to convert UI screens into editable blueprints by identifying the constituent UI element categories, their locations, dimensions, text content, and layout hierarchy. In this paper, we present a proof-of-concept web application that takes the UI screens and annotations from the RICO dataset and generates an editable blueprint vector graphic and a UI layout tree. With this research, we aim to support UX designers in reconstructing UI screens and in communicating UI layout information to developers.
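The abstract describes recovering, per UI element, a category, a location and dimension, text content, and a position in a layout hierarchy. As a hedged illustration only (the paper does not specify its data structures; the class and field names below are hypothetical), such a blueprint could be represented as a simple tree of elements:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class UIElement:
    """Hypothetical node of a blueprint layout tree, holding the
    per-element properties the abstract mentions."""
    category: str                    # e.g. "Button", "Text", "Image"
    bbox: tuple                      # (x, y, width, height) in pixels
    text: Optional[str] = None       # text content, if any
    children: List["UIElement"] = field(default_factory=list)

def count_elements(root: UIElement) -> int:
    """Count all nodes in the layout tree, including the root."""
    return 1 + sum(count_elements(c) for c in root.children)

# A toy screen: a container holding a text label and a button.
screen = UIElement("Container", (0, 0, 360, 640), children=[
    UIElement("Text", (16, 24, 328, 32), text="Welcome"),
    UIElement("Button", (16, 560, 328, 48), text="Sign in"),
])

print(count_elements(screen))  # → 3
```

A tree like this could then be serialized both as the editable vector graphic (drawing each bounding box) and as the layout hierarchy communicated to developers.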
Pandian, V. P. S., Suleri, S., & Jarke, M. (2020). Blu: What GUIs are made of. In International Conference on Intelligent User Interfaces, Proceedings IUI (pp. 81–82). Association for Computing Machinery. https://doi.org/10.1145/3379336.3381497