A Meta-Learning Approach to Generating Functional Descriptions of Graphical User Interfaces

Abstract

Graphical user interfaces (GUIs) are important software engineering artifacts that commonly connect end users with software apps to fulfill requirements. While previous work extracts metadata to build a GUI's search profile and exploits deep learning to caption a GUI, little is known about how to capture a GUI's essential functionality. We narrow this gap by presenting a novel approach that leverages few-shot prompting and meta-learning. In particular, we devise four attributes of a GUI to instrument the meta-learner, which is then integrated with the few-shot learner through an iteration mechanism. We comprehensively evaluate our approach on a GUI description dataset and a GUI tracing dataset. The results show that Qwen is a viable implementation option for our approach, and that our approach outperforms the baseline prompting methods.
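The abstract describes an iteration mechanism in which a meta-learner, instrumented with four GUI attributes, is interleaved with a few-shot learner. The sketch below shows one plausible shape of such a loop; it is not the authors' implementation, and every name in it (the attribute keys, the prompt templates, and the query_llm callable standing in for a model such as Qwen) is a hypothetical placeholder.

```python
# Minimal sketch of a few-shot + meta-learning loop, assuming the meta-learner
# critiques each draft against four GUI attributes and feeds guidance back
# into the next few-shot prompt. All names are illustrative placeholders.
from typing import Callable

# Hypothetical stand-ins for the paper's four GUI attributes.
ATTRIBUTE_KEYS = ["attr_1", "attr_2", "attr_3", "attr_4"]

def build_few_shot_prompt(exemplars, gui, guidance):
    # Exemplar (GUI metadata, description) pairs form the few-shot context.
    shots = "\n\n".join(
        f"GUI metadata: {ex_gui}\nFunctional description: {ex_desc}"
        for ex_gui, ex_desc in exemplars
    )
    return (
        f"{shots}\n\n"
        f"Guidance from previous iteration: {guidance or 'none'}\n"
        f"GUI metadata: {gui}\n"
        f"Functional description:"
    )

def build_meta_prompt(attributes, draft):
    # Ask the model to judge the draft against the GUI attributes and
    # return guidance for the next revision.
    return (
        f"GUI attributes: {attributes}\n"
        f"Draft description: {draft}\n"
        f"Point out which attributes the draft misses or misstates, "
        f"as guidance for the next revision."
    )

def describe_gui(gui: dict,
                 exemplars: list[tuple[dict, str]],
                 query_llm: Callable[[str], str],
                 n_iterations: int = 3) -> str:
    """Alternate few-shot generation with attribute-based meta feedback."""
    attributes = {k: gui.get(k) for k in ATTRIBUTE_KEYS}
    guidance, description = "", ""
    for _ in range(n_iterations):
        # Few-shot learner: exemplars plus the current guidance.
        description = query_llm(build_few_shot_prompt(exemplars, gui, guidance))
        # Meta-learner: critique the draft and produce guidance.
        guidance = query_llm(build_meta_prompt(attributes, description))
    return description
```

A caller would supply exemplar pairs drawn from a GUI description dataset and a concrete query_llm that wraps the chosen model endpoint.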

Citation (APA)

Iluru, N. M., Niu, N., Yang, Y., & Wang, Y. (2025). A Meta-Learning Approach to Generating Functional Descriptions of Graphical User Interfaces. In Proceedings - 2025 IEEE International Conference on Information Reuse and Integration and Data Science, IRI 2025 (pp. 7–12). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IRI66576.2025.00010
