A framework to develop adaptive multimodal dialog systems for Android-based mobile devices


Abstract

Mobile device programming has emerged as a new trend in software development. The major mobile operating system vendors provide APIs that allow developers to implement their own applications, including solutions for voice control. Android, the most popular platform among developers, offers libraries for building interfaces that combine graphical layouts with speech recognition and text-to-speech synthesis. Despite the usefulness of these classes, there are no defined strategies for multimodal interface development on Android, so developers create ad hoc solutions that make apps costly to implement and difficult to compare and maintain. In this paper we propose a framework that facilitates the software engineering life cycle of multimodal interfaces in Android. Our proposal integrates the facilities of the Android API into a modular architecture that emphasizes interaction management and context awareness to build sophisticated, robust, and maintainable applications. © 2014 Springer International Publishing.


APA

Griol, D., & Molina, J. M. (2014). A framework to develop adaptive multimodal dialog systems for Android-based mobile devices. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8480 LNAI, pp. 25–36). Springer Verlag. https://doi.org/10.1007/978-3-319-07617-1_3
