An Assessment Framework for DialPort


Abstract

Collecting a large amount of real human-computer interaction data across varied domains is a cornerstone of developing better data-driven spoken dialog systems. The DialPort project is creating a portal that collects a constant stream of real-user conversational data on a variety of topics. To keep real users attracted to DialPort, it is crucial to develop a robust evaluation framework that monitors and maintains high performance. Unlike earlier spoken dialog systems, DialPort gathers a heterogeneous set of spoken dialog systems under one outward-facing agent. To assess this new structure, we have identified unique challenges that DialPort must meet in order to appeal to real users, and we have created a novel evaluation scheme that quantitatively assesses system performance in these situations. We consider assessment from the point of view of the system developer as well as that of the end user.


Citation (APA)

Lee, K., Zhao, T., Ultes, S., Rojas-Barahona, L., Pincus, E., Traum, D., & Eskenazi, M. (2019). An assessment framework for DialPort. In Lecture Notes in Electrical Engineering (Vol. 510, pp. 79–85). Springer Verlag. https://doi.org/10.1007/978-3-319-92108-2_10
