DialCrowd: A toolkit for easy dialog system assessment

14 citations · 78 Mendeley readers

Abstract

When creating a dialog system, developers need to test each version to ensure that it is performing correctly. Recently, the trend has been to test on large datasets or to ask many users to try out a system. Crowdsourcing has solved the issue of finding users, but it presents new challenges, such as how to use a crowdsourcing platform and what type of test is appropriate. DialCrowd makes system assessment via crowdsourcing easier by providing tools, templates, and analytics. This paper describes the services that DialCrowd provides and how it works. It also describes a test of DialCrowd carried out by a group of dialog system developers.

Citation (APA)

Lee, K., Zhao, T., Black, A. W., & Eskenazi, M. (2018). DialCrowd: A toolkit for easy dialog system assessment. In Proceedings of the 19th Annual SIGdial Meeting on Discourse and Dialogue (pp. 245–248). Association for Computational Linguistics. https://doi.org/10.18653/v1/w18-5028
