Continual Adaptation for Efficient Machine Communication


Abstract

To communicate with new partners in new contexts, humans rapidly form new linguistic conventions. Recent neural language models are able to comprehend and produce the existing conventions present in their training data, but are not able to flexibly and interactively adapt those conventions on the fly as humans do. We introduce an interactive repeated reference task as a benchmark for models of adaptation in communication and propose a regularized continual learning framework that allows an artificial agent initialized with a generic language model to more accurately and efficiently communicate with a partner over time. We evaluate this framework through simulations on COCO and in real-time reference game experiments with human partners.
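The regularized continual learning idea described in the abstract can be illustrated with a minimal sketch (this is an illustrative toy, not the authors' implementation): after each interaction, the agent's parameters are updated on partner-specific data while an L2 penalty anchors them to the generic pretrained parameters, trading off adaptation against forgetting.

```python
# Hypothetical sketch of regularized continual adaptation (not the paper's code).
# An agent starts from "generic" parameters theta0 and adapts to a partner after
# each interaction; an L2 penalty toward theta0 discourages drifting too far
# from the generic conventions (a stand-in for the paper's regularization).

def adapt(theta, theta0, grad_loss, lam=0.5, lr=0.1, steps=50):
    """Gradient descent on partner_loss(theta) + lam * ||theta - theta0||^2."""
    for _ in range(steps):
        g = [grad_loss(theta)[i] + 2 * lam * (theta[i] - theta0[i])
             for i in range(len(theta))]
        theta = [theta[i] - lr * g[i] for i in range(len(theta))]
    return theta

# Toy example: partner data pulls parameters toward `target`, while the
# regularizer anchors them near theta0. The minimizer of
# (theta - target)^2 + lam * (theta - theta0)^2 is
# (target + lam * theta0) / (1 + lam), i.e. 2/3 per dimension here.
theta0 = [0.0, 0.0]
target = [1.0, 1.0]
grad = lambda th: [2 * (th[i] - target[i]) for i in range(2)]

adapted = adapt(list(theta0), theta0, grad)  # each entry converges to ~0.667
```

Raising `lam` keeps the agent closer to its generic language model; lowering it lets the agent specialize more aggressively to the current partner.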

Citation (APA)

Hawkins, R. D., Kwon, M., Sadigh, D., & Goodman, N. D. (2020). Continual Adaptation for Efficient Machine Communication. In CoNLL 2020 - 24th Conference on Computational Natural Language Learning, Proceedings of the Conference (pp. 408–419). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.conll-1.33
