Intentional context in situated natural language learning

Abstract

Natural language interfaces designed for situationally embedded domains (e.g., cars, video games) must incorporate knowledge about the users' context to address the many ambiguities of situated language use. We introduce a model of situated language acquisition that operates in two phases. First, intentional context is represented and inferred from user actions using probabilistic context-free grammars. Then, utterances are mapped onto this representation in a noisy channel framework. The acquisition model is trained on unconstrained speech collected from subjects playing an interactive game, and tested on an understanding task. © 2005 Association for Computational Linguistics.
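The noisy channel framing described in the abstract admits a compact statement; the following is a minimal sketch of the standard noisy-channel decision rule, not necessarily the paper's exact parameterization. Given an utterance u, the understood intentional context $\hat{c}$ is

$$\hat{c} = \arg\max_{c} P(c \mid u) = \arg\max_{c} P(u \mid c)\, P(c)$$

where the prior $P(c)$ over intentional contexts would come from the probabilistic context-free grammar inferred over user actions, and the channel model $P(u \mid c)$ would be estimated from the unconstrained game speech. The denominator $P(u)$ is constant across candidate contexts and can be dropped from the maximization.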

Citation

Fleischman, M., & Roy, D. (2005). Intentional context in situated natural language learning. In CoNLL 2005 - Proceedings of the Ninth Conference on Computational Natural Language Learning (pp. 104–111). Association for Computational Linguistics (ACL). https://doi.org/10.3115/1706543.1706562
