Recursive Neural Networks applied to Discourse Representation Theory


Abstract

Connectionist semantic modeling in natural language processing (a typical symbolic domain) is still a challenging problem. This paper introduces a novel technique, combining Discourse Representation Theory (DRT) with Recursive Neural Networks (RNNs), in order to yield a neural model capable of discovering properties and relationships among constituents of a knowledge base expressed by natural language sentences. DRT transforms sequences of sentences into directed ordered acyclic graphs, while RNNs are trained to deal with such structured data. The acquired information allows the network to answer questions whose answers are not directly expressed in the knowledge base. A simple experimental demonstration, drawn from the context of a fairy tale, is presented. Finally, ongoing research directions are pointed out. © Springer-Verlag Berlin Heidelberg 2002.
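The core idea described in the abstract, processing a directed ordered acyclic graph bottom-up with a recursive network so that each node's state summarizes its ordered children, can be sketched as follows. This is a minimal illustration under assumed dimensions, weight shapes, and a toy graph; none of these names or values come from the paper itself.

```python
import numpy as np

# Hypothetical dimensions and maximum out-degree, chosen for illustration.
rng = np.random.default_rng(0)
STATE_DIM, LABEL_DIM, MAX_CHILDREN = 4, 3, 2

W_label = rng.standard_normal((STATE_DIM, LABEL_DIM)) * 0.1
# One weight matrix per ordered child position (the graph is *ordered*,
# so the first and second child play distinct roles).
W_child = [rng.standard_normal((STATE_DIM, STATE_DIM)) * 0.1
           for _ in range(MAX_CHILDREN)]
b = np.zeros(STATE_DIM)

def node_state(label_vec, child_states):
    """State of a node from its label and the states of its ordered children."""
    h = W_label @ label_vec + b
    for pos, child in enumerate(child_states):
        h += W_child[pos] @ child
    return np.tanh(h)

# Toy DOAG: node -> list of ordered children ("a" is shared, making it a
# DAG rather than a tree); labels are random placeholder encodings.
labels = {n: rng.standard_normal(LABEL_DIM) for n in "abcd"}
children = {"a": [], "b": [], "c": ["a", "b"], "d": ["c", "a"]}

# Visit nodes in topological order (children before parents).
states = {}
for n in ["a", "b", "c", "d"]:
    states[n] = node_state(labels[n], [states[c] for c in children[n]])

root_state = states["d"]  # the root's state summarizes the whole graph
```

In a trained setting, the weights would be learned so that `root_state` supports the target prediction (e.g., answering a question about the knowledge base); here they are random, and the point is only the bottom-up, order-sensitive recursion over the graph.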

Citation (APA)

Bua, A., Gori, M., & Santini, F. (2002). Recursive Neural Networks applied to Discourse Representation Theory. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2415 LNCS, pp. 290–295). Springer Verlag. https://doi.org/10.1007/3-540-46084-5_48
