Representation of relations by planes in neural network language model

Abstract

Whole brain architecture (WBA), which uses neural networks to imitate the human brain, is attracting increased attention as a promising way to achieve artificial general intelligence, and distributed vector representations of words are becoming recognized as the best way to connect neural networks with knowledge. Distributed representations of words play a wide range of roles in natural language processing, and they have become increasingly important because of their ability to capture syntactic and lexical meanings and relationships. Relation vectors are commonly used to represent relations between words, but this approach has problems: some relations, such as sibling relations, parent-child relations, and many-to-one relations, cannot be easily defined by a single vector. To deal with these problems, we propose a novel way of representing relations: we represent relations by planes instead of by vectors, which increases the accuracy of relation prediction by more than 10%.
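The contrast between the two representations can be sketched in code. The following NumPy snippet is only an illustration of the general idea, not the paper's actual construction: the toy embeddings, the plane fitting (centroid plus top singular directions of the pair offsets), and the distance-based scoring are all assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy embeddings standing in for trained word vectors; the words,
# dimensionality, and values here are illustrative, not from the paper.
dim = 50
words = ["paris", "france", "rome", "italy", "tokyo", "japan",
         "berlin", "germany"]
emb = {w: rng.normal(size=dim) for w in words}

# Relation as a single vector (the classic offset approach):
# every pair in the relation is forced onto one shared offset.
rel_vec = emb["france"] - emb["paris"]

# Relation as a plane: fit a low-dimensional plane to the offsets of
# several known pairs (here via SVD over the centered offsets).
train_pairs = [("paris", "france"), ("rome", "italy"), ("tokyo", "japan")]
offsets = np.stack([emb[b] - emb[a] for a, b in train_pairs])
centroid = offsets.mean(axis=0)
_, _, vt = np.linalg.svd(offsets - centroid)
basis = vt[:2]  # two directions spanning the plane

def distance_to_plane(offset):
    """Residual norm after projecting an offset onto the fitted plane."""
    centered = offset - centroid
    projected = basis.T @ (basis @ centered)
    return float(np.linalg.norm(centered - projected))

# Score a candidate pair by how close its offset lies to the plane:
# a small distance suggests the pair plausibly belongs to the relation.
print(distance_to_plane(emb["germany"] - emb["berlin"]))
```

Under this reading, a many-to-one relation need not collapse onto a single offset: the offsets of different pairs can all lie close to the same plane even when no single vector fits them all.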

Citation (APA)

Ebisu, T., & Ichise, R. (2016). Representation of relations by planes in neural network language model. In Lecture Notes in Computer Science (Vol. 9947, pp. 300–307). Springer. https://doi.org/10.1007/978-3-319-46687-3_33
