Generalized relations in linguistics and cognition

Abstract

Categorical compositional models of natural language exploit grammatical structure to calculate the meaning of sentences from the meanings of individual words. This approach outperforms conventional techniques for some standard NLP tasks. More recently, similar compositional techniques have been applied to conceptual space models of cognition. Compact closed categories, particularly the category of finite dimensional vector spaces, have been the most common setting for categorical compositional models. When addressing a new problem domain, such as conceptual space models of meaning, a key problem is finding a compact closed category that captures the features of interest. We propose categories of generalized relations as a source of new, practical models for cognition and NLP. We demonstrate with detailed examples that phenomena such as fuzziness, metrics, convexity, semantic ambiguity and meaning that varies with context can all be described by relational models. Crucially, by exploiting a technical framework described in previous work of the authors, we also show how multiple features can be combined into a single model, providing a flexible family of new categories for categorical compositional modelling.
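To make the relational idea concrete, here is a minimal sketch (not taken from the paper) of one of the generalized-relation models the abstract mentions: fuzzy relations, composed with the standard max-min rule (R ; S)(x, z) = max_y min(R(x, y), S(y, z)). All names and degree values below are illustrative assumptions, not the authors' examples.

```python
def compose(R, S):
    """Max-min composition of fuzzy relations, each given as a dict
    mapping (source, target) pairs to membership degrees in [0, 1]."""
    result = {}
    for (x, y1), r in R.items():
        for (y2, z), s in S.items():
            if y1 == y2:
                # Degree of the path x -> y -> z is the weaker link;
                # take the best path over all intermediate y.
                d = min(r, s)
                result[(x, z)] = max(result.get((x, z), 0.0), d)
    return result

# Toy data: degrees to which a word evokes a concept, and a concept
# exhibits a property (hypothetical numbers, for illustration only).
word_to_concept = {("bank", "river"): 0.4, ("bank", "finance"): 0.9}
concept_to_property = {("river", "wet"): 1.0, ("finance", "wet"): 0.0}

print(compose(word_to_concept, concept_to_property))
# -> {('bank', 'wet'): 0.4}  (only the river sense supports "wet")
```

Ordinary (crisp) relations are recovered by restricting degrees to {0, 1}, in which case max-min composition reduces to the usual relational composition.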

Citation (APA)

Coecke, B., Genovese, F., Lewis, M., & Marsden, D. (2017). Generalized relations in linguistics and cognition. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10388 LNCS, pp. 256–270). Springer Verlag. https://doi.org/10.1007/978-3-662-55386-2_18
