Functional Distributional Semantics is a linguistically motivated framework for modelling lexical and sentence-level semantics with truth-conditional functions, using distributional information. Previous implementations of the framework focus on subject-verb-object (SVO) triples only, which largely limits the contextual information available for training and thus the capability of the learnt model. In this paper, we discuss the challenges of extending the previous architectures to training on arbitrary sentences. We address these challenges by proposing a more expressive lexical model that works over a continuous semantic space. This improves the flexibility and computational efficiency of the model, as well as its compatibility with present-day machine-learning frameworks. Our proposal allows the model to be applied to a wider range of semantic tasks, and experimental results demonstrate improved performance.
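As an illustrative sketch only (not the authors' implementation), the core idea can be pictured as follows: a word's meaning is a truth-conditional function, i.e. a probabilistic classifier that maps a point in a continuous semantic space (representing an entity) to the probability that the word is true of that entity. The names, dimensionality, and logistic form below are assumptions made purely for illustration.

```python
# Minimal sketch: a lexical truth-conditional function over a continuous
# semantic space. All choices here (linear classifier, sigmoid, dimension 8)
# are illustrative assumptions, not the model described in the paper.

import numpy as np


def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + np.exp(-x))


class LexicalFunction:
    """Truth-conditional function for one word: entity vector -> P(word is true of entity)."""

    def __init__(self, dim: int, rng: np.random.Generator):
        # Parameters of a simple linear classifier over the semantic space.
        self.weights = rng.normal(scale=0.1, size=dim)
        self.bias = 0.0

    def truth_probability(self, entity: np.ndarray) -> float:
        # Probability that the word truthfully describes this point in the space.
        return float(sigmoid(self.weights @ entity + self.bias))


rng = np.random.default_rng(0)
dim = 8                              # assumed dimensionality of the semantic space
dog = LexicalFunction(dim, rng)      # truth-conditional function for the word "dog"
entity = rng.normal(size=dim)        # a point in the continuous semantic space
print(f"P('dog' is true of entity) = {dog.truth_probability(entity):.3f}")
```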
Lo, C. H., Cheng, H., Lam, W., & Emerson, G. (2023). Functional Distributional Semantics at Scale. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 423–436). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.starsem-1.37