Mining Logical Event Schemas From Pre-Trained Language Models

Abstract

We present NESL (the Neuro-Episodic Schema Learner), an event schema learning system that combines large language models, FrameNet parsing, a powerful logical representation of language, and a set of simple behavioral schemas meant to bootstrap the learning process. In lieu of a pre-made corpus of stories, our dataset is a continuous feed of “situation samples” from a pre-trained language model, which are then parsed into FrameNet frames, mapped into simple behavioral schemas, and combined and generalized into complex, hierarchical schemas for a variety of everyday scenarios. We show that careful sampling from the language model can help emphasize stereotypical properties of situations and de-emphasize irrelevant details, and that the resulting schemas specify situations more comprehensively than those learned by other systems.
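The pipeline described above — sampling situation texts from a language model, parsing them into FrameNet frames, mapping the frames onto simple behavioral schemas, and generalizing across samples into a scenario schema — can be illustrated with a small sketch. The Python below is a hypothetical outline only: every function (sample_situations, parse_frames, map_to_schema, generalize) and data class is an illustrative placeholder, not the authors' code and not the API of any real language-model or FrameNet parsing library.

# Illustrative sketch of the NESL-style pipeline stages (all names hypothetical).
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class Frame:
    # Minimal stand-in for a parsed FrameNet frame: a frame name plus role fillers.
    name: str
    roles: tuple  # ((role, filler), ...) pairs, kept hashable for counting


@dataclass
class Schema:
    # A simple behavioral schema: a header and an ordered list of steps.
    header: str
    steps: list = field(default_factory=list)


def sample_situations(topic, n):
    # Stand-in for careful sampling of "situation samples" from a pre-trained LM.
    return [f"A short story about {topic}, sample {i}" for i in range(n)]


def parse_frames(text):
    # Stand-in for FrameNet parsing of one situation sample.
    return [Frame(name="Motion", roles=(("Theme", "the dog"), ("Path", "the park")))]


def map_to_schema(frames):
    # Stand-in for mapping parsed frames onto simple bootstrap behavioral schemas.
    schema = Schema(header="walk_dog")
    for f in frames:
        schema.steps.append((f.name, f.roles))
    return schema


def generalize(schemas):
    # Toy generalization: keep only steps that recur in a majority of samples,
    # de-emphasizing idiosyncratic details and keeping stereotypical ones.
    counts = Counter(step for s in schemas for step in s.steps)
    common = [step for step, c in counts.items() if c > len(schemas) // 2]
    return Schema(header="generalized_scenario", steps=common)


if __name__ == "__main__":
    samples = sample_situations("walking a dog", n=5)
    protoschemas = [map_to_schema(parse_frames(s)) for s in samples]
    print(generalize(protoschemas))

The sketch only conveys the flow of data between the stages; in the system described by the abstract, generalization operates over a rich logical representation of language and produces complex, hierarchical schemas rather than the flat step tuples used here.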

Citation (APA)
Lawley, L., & Schubert, L. (2022). Mining Logical Event Schemas From Pre-Trained Language Models. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 332–345). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-srw.25