Learning from Missing Relations: Contrastive Learning with Commonsense Knowledge Graphs for Commonsense Inference


Abstract

Commonsense inference poses the unique challenge of reasoning about and generating the physical, social, and causal conditions of a given event. Existing approaches rely on commonsense transformers: large-scale language models trained on commonsense knowledge graphs. However, the limited coverage and expressive diversity of these graphs degrade the quality of the learned representations. In this paper, we focus on addressing missing relations in commonsense knowledge graphs and propose SOLAR, a novel contrastive learning framework. By contrasting sets of semantically similar and dissimilar events, SOLAR learns richer inferential knowledge than existing approaches. Empirical results demonstrate the efficacy of SOLAR on commonsense inference over diverse commonsense knowledge graphs. Specifically, SOLAR outperforms the state-of-the-art commonsense transformer on commonsense inference with ConceptNet by 1.84% on average across 8 automatic evaluation metrics. An in-depth analysis of SOLAR sheds light on the effects of the missing relations utilized in learning commonsense knowledge graphs.
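The abstract does not spell out SOLAR's training objective, so the following is only a rough, hypothetical sketch of the kind of contrastive learning it describes: pulling an event embedding toward semantically similar events and pushing it away from dissimilar ones. This is a generic InfoNCE-style loss in PyTorch; the function name `info_nce_loss`, the temperature value, and the positive/negative set sizes are illustrative assumptions, not the paper's formulation.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(anchor, positives, negatives, temperature=0.07):
    """InfoNCE-style contrastive loss over event embeddings (illustrative).

    anchor:    (d,)   embedding of a target event
    positives: (p, d) embeddings of semantically similar events
    negatives: (n, d) embeddings of dissimilar events
    """
    # Cosine similarity via dot products of L2-normalized vectors.
    anchor = F.normalize(anchor, dim=-1)
    positives = F.normalize(positives, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_sim = positives @ anchor / temperature  # (p,)
    neg_sim = negatives @ anchor / temperature  # (n,)

    # Each positive is contrasted against the full negative set:
    # the correct "class" (index 0) is the positive itself.
    logits = torch.cat(
        [pos_sim.unsqueeze(1), neg_sim.expand(pos_sim.size(0), -1)], dim=1
    )  # (p, 1 + n)
    labels = torch.zeros(pos_sim.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)

# Toy usage with random 128-dim event embeddings.
d = 128
loss = info_nce_loss(torch.randn(d), torch.randn(4, d), torch.randn(32, d))
```

Minimizing a loss of this shape draws the anchor event toward its similar events and away from dissimilar ones in embedding space, which matches the abstract's description of learning richer inferential representations from contrasted event sets.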

Citation (APA)

Jung, Y. H., Park, J. H., Choi, J. Y., Lee, M., Kim, J., Kim, K. M., & Lee, S. K. (2022). Learning from Missing Relations: Contrastive Learning with Commonsense Knowledge Graphs for Commonsense Inference. In Findings of the Association for Computational Linguistics: ACL 2022 (pp. 1514–1523). Association for Computational Linguistics. https://doi.org/10.18653/v1/2022.findings-acl.119
