Verb metaphor detection via contextual relation learning


Abstract

Correct natural language understanding requires computers to distinguish the literal and metaphorical senses of a word. Recent neural models have made progress on verb metaphor detection by treating it as sequence labeling. In this paper, we argue that it is more appropriate to view this task as relation classification between a verb and its various contexts. We propose the Metaphor-relation BERT (MrBERT) model, which explicitly models the relation between a verb and its grammatical, sentential and semantic contexts. We evaluate our method on the VUA, MOH-X and TroFi datasets, and it achieves competitive results compared with state-of-the-art approaches.
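To make the relation-classification framing concrete, the sketch below pairs a target verb with several of its contexts, in the spirit of how the abstract describes MrBERT pairing a verb with grammatical, sentential and semantic contexts. This is an illustrative toy, not the authors' implementation: the names `RelationInstance` and `extract_relations`, and the fixed subject/object indices, are all hypothetical.

```python
# Hypothetical sketch of casting verb metaphor detection as relation
# classification between a verb and its contexts (not the MrBERT code).
from dataclasses import dataclass
from typing import List

@dataclass
class RelationInstance:
    verb: str      # target verb whose sense is to be classified
    relation: str  # context type, e.g. grammatical subject/object or the full sentence
    context: str   # the paired context span

def extract_relations(sentence: List[str], verb_idx: int,
                      subj_idx: int, obj_idx: int) -> List[RelationInstance]:
    """Build (verb, context) relation instances for one target verb.
    A downstream classifier would score each pair as literal vs. metaphorical."""
    verb = sentence[verb_idx]
    return [
        RelationInstance(verb, "subject", sentence[subj_idx]),
        RelationInstance(verb, "object", sentence[obj_idx]),
        RelationInstance(verb, "sentence", " ".join(sentence)),
    ]

# "The committee devoured the report" -- 'devoured' is used metaphorically.
sent = ["The", "committee", "devoured", "the", "report"]
for pair in extract_relations(sent, verb_idx=2, subj_idx=1, obj_idx=4):
    print(pair.relation, "->", pair.verb, "/", pair.context)
```

In the actual model, each such pair would be encoded (e.g. with BERT) and the relation classified, rather than labeling every token in sequence.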

Citation (APA)

Song, W., Zhou, S., Fu, R., Liu, T., & Liu, L. (2021). Verb metaphor detection via contextual relation learning. In ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference (pp. 4240–4251). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.acl-long.327
