Leveraging Multi-head Attention Mechanism to Improve Event Detection

Abstract

The event detection (ED) task aims to automatically identify trigger words in unstructured text. In recent years, neural models with attention mechanisms have achieved great success on this task. However, existing attention methods tend to focus on meaningless context words and ignore semantically rich ones, which weakens their ability to recognize trigger words. In this paper, we propose MANN, a multi-head attention model enhanced by argument knowledge, to address these issues. The multi-head mechanism gives MANN the ability to detect a variety of information in a sentence, while argument knowledge acts as a supervisor to further improve the quality of attention. Experimental results show that our approach significantly outperforms existing attention-based models.
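
To make the multi-head encoding step concrete, the sketch below shows a minimal token-level trigger classifier built on PyTorch's nn.MultiheadAttention. This is not the authors' MANN implementation: the argument-knowledge supervision described in the abstract is omitted, and the class name, label count, and hyperparameters are illustrative assumptions only.

```python
# Minimal sketch (assumed setup, not the MANN model from the paper):
# encode a sentence with multi-head self-attention and classify each
# token into a hypothetical set of trigger labels.
import torch
import torch.nn as nn


class MultiHeadAttentionED(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, num_heads=8, num_labels=34):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # embed_dim is split across num_heads heads, so different heads can
        # attend to different kinds of contextual information.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)
        self.classifier = nn.Linear(embed_dim, num_labels)

    def forward(self, token_ids, pad_mask=None):
        x = self.embed(token_ids)                              # (batch, seq, dim)
        attended, weights = self.attn(x, x, x, key_padding_mask=pad_mask)
        h = self.norm(x + attended)                            # residual + norm
        return self.classifier(h), weights                     # per-token logits


# Toy usage: a batch of 2 sentences, 10 token ids each.
model = MultiHeadAttentionED(vocab_size=10000)
tokens = torch.randint(1, 10000, (2, 10))
logits, attn_weights = model(tokens)
print(logits.shape)        # torch.Size([2, 10, 34])
print(attn_weights.shape)  # torch.Size([2, 10, 10]), averaged over heads
```

In the paper's framing, the attention weights returned above are the quantity that argument knowledge would supervise, encouraging the heads to attend to semantically rich words rather than meaningless context.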

Citation (APA)

Tong, M., Xu, B., Hou, L., Li, J., & Wang, S. (2019). Leveraging Multi-head Attention Mechanism to Improve Event Detection. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11856 LNAI, pp. 268–280). Springer. https://doi.org/10.1007/978-3-030-32381-3_22
