Structured Refinement for Sequential Labeling


Abstract

Filtering target-irrelevant information by hierarchically refining hidden states has been shown to be effective for obtaining informative representations. However, previous work relies solely on locally normalized attention, which ignores possible labels at other time steps and thus limits the capacity for modeling long-term dependency relations. In this paper, we propose to extend previous work with globally normalized attention, i.e., structured attention, to leverage structural information for more effective representation refinement. We also propose two implementation tricks that accelerate CRF computation and an initialization trick for Chinese character embeddings that further improves performance. We provide extensive experimental results on various datasets to demonstrate the effectiveness and efficiency of our proposed method.
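The contrast the abstract draws is between locally normalized attention (a per-step softmax over labels) and globally normalized scoring, where a linear-chain CRF sums over entire label sequences. The sketch below, with hypothetical score shapes and no connection to the authors' actual implementation, illustrates what global normalization means: the forward algorithm computes the log partition function over all label paths rather than normalizing each time step independently.

```python
import numpy as np

def crf_log_partition(emissions, transitions):
    """Log partition function of a linear-chain CRF via the forward
    algorithm. Unlike a per-step softmax, this normalizes over all
    label sequences jointly (global normalization).

    emissions:   (T, K) array of per-step label scores
    transitions: (K, K) array; transitions[i, j] scores label i -> j
    """
    alpha = emissions[0]  # forward scores over labels at t = 0
    for t in range(1, len(emissions)):
        # Sum (in log space) over the previous label for each current label.
        scores = alpha[:, None] + transitions + emissions[t][None, :]
        m = scores.max(axis=0)
        alpha = m + np.log(np.exp(scores - m).sum(axis=0))  # stable logsumexp
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())
```

For short sequences the result can be checked against brute-force enumeration of all label paths, which is how such dynamic-programming code is typically validated.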

Cite

CITATION STYLE

APA

Wang, Y., Shindo, H., Matsumoto, Y., & Watanabe, T. (2021). Structured Refinement for Sequential Labeling. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 1873–1884). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.164
