Learning tag dependencies for sequence tagging


Abstract

Sequence tagging underlies many applications in natural language processing. Despite successes in learning long-range token dependencies with neural networks, tag dependencies have rarely been considered. Sequence tagging in fact involves complex dependencies and interactions among the input tokens and the output tags. We propose a novel multi-channel model that handles different ranges of token-tag dependencies and their interactions simultaneously. A tag LSTM is introduced to model the output tag dependencies and word-tag interactions, and three mechanisms are presented to efficiently incorporate the token context representation and the tag dependency. Extensive experiments on part-of-speech tagging and named entity recognition show that the proposed model outperforms the BiLSTM-CRF baseline by effectively incorporating the tag dependency feature.
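To make the notion of tag dependencies concrete: the BiLSTM-CRF baseline mentioned above captures first-order tag dependencies through a learned tag-transition matrix, decoded with the Viterbi algorithm. The following is a minimal numpy sketch of that decoding step with toy, hand-picked scores (not the paper's model or data); `emissions` stands in for per-token tag scores such as a BiLSTM would produce.

```python
import numpy as np

def viterbi(emissions, transitions):
    """Return the highest-scoring tag sequence.

    emissions:   (T, K) per-token tag scores (e.g., from a BiLSTM)
    transitions: (K, K) score of moving from tag i to tag j
    """
    T, K = emissions.shape
    score = emissions[0].copy()          # best score of a path ending in each tag
    back = np.zeros((T, K), dtype=int)   # backpointers
    for t in range(1, T):
        # cand[i, j] = best path ending in tag i at t-1, then tag j at t
        cand = score[:, None] + transitions + emissions[t][None, :]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # follow backpointers from the best final tag
    best = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        best.append(int(back[t, best[-1]]))
    return best[::-1]

# Toy example: 3 tokens, 2 tags (0 = O, 1 = ENT).
# Emissions slightly prefer tag 0 at the first token, but the
# transitions reward staying in ENT and penalize O -> ENT.
emis = np.array([[2.0, 1.0], [0.5, 1.0], [0.5, 1.0]])
trans = np.array([[0.5, -2.0], [0.0, 1.0]])
print(viterbi(emis, trans))  # -> [1, 1, 1]
```

Note how the tag-transition scores override the per-token preference at the first position: the all-ENT path scores 5.0 versus 4.0 for all-O, which is exactly the kind of output-side dependency that token-independent decoding would miss.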


APA

Zhang, Y., Chen, H., Zhao, Y., Liu, Q., & Yin, D. (2018). Learning tag dependencies for sequence tagging. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2018-July, pp. 4581–4587). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2018/637
