Shortcut sequence tagging

Abstract

Deep stacked RNNs are usually hard to train. Recent studies have shown that shortcut connections across different RNN layers bring substantially faster convergence. However, shortcuts increase the computational complexity of the recurrent computations. To reduce this complexity, we propose the shortcut block, a refinement of the shortcut LSTM block. Our approach is to replace the self-connected parts (C_t^l) with shortcuts (h_t^{l-2}) in the internal states. We present extensive empirical experiments showing that this design performs better than the original shortcuts. We evaluate our method on the CCG supertagging task, obtaining an 8% relative improvement over the current state-of-the-art results.
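The core modification can be sketched in a few lines. The following is a minimal NumPy illustration of the idea described above, assuming the standard LSTM gate equations and assuming that "replacing the self-connected parts (C_t^l) with shortcuts (h_t^{l-2})" means feeding the forget gate the hidden state from two layers below instead of the layer's own previous cell state; all function and variable names here are illustrative, not the authors' code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def shortcut_lstm_step(x, h_prev, h_below2, params):
    """One step of a hypothetical 'shortcut block' LSTM layer (a sketch).

    In a standard LSTM the cell state is self-connected:
        c_t^l = f * c_{t-1}^l + i * g
    Under the interpretation assumed here, the shortcut block replaces
    the self-connected term c_{t-1}^l with the shortcut h_t^{l-2},
    the hidden state from two layers below:
        c_t^l = f * h_t^{l-2} + i * g
    """
    Wi, Wf, Wo, Wg = params            # each maps concat([x, h_prev]) -> hidden
    z = np.concatenate([x, h_prev])
    i = sigmoid(Wi @ z)                # input gate
    f = sigmoid(Wf @ z)                # forget gate
    o = sigmoid(Wo @ z)                # output gate
    g = np.tanh(Wg @ z)                # candidate update
    c = f * h_below2 + i * g           # shortcut replaces c_{t-1}^l
    h = o * np.tanh(c)
    return h, c
```

Note that with this change the layer no longer carries its own cell state across time steps, which is one plausible reading of how the refinement reduces the recurrent computation relative to shortcut LSTMs that keep both the self-connection and the shortcut.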

Citation (APA)

Wu, H., Zhang, J., & Zong, C. (2018). Shortcut sequence tagging. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10619 LNAI, pp. 196–207). Springer Verlag. https://doi.org/10.1007/978-3-319-73618-1_17
