A lightweight recurrent network for sequence modeling


Abstract

Recurrent networks have achieved great success on various sequential tasks with the assistance of complex recurrent units, but suffer from severe computational inefficiency due to weak parallelization. One direction to alleviate this issue is to shift heavy computations outside the recurrence. In this paper, we propose a lightweight recurrent network, or LRN. LRN uses input and forget gates to handle long-range dependencies as well as gradient vanishing and explosion, with all parameter-related calculations factored outside the recurrence. The recurrence in LRN only manipulates the weight assigned to each token, tightly connecting LRN with self-attention networks. We apply LRN as a drop-in replacement for existing recurrent units in several neural sequential models. Extensive experiments on six NLP tasks show that LRN yields the best running efficiency with little or no loss in model performance.
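The abstract's key idea — move all parameter-related (matrix-multiply) computation outside the recurrence, leaving only cheap elementwise gating inside the loop — can be sketched as below. This is a minimal illustration, not the paper's exact formulation: the projection names (`Wi`, `Wf`, `Wv`) and the omission of any output nonlinearity are assumptions for exposition.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lrn_sketch(X, Wi, Wf, Wv):
    """Gated recurrence with all matmuls hoisted out of the time loop.

    X:  (T, d_in) input sequence
    Wi, Wf, Wv: (d_in, d_h) projection matrices (illustrative names)
    Returns (T, d_h) hidden states.
    """
    # Parameter-dependent work: computed for all timesteps at once,
    # so it parallelizes across the sequence.
    I = sigmoid(X @ Wi)   # input gates, (T, d_h)
    F = sigmoid(X @ Wf)   # forget gates, (T, d_h)
    V = X @ Wv            # token "values", (T, d_h)

    # The recurrence itself touches no parameters: only elementwise
    # ops, i.e. it just reweights each token's contribution over time,
    # which is what links this family of models to self-attention
    # (unrolled, h_t is a weighted sum of the values V_1..V_t).
    h = np.zeros(Wv.shape[1])
    hs = []
    for t in range(X.shape[0]):
        h = F[t] * h + I[t] * V[t]
        hs.append(h)
    return np.stack(hs)
```

Because the loop body contains no matrix multiplications, the sequential portion of the computation is tiny compared to a standard LSTM/GRU cell, which is the efficiency argument the abstract makes.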

Citation (APA)

Zhang, B., & Sennrich, R. (2020). A lightweight recurrent network for sequence modeling. In ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (pp. 1538–1548). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p19-1149


