Knowledge-Aware Leap-LSTM: Integrating Prior Knowledge into Leap-LSTM towards Faster Long Text Classification

Abstract

While widely used in industry, recurrent neural networks (RNNs) are known to have deficiencies in dealing with long sequences (e.g., slow inference, vanishing gradients). Recent research has attempted to accelerate RNN models by developing mechanisms that skip irrelevant words in the input. Due to the lack of labelled data, it remains a challenge to decide which words to skip, especially for low-resource classification tasks. In this paper, we propose Knowledge-Aware Leap-LSTM (KALL), a novel architecture which integrates prior human knowledge (created either manually or automatically), such as in-domain keywords, terminologies, or lexicons, into Leap-LSTM to partially supervise the skipping process. More specifically, we propose a knowledge-oriented cost function for KALL; furthermore, we propose two strategies to integrate the knowledge: (1) the Factored KALL approach uses a keyword indicator as a soft constraint on the skipping process, and (2) the Gated KALL enforces the inclusion of keywords while keeping the network differentiable during training. Experiments on different public datasets show that our approaches are 1.1x to 2.6x faster than LSTM with better accuracy, and 23.6x faster than XLNet in a resource-limited CPU-only environment.
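The abstract gives no implementation details, but the general mechanism it describes can be illustrated with a short sketch. The PyTorch snippet below is a hypothetical reconstruction, not the authors' code: `KeywordAwareSkipLSTM`, `skip_mlp`, `knowledge_loss`, the Gumbel-softmax relaxation, and all dimensions are our assumptions. It shows the three ideas named in the abstract: a per-token skip decision that receives a binary keyword indicator as an extra feature (the factored idea), a hard override so keyword positions are never skipped (the gated idea), and an auxiliary penalty on skipping keywords (one plausible form of a knowledge-oriented cost function).

```python
# Illustrative sketch only -- assumes PyTorch; not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class KeywordAwareSkipLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.cell = nn.LSTMCell(embed_dim, hidden_dim)
        # Skip predictor sees the current word, the running state, and a
        # keyword indicator (the extra +1 input: the "factored" soft constraint).
        self.skip_mlp = nn.Linear(embed_dim + hidden_dim + 1, 2)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens, keyword_mask, tau=1.0):
        # tokens: (batch, seq_len) int64; keyword_mask: (batch, seq_len) in {0, 1}
        batch, seq_len = tokens.shape
        h = tokens.new_zeros(batch, self.cell.hidden_size, dtype=torch.float)
        c = torch.zeros_like(h)
        keep_probs = []
        for t in range(seq_len):
            x = self.embed(tokens[:, t])
            k = keyword_mask[:, t].unsqueeze(1)
            logits = self.skip_mlp(torch.cat([x, h, k], dim=-1))
            # Differentiable binary keep/skip decision via Gumbel-softmax.
            keep = F.gumbel_softmax(logits, tau=tau, hard=True)[:, :1]
            # "Gated" hard constraint: keyword positions are always read.
            keep = torch.clamp(keep + k, max=1.0)
            h_new, c_new = self.cell(x, (h, c))
            h = keep * h_new + (1 - keep) * h  # skipped step: state carried over
            c = keep * c_new + (1 - keep) * c
            keep_probs.append(F.softmax(logits, dim=-1)[:, 0])
        keep_probs = torch.stack(keep_probs, dim=1)  # (batch, seq_len)
        return self.classifier(h), keep_probs

def knowledge_loss(keep_probs, keyword_mask):
    # Hypothetical knowledge-oriented penalty: discourage low keep-probability
    # at keyword positions; added to the usual classification loss.
    eps = 1e-8
    penalty = -(torch.log(keep_probs + eps) * keyword_mask)
    return penalty.sum() / (keyword_mask.sum() + eps)

# Usage sketch: combine classification loss with the keyword-skip penalty.
model = KeywordAwareSkipLSTM(vocab_size=10000)
tokens = torch.randint(0, 10000, (4, 50))
kw = (torch.rand(4, 50) < 0.1).float()
logits, keep_probs = model(tokens, kw)
labels = torch.randint(0, 2, (4,))
loss = F.cross_entropy(logits, labels) + 0.1 * knowledge_loss(keep_probs, kw)
```

The speedup the paper reports would come from replacing the full LSTM update with the cheap skip branch at skipped positions; this sketch keeps both branches for clarity, so it illustrates the gating logic rather than the inference-time acceleration.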

Cite (APA)

Du, J., Huang, Y., & Moilanen, K. (2021). Knowledge-Aware Leap-LSTM: Integrating Prior Knowledge into Leap-LSTM towards Faster Long Text Classification. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 14B, pp. 12768–12775). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i14.17511
