RWTHLM - The RWTH Aachen University neural network language modeling toolkit

Abstract

We present a novel toolkit that implements the long short-term memory (LSTM) neural network concept for language modeling. The main goal is to provide software that is easy to use and that allows fast training of standard recurrent and LSTM neural network language models. The toolkit obtains state-of-the-art performance on the standard Treebank corpus. To reduce the training time, BLAS and related libraries are supported, and multiple word sequences can be evaluated in parallel. In addition, arbitrary word classes can be used to speed up the computation for large vocabularies. Finally, the software allows easy integration with SRILM, and it supports direct decoding and rescoring of HTK lattices. The toolkit is available for download under an open source license.
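For context on the class-based speed-up mentioned in the abstract, the common trick (an illustrative sketch, not taken from the toolkit's own documentation) is to factor the output softmax over the vocabulary V into a softmax over word classes and a softmax over the words within the predicted class:

P(w | h) = P(c(w) | h) · P(w | c(w), h)

where c(w) is the class assigned to word w and h is the history encoded by the recurrent network. With roughly sqrt(|V|) classes of comparable size, the per-word output computation drops from order |V| to order sqrt(|V|), which is what makes large vocabularies tractable.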

Citation (APA)

Sundermeyer, M., Schlüter, R., & Ney, H. (2014). RWTHLM - The RWTH Aachen University neural network language modeling toolkit. In Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH (pp. 2093–2097). International Speech Communication Association. https://doi.org/10.21437/interspeech.2014-475
