Recurrent Neural Networks: Modeling Sequences and Stacks

  • Goldberg, Y.

Abstract

When dealing with language data, it is very common to work with sequences, such as words (sequences of letters), sentences (sequences of words), and documents. We saw how feed-forward networks can accommodate arbitrary feature functions over sequences through the use...
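The chapter's topic is recurrent networks that fold a sequence into a fixed-size state. As a minimal sketch of the idea (not the book's code; all names, dimensions, and the tanh nonlinearity here are illustrative assumptions), an Elman-style RNN applies the same step function at every position, carrying a hidden state forward:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # One recurrence step: mix the current input with the previous state.
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

def rnn_encode(xs, W_xh, W_hh, b_h):
    # Fold a whole sequence of input vectors into one fixed-size state.
    h = np.zeros(W_hh.shape[0])
    for x_t in xs:
        h = rnn_step(x_t, h, W_xh, W_hh, b_h)
    return h

# Toy dimensions and random parameters, purely for illustration.
rng = np.random.default_rng(0)
d_in, d_hid = 4, 3
W_xh = rng.normal(scale=0.1, size=(d_in, d_hid))
W_hh = rng.normal(scale=0.1, size=(d_hid, d_hid))
b_h = np.zeros(d_hid)

seq = rng.normal(size=(5, d_in))      # a sequence of 5 input vectors
h_final = rnn_encode(seq, W_xh, W_hh, b_h)
print(h_final.shape)                  # fixed-size summary: (3,)
```

Note that the output size is independent of the sequence length, which is what lets the same encoder handle words, sentences, or documents of arbitrary length.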

Citation (APA)
Goldberg, Y. (2017). Recurrent Neural Networks: Modeling Sequences and Stacks (pp. 163–175). https://doi.org/10.1007/978-3-031-02165-7_14
