When dealing with language data, we frequently work with sequences, such as words (sequences of letters), sentences (sequences of words), and documents (sequences of sentences). We saw how feed-forward networks can accommodate arbitrary feature functions over sequences through the use...
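To make the idea of processing a sequence concrete, here is a minimal sketch of a recurrent step applied to a sequence of input vectors; the dimensions, weight names, and use of NumPy are illustrative assumptions, not the chapter's notation.

```python
import numpy as np

# Illustrative Elman-style RNN cell (assumed names and dimensions).
rng = np.random.default_rng(0)
d_in, d_hid = 4, 3                      # input and hidden dimensions
W_x = rng.normal(size=(d_hid, d_in))    # input-to-hidden weights
W_h = rng.normal(size=(d_hid, d_hid))   # hidden-to-hidden weights
b = np.zeros(d_hid)

def rnn_step(h_prev, x):
    """One recurrent step: combine the previous state with the current input."""
    return np.tanh(W_h @ h_prev + W_x @ x + b)

# Encode a sequence of 5 input vectors into a single fixed-size state.
sequence = rng.normal(size=(5, d_in))
h = np.zeros(d_hid)
for x in sequence:
    h = rnn_step(h, x)

print(h.shape)  # the final state has shape (d_hid,)
```

The point of the sketch is that the same parameters are reused at every position, so a sequence of arbitrary length is compressed into one fixed-size vector, in contrast to fixed-window feature functions in feed-forward networks.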
Goldberg, Y. (2017). Recurrent Neural Networks: Modeling Sequences and Stacks (pp. 163–175). https://doi.org/10.1007/978-3-031-02165-7_14