Learning PyTorch Through A Neural Dependency Parsing Exercise

Abstract

Dependency parsing has become an increasingly popular parsing formalism in practice. This assignment provides a practice exercise in implementing the shift-reduce dependency parser of Chen and Manning (2014). The parser is a two-layer feed-forward neural network, which students implement in PyTorch, giving them practice in building deep learning models and exposure to implementing parsers.
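The architecture described above can be sketched in PyTorch as follows. This is a minimal illustration of a Chen and Manning (2014)-style transition classifier, not the assignment's exact model; all sizes and hyperparameters here are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ParserModel(nn.Module):
    """Sketch of a two-layer feed-forward transition classifier.

    Feature indices extracted from the parser configuration are
    embedded, concatenated, passed through one hidden layer, and
    mapped to logits over the transitions (shift, left-arc,
    right-arc). All dimensions below are assumed for illustration.
    """

    def __init__(self, vocab_size=1000, embed_dim=50,
                 n_features=36, hidden_dim=200, n_transitions=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.hidden = nn.Linear(n_features * embed_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, n_transitions)

    def forward(self, feature_ids):
        # feature_ids: (batch, n_features) integer feature indices
        x = self.embed(feature_ids).view(feature_ids.size(0), -1)
        # ReLU here for simplicity; the original paper uses a
        # cubic activation on the hidden layer.
        h = torch.relu(self.hidden(x))
        return self.out(h)  # logits over transitions

model = ParserModel()
logits = model(torch.randint(0, 1000, (4, 36)))
print(tuple(logits.shape))  # (4, 3)
```

At parse time, the highest-scoring legal transition would be applied to the configuration and the process repeated until the buffer is empty.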

Citation (APA)

Jurgens, D. (2021). Learning PyTorch Through A Neural Dependency Parsing Exercise. In Teaching NLP 2021 - Proceedings of the 5th Workshop on Teaching Natural Language Processing (pp. 62–64). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.teachingnlp-1.10
