Character sequence-to-sequence model with global attention for universal morphological reinflection

Abstract

This paper presents a neural-network-based approach to the CoNLL-SIGMORPHON 2017 Shared Task 1 on universal morphological reinflection. We propose an encoder-decoder architecture to model the reinflection problem: each character of the input word is encoded by a bidirectional Gated Recurrent Unit (GRU) network, and another GRU network serves as the decoder, generating the inflected form while attending globally over the encoder states. We participate in Task 1, which covers 52 languages, without using any external resources. For each language, three training sets are provided (high, medium, and low, denoting the amount of training data; Scottish Gaelic has only medium and low), for a total of 155 training sets. Due to time constraints, we tune the model's hyperparameters only on the Albanian dataset. The source code of our model is available at https://github.com/valdersoul/conll2017.
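The architecture described above is a fairly standard attentional character sequence-to-sequence setup. The PyTorch sketch below is a minimal illustration only: the embedding and hidden sizes, the class names, and the Luong-style "general" attention score are assumptions not stated in the abstract; the authors' actual implementation should be taken from the linked repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CharEncoder(nn.Module):
    """Bidirectional GRU over the characters of the input word."""
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):  # sizes are illustrative
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.gru = nn.GRU(emb_dim, hid_dim, bidirectional=True, batch_first=True)

    def forward(self, chars):                    # chars: (batch, src_len)
        states, h = self.gru(self.embed(chars))  # states: (batch, src_len, 2*hid_dim)
        # Join the final forward/backward states to initialize the decoder.
        return states, torch.cat([h[0], h[1]], dim=-1)

class AttnDecoder(nn.Module):
    """GRU decoder with global attention (Luong-style 'general' score, an assumption)
    over all encoder states."""
    def __init__(self, vocab_size, emb_dim=64, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.cell = nn.GRUCell(emb_dim, hid_dim)
        self.score = nn.Linear(hid_dim, hid_dim, bias=False)
        self.out = nn.Linear(2 * hid_dim, vocab_size)

    def step(self, y_prev, h, enc_states):
        # y_prev: (batch,) previous output character; h: (batch, hid_dim)
        h = self.cell(self.embed(y_prev), h)
        # Global attention: score every encoder position, softmax, weighted sum.
        scores = torch.bmm(enc_states, self.score(h).unsqueeze(2)).squeeze(2)
        context = torch.bmm(F.softmax(scores, dim=1).unsqueeze(1), enc_states).squeeze(1)
        return self.out(torch.cat([h, context], dim=-1)), h

# Smoke test with a hypothetical 40-character vocabulary.
enc, dec = CharEncoder(40), AttnDecoder(40)
states, h = enc(torch.randint(40, (2, 7)))            # batch of 2 words, length 7
logits, h = dec.step(torch.tensor([1, 1]), h, states)
print(logits.shape)                                   # torch.Size([2, 40])
```

Note that in the shared task, generation is also conditioned on the target morphological feature tags; a common choice is to append the tags to the input character sequence, though the authors' exact treatment is best checked against the repository above.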

Citation (APA)

Zhu, Q., Li, Y., & Li, X. (2017). Character sequence-to-sequence model with global attention for universal morphological reinflection. In Proceedings of the CoNLL SIGMORPHON 2017 Shared Task: Universal Morphological Reinflection (pp. 85–89). Association for Computational Linguistics. https://doi.org/10.18653/v1/K17-2009
