Abstract
This paper presents experiments on morphological inflection using data from the SIGMORPHON-UniMorph 2022 Shared Task 0: Generalization and Typologically Diverse Morphological Inflection. We present a transformer-based inflection system that enriches the standard transformer architecture with reverse positional encoding and type embeddings. We further apply data hallucination and lemma copying to augment the training data. We train models using a two-stage procedure: (1) we first train on the augmented training data using standard backpropagation and teacher forcing; (2) we then continue training with a variant of the scheduled sampling algorithm dubbed student forcing. Our system delivers competitive performance under both the small and large data conditions on the shared task datasets.
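The student forcing stage described above is a variant of scheduled sampling: during training, the decoder is sometimes conditioned on its own previous prediction rather than the gold token, narrowing the gap between training and inference. The sketch below illustrates the core idea in plain Python; the function names, the `student_prob` parameter, and the string-token toy model are illustrative assumptions, not the paper's actual implementation.

```python
import random


def decode_with_student_forcing(model_step, gold_tokens, student_prob, rng=None):
    """Greedy decoding where, with probability `student_prob`, the decoder
    is fed its own previous prediction (student forcing) instead of the
    gold token (teacher forcing).

    `model_step(prev_token)` stands in for one decoder step and returns
    the predicted next token. With student_prob=0.0 this reduces to pure
    teacher forcing; with student_prob=1.0 it is pure student forcing.
    """
    rng = rng or random.Random(0)
    prev = "<bos>"
    predictions = []
    for gold in gold_tokens:
        pred = model_step(prev)
        predictions.append(pred)
        # Student forcing: with some probability, condition the next step
        # on the model's own output instead of the gold token.
        prev = pred if rng.random() < student_prob else gold
    return predictions
```

In a scheduled-sampling setup, `student_prob` would typically be annealed upward over training so that early epochs resemble teacher forcing and later epochs resemble inference-time decoding.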
Citation
Yang, C., Yang, R., Nicolai, G., & Silfverberg, M. (2022). Generalizing Morphological Inflection Systems to Unseen Lemmas. In SIGMORPHON 2022 - 19th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology, Proceedings of the Workshop (pp. 226–235). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.sigmorphon-1.23