We present a state-of-the-art neural approach to the unsupervised reconstruction of ancient word forms. Previous work in this domain used expectation-maximization to predict simple phonological changes between ancient word forms and their cognates in modern languages. We extend this work with neural models that can capture more complicated phonological and morphological changes. At the same time, we preserve the inductive biases from classical methods by building monotonic alignment constraints into the model and deliberately underfitting during the maximization step. We evaluate our performance on the task of reconstructing Latin from a dataset of cognates across five Romance languages, achieving a notable reduction in edit distance from the target word forms compared to previous methods.
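The evaluation metric mentioned above, edit distance between predicted and target word forms, is the standard Levenshtein distance. As a minimal illustrative sketch (the example word pair is hypothetical, not from the paper's dataset), it can be computed with dynamic programming:

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of insertions, deletions, and substitutions
    needed to turn string a into string b."""
    # prev[j] holds the edit distance between a[:i-1] and b[:j]
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]  # distance from a[:i] to the empty prefix of b
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,              # delete ca
                curr[j - 1] + 1,          # insert cb
                prev[j - 1] + (ca != cb)  # substitute (free if equal)
            ))
        prev = curr
    return prev[-1]

# Hypothetical example: distance between a predicted and a gold Latin form
print(levenshtein("kitten", "sitting"))  # → 3
```

A model's reconstruction quality is then typically reported as the average of this distance over all held-out target forms, so lower is better.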
He, A., Tomlin, N., & Klein, D. (2023). Neural Unsupervised Reconstruction of Protolanguage Word Forms. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 1636–1649). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.91