Neural Decipherment via Minimum-Cost Flow: From Ugaritic to Linear B

Abstract

In this paper we propose a novel neural approach for the automatic decipherment of lost languages. To compensate for the lack of a strong supervision signal, our model design is informed by patterns of language change documented in historical linguistics. The model uses an expressive sequence-to-sequence architecture to capture character-level correspondences between cognates. To train the model effectively in an unsupervised manner, we formalize the training procedure as a minimum-cost flow problem. When applied to the decipherment of Ugaritic, we achieve a 5.5% absolute improvement over state-of-the-art results. We also report the first automatic results in deciphering Linear B, a syllabic language related to ancient Greek, where our model correctly translates 67.3% of cognates.
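The core idea of the flow formulation can be illustrated in miniature: if each lost-language word must be paired with at most one known-language word, and each candidate pair has a cost (in the paper, derived from the neural model's edit scores), then cognate identification becomes a minimum-cost matching — a special case of min-cost flow on a bipartite graph with unit capacities. The sketch below is not the paper's method; it uses plain Levenshtein distance as a stand-in cost and brute-force search in place of a flow solver, on hypothetical romanized tokens.

```python
from itertools import permutations

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def min_cost_matching(lost, known):
    """Brute-force minimum-cost one-to-one assignment.

    A stand-in for the min-cost flow solver: each lost word is a
    source-side node, each known word a sink-side node, and edge
    costs are the pairwise distances.
    """
    best_cost, best = float("inf"), None
    for perm in permutations(range(len(known)), len(lost)):
        cost = sum(edit_distance(lost[i], known[j])
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best_cost = cost
            best = [(lost[i], known[j]) for i, j in enumerate(perm)]
    return best_cost, best

# Hypothetical toy tokens, not real Ugaritic or Greek data.
lost = ["abdi", "sarru", "ilu"]
known = ["ilos", "abdos", "sarros"]
cost, pairs = min_cost_matching(lost, known)
```

The matching recovers the intended pairings (e.g. "ilu" with "ilos") because the global assignment minimizes total cost, which is exactly the constraint the flow formulation enforces at scale with a polynomial-time solver instead of this exponential enumeration.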

Citation (APA)
Luo, J., Cao, Y., & Barzilay, R. (2019). Neural decipherment via minimum-cost flow: From Ugaritic to Linear B. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (pp. 3146–3155). Association for Computational Linguistics. https://doi.org/10.18653/v1/p19-1303
