The observed pronunciations or spellings of words are often explained as arising from the “underlying forms” of their morphemes. These forms are latent strings that linguists try to reconstruct by hand. We propose to reconstruct them automatically at scale, enabling generalization to new words. Given some surface word types of a concatenative language along with the abstract morpheme sequences that they express, we show how to recover consistent underlying forms for these morphemes, together with the (stochastic) phonology that maps each concatenation of underlying forms to a surface form. Our technique involves loopy belief propagation in a natural directed graphical model whose variables are unknown strings and whose conditional distributions are encoded as finite-state machines with trainable weights. We define training and evaluation paradigms for the task of surface word prediction, and report results on subsets of 7 languages.
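To make the generative story concrete, here is a minimal toy sketch (not the authors' implementation): each abstract morpheme has a latent underlying form, a word's underlying forms are concatenated, and a stochastic phonology rewrites the concatenation into the surface form. The tiny lexicon, the single devoicing rule, and its probability below are invented for illustration; the paper instead encodes the phonology as weighted finite-state machines and infers the latent strings with loopy belief propagation, which this sketch omits.

```python
import random

# Hypothetical underlying forms for a few abstract morphemes (illustrative only).
UNDERLYING = {
    "dog": "dog",
    "cat": "kat",
    "PL": "z",   # plural suffix, underlyingly /z/
}

def phonology(underlying: str, rng: random.Random) -> str:
    """Toy stochastic phonology: devoice /z/ after a voiceless consonant
    with probability 0.9 (a stand-in for a trainable rule weight)."""
    voiceless = set("ptk")
    out = []
    for i, ch in enumerate(underlying):
        if ch == "z" and i > 0 and underlying[i - 1] in voiceless:
            out.append("s" if rng.random() < 0.9 else "z")
        else:
            out.append(ch)
    return "".join(out)

def surface(morphemes: list[str], rng: random.Random) -> str:
    """Concatenate the underlying forms of the morpheme sequence,
    then apply the stochastic phonology to get a surface form."""
    u = "".join(UNDERLYING[m] for m in morphemes)
    return phonology(u, rng)

if __name__ == "__main__":
    rng = random.Random(0)
    print(surface(["dog", "PL"], rng))  # -> "dogz"
    print(surface(["cat", "PL"], rng))  # -> usually "kats"
```

In the paper's setting the underlying forms and rule weights are not given as above but are the unknowns to be recovered from observed surface words and their morpheme sequences.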
Cotterell, R., Peng, N., & Eisner, J. (2015). Modeling Word Forms Using Latent Underlying Morphs and Phonology. Transactions of the Association for Computational Linguistics, 3, 433–447. https://doi.org/10.1162/tacl_a_00149