This study presents new experiments on Zyrian Komi speech recognition. We use DeepSpeech to train ASR models from a language documentation corpus that contains both contemporary and archival recordings. Earlier studies have shown that transfer learning from English and using a domain-matched Komi language model both improve CER and WER. In this study we experiment with transfer learning from a more relevant source language, Russian, and with including Russian text in the language model construction. The motivation is that Russian and Komi are contemporary contact languages, and Russian is regularly present in the corpus. We found that despite the close contact between Russian and Komi, the larger English speech corpus yielded better performance when used as the source language. Additionally, we can report that simply updating the DeepSpeech version improved the CER by 3.9% over the earlier studies, which is an important step in the development of Komi ASR.
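As an illustrative aside (not part of the paper itself): the character error rate (CER) reported above is conventionally defined as the Levenshtein edit distance between the system output and the reference transcript, divided by the number of characters in the reference. A minimal sketch of that computation, with hypothetical example strings:

```python
def edit_distance(ref: str, hyp: str) -> int:
    """Levenshtein distance: minimum insertions, deletions, and
    substitutions needed to turn hyp into ref."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (r != h)))  # substitution
        prev = cur
    return prev[-1]

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character edits per reference character."""
    return edit_distance(reference, hypothesis) / len(reference)

# Hypothetical example: one substitution in a four-character word.
print(cer("komi", "kami"))  # → 0.25
```

WER is computed the same way over word tokens rather than characters. Note this sketch is for illustration only; the paper's numbers come from DeepSpeech's own evaluation pipeline.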
Citation:
Hjortnæs, N., Partanen, N., Rießler, M., & Tyers, F. M. (2021). The Relevance of the Source Language in Transfer Learning for ASR. Proceedings of the Workshop on Computational Methods for Endangered Languages, 1(2). https://doi.org/10.33011/computel.v1i.959