Shared-hidden-layer deep neural network for under-resourced language

Citations: 6
Readers: 8 (Mendeley)

Abstract

Training a speech recognizer with under-resourced language data remains difficult. Indonesian is considered under-resourced because it lacks a standard speech corpus, text corpus, and dictionary. This research analyzed the efficacy of augmenting limited Indonesian speech training data with training data from a highly resourced language, such as English, to train an Indonesian speech recognizer. The training took the form of shared-hidden-layer deep neural network (SHL-DNN) training. An SHL-DNN has language-independent hidden layers and can be pre-trained and trained on multilingual data in the same way as a monolingual deep neural network. The SHL-DNN trained on Indonesian and English speech data proved effective at reducing the word error rate (WER) when decoding Indonesian dictated speech, achieving a 3.82% absolute reduction compared to a monolingual Indonesian hidden Markov model with Gaussian mixture model emissions (GMM-HMM). This result was confirmed when the SHL-DNN was also used to decode Indonesian spontaneous speech, where it achieved a 4.19% absolute WER reduction.
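
To make the architecture concrete, the sketch below (PyTorch, not the authors' implementation) shows a shared-hidden-layer DNN acoustic model: every hidden layer is shared across languages, and each language gets its own softmax output layer over its senones. The input dimension, layer sizes, sigmoid activations, senone counts, and language keys ("id", "en") are illustrative assumptions, not values from the paper.

import torch
import torch.nn as nn

class SharedHiddenLayerDNN(nn.Module):
    def __init__(self, input_dim, hidden_dim, num_hidden, senones_per_lang):
        super().__init__()
        # Language-independent stack: shared affine + sigmoid hidden layers.
        layers, dim = [], input_dim
        for _ in range(num_hidden):
            layers += [nn.Linear(dim, hidden_dim), nn.Sigmoid()]
            dim = hidden_dim
        self.shared = nn.Sequential(*layers)
        # One language-specific output layer per language; these produce
        # logits, so apply log-softmax / cross-entropy in the loss.
        self.heads = nn.ModuleDict(
            {lang: nn.Linear(hidden_dim, n) for lang, n in senones_per_lang.items()}
        )

    def forward(self, x, lang):
        # Frames from any language pass through the same hidden layers;
        # only the output layer is switched per language.
        return self.heads[lang](self.shared(x))

# Hypothetical instantiation: 440-dim spliced acoustic features, five
# 2048-unit hidden layers, and assumed senone counts per language.
model = SharedHiddenLayerDNN(440, 2048, 5, {"id": 3000, "en": 4000})
logits = model(torch.randn(8, 440), lang="id")  # a batch of Indonesian frames

Because the shared hidden layers receive gradients from both languages' training frames, the highly resourced language helps train the feature transform that the Indonesian output layer relies on, which is the intuition behind the WER reductions reported above.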

Cite

APA

Hoesen, D., Lestari, D. P., & Widyantoro, D. H. (2018). Shared-hidden-layer deep neural network for under-resourced language. Telkomnika (Telecommunication Computing Electronics and Control), 16(3), 1226–1238. https://doi.org/10.12928/TELKOMNIKA.v16i3.7984
