Using counterexamples, we show that vocabulary size and static and dynamic branching factors are all inadequate as measures of the speech recognition complexity of finite-state grammars. Information-theoretic arguments show that perplexity (the logarithm of which is the familiar entropy) is a more appropriate measure of equivalent choice. It too has certain weaknesses, which we discuss. We show that perplexity can also be applied to languages having no obvious statistical description, since an entropy-maximizing probability assignment can be found for any finite-state grammar. Table I shows perplexity values for some well-known speech recognition tasks.

Table I. Perplexity values for some well-known speech recognition tasks.

Task           Phone perplexity   Word perplexity   Vocabulary size   Dynamic branching factor
IBM-Lasers     2.14               21.11             1000              1000
IBM-Raleigh    1.69               7.74              250               7.32
CMU-AIX05      1.52               6.41              1011              35
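The two central claims above can be illustrated concretely. Perplexity is the exponential of entropy (here, 2 raised to the entropy in bits), and for a finite-state grammar an entropy-maximizing probability assignment exists: by a classical result of Shannon, the maximal entropy rate of paths through a strongly connected transition graph with adjacency matrix A is log2(lam), where lam is the largest eigenvalue of A, so the max-entropy perplexity is lam itself. The following sketch is not from the paper; the function names and the two-state example grammar are illustrative assumptions.

```python
import math

def perplexity(probs):
    # Perplexity = 2**H, where H is the Shannon entropy in bits of the
    # next-symbol distribution: the "equivalent choice" among symbols.
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    return 2.0 ** h

def max_entropy_perplexity(adj, iters=200):
    # Max-entropy perplexity of a finite-state grammar = largest
    # eigenvalue of its 0/1 transition (adjacency) matrix, found here
    # by power iteration (assumes the graph is strongly connected).
    n = len(adj)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)              # converges to the Perron eigenvalue
        v = [x / lam for x in w]  # renormalize so max(v) == 1
    return lam

# Uniform choice among 4 symbols: entropy = 2 bits, perplexity = 4.
print(perplexity([0.25] * 4))  # 4.0

# Hypothetical two-state grammar ("golden-mean" constraint): state 0
# may go to state 0 or 1, state 1 only back to state 0. Its max-entropy
# perplexity is the golden ratio, about 1.618.
print(max_entropy_perplexity([[1, 1], [1, 0]]))
```

The perplexity of 4.0 for four equiprobable symbols matches the intuition of an "equivalent branching factor": the grammar offers, on average, a choice among four alternatives.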
Jelinek, F., Mercer, R. L., Bahl, L. R., & Baker, J. K. (1977). Perplexity—a measure of the difficulty of speech recognition tasks. The Journal of the Acoustical Society of America, 62(S1), S63–S63. https://doi.org/10.1121/1.2016299