Abstract
Continuous-space translation models have recently emerged as a powerful way to boost the performance of existing translation systems. A simple yet effective way to integrate such models at inference time is to use them in an N-best rescoring step. In this paper, we focus on this scenario and show that the performance gains from rescoring can be greatly increased when the neural network is trained jointly with all the other model parameters, using an appropriate objective function. Our approach is validated on two domains, where it outperforms strong baselines.
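To make the rescoring setting concrete, the sketch below shows one common way such a step can be organized: each hypothesis in the N-best list carries the baseline system's feature scores, a continuous-space model contributes one additional score, and the list is re-ranked by a weighted sum. This is an illustrative assumption about the setup, not the authors' implementation; the class names, weights, and the dummy `neural_score` function are hypothetical.

```python
# Minimal illustrative sketch of N-best rescoring with an extra neural score.
# All names and weights are assumptions for illustration only.

from dataclasses import dataclass
from typing import List, Sequence


@dataclass
class Hypothesis:
    tokens: List[str]              # candidate translation
    baseline_scores: List[float]   # e.g. translation model, language model scores


def neural_score(tokens: Sequence[str]) -> float:
    """Stand-in for the continuous-space model's log-probability of the sentence."""
    return -0.1 * len(tokens)      # dummy value; a real model would score the hypothesis


def rescore_nbest(nbest: List[Hypothesis],
                  weights: List[float],
                  neural_weight: float) -> List[Hypothesis]:
    """Re-rank an N-best list by the weighted sum of baseline and neural scores."""
    def total(h: Hypothesis) -> float:
        base = sum(w * s for w, s in zip(weights, h.baseline_scores))
        return base + neural_weight * neural_score(h.tokens)
    return sorted(nbest, key=total, reverse=True)


if __name__ == "__main__":
    nbest = [
        Hypothesis(["the", "cat", "sits"], [-2.1, -3.0]),
        Hypothesis(["the", "cat", "is", "sitting"], [-2.4, -2.6]),
    ]
    for h in rescore_nbest(nbest, weights=[1.0, 0.5], neural_weight=0.8):
        print(" ".join(h.tokens))
```

The paper's contribution concerns how the weights and the neural network itself are trained (jointly, with a discriminative objective), rather than the rescoring mechanics sketched here.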
Citation
Do, Q. K., Allauzen, A., & Yvon, F. (2015). A discriminative training procedure for continuous translation models. In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP) (pp. 1046–1052). Association for Computational Linguistics. https://doi.org/10.18653/v1/d15-1121