This paper formalizes a sound extension of dynamic oracles to global training, in the context of transition-based dependency parsing. By dispensing with the precomputation of references, this extension widens the range of training strategies available for such parsers; we demonstrate this by revisiting two standard training procedures, early-update and max-violation, to correct some of their search-space sampling biases. Experimentally, on the SPMRL treebanks, this correction increases the similarity between the training and test distributions and yields performance improvements of up to 0.7 UAS points, without any computational overhead.
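To make the two training procedures named above concrete, here is a minimal, hypothetical sketch of how early-update and max-violation choose the update point in structured perceptron training; the score sequences and function names are illustrative, not taken from the paper.

```python
# Hedged sketch: choosing the update point in beam-search perceptron training.
# gold_scores[t] / pred_scores[t] are assumed per-step model scores of the
# gold derivation prefix and the best predicted prefix (illustrative values).

def early_update_step(gold_scores, pred_scores):
    """Early-update: stop at the first step where the predicted prefix
    outscores the gold prefix (i.e., gold falls off the beam)."""
    for t, (g, p) in enumerate(zip(gold_scores, pred_scores)):
        if p > g:
            return t
    return None  # gold survived the whole derivation: no update

def max_violation_step(gold_scores, pred_scores):
    """Max-violation: update at the step where the margin violation
    (predicted score minus gold score) is largest, if positive."""
    violations = [p - g for g, p in zip(gold_scores, pred_scores)]
    t = max(range(len(violations)), key=violations.__getitem__)
    return t if violations[t] > 0 else None

# Toy example: gold drops behind at step 1, worst violation is at step 2.
gold = [1.0, 2.0, 1.5, 3.0]
pred = [0.5, 2.5, 4.0, 3.5]
print(early_update_step(gold, pred))   # 0-based step index of first violation
print(max_violation_step(gold, pred))  # step index of the largest violation
```

Both strategies update on a prefix rather than the full derivation; the paper's contribution concerns the biases this prefix sampling introduces, which global dynamic oracles let one correct.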
CITATION STYLE
Aufrant, L., Wisniewski, G., & Yvon, F. (2017). Don’t Stop Me Now! Using global dynamic oracles to correct training biases of transition-based dependency parsers. In 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Proceedings of Conference (Vol. 2, pp. 318–323). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/e17-2051