We consider retrofitting a structure-aware Transformer language model to facilitate end tasks, proposing to exploit syntactic distance to encode both phrasal constituency and dependency connections into the language model. A middle-layer structural learning strategy is leveraged for structure integration, carried out jointly with the main semantic task training under a multi-task learning scheme. Experimental results show that the retrofitted structure-aware Transformer language model achieves improved perplexity while inducing accurate syntactic phrases. By performing structure-aware fine-tuning, our model achieves significant improvements on both semantically and syntactically dependent tasks.
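To make the multi-task idea concrete, the following is a minimal sketch, not the authors' released implementation: a small Transformer LM whose middle-layer hidden states also feed a syntactic-distance head, trained jointly with the next-token objective. The tapped layer index, the shape of the distance head, and the loss weight `alpha` are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (assumptions noted above): a Transformer LM with an auxiliary
# syntactic-distance head attached to a middle layer, trained under a
# multi-task loss combining language modeling and distance regression.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StructureAwareLM(nn.Module):
    def __init__(self, vocab_size, d_model=256, n_heads=4, n_layers=6, struct_layer=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
            for _ in range(n_layers)
        )
        self.struct_layer = struct_layer        # middle layer tapped for structure learning
        self.dist_head = nn.Linear(d_model, 1)  # syntactic distance for each adjacent token pair
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        T = tokens.size(1)
        causal = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        h = self.embed(tokens)
        dist_pred = None
        for i, layer in enumerate(self.layers):
            h = layer(h, src_mask=causal)
            if i == self.struct_layer:
                # distance between token t and t+1, read off the middle layer
                dist_pred = self.dist_head(h[:, :-1]).squeeze(-1)
        return self.lm_head(h), dist_pred


def joint_loss(logits, dist_pred, tokens, gold_dist, alpha=0.5):
    """Multi-task objective: next-token cross-entropy plus weighted distance regression."""
    lm = F.cross_entropy(logits[:, :-1].reshape(-1, logits.size(-1)),
                         tokens[:, 1:].reshape(-1))
    struct = F.mse_loss(dist_pred, gold_dist)   # gold_dist: (batch, seq_len - 1)
    return lm + alpha * struct
```

Attaching the structural head to a middle layer rather than the top mirrors the middle-layer strategy described in the abstract: lower layers absorb the syntactic signal while upper layers stay dedicated to the semantic language-modeling objective, and the same auxiliary term can be kept during structure-aware fine-tuning.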
Fei, H., Ren, Y., & Ji, D. (2020). Retrofitting structure-aware Transformer language model for end tasks. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP) (pp. 2151–2161). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.emnlp-main.168