Low-resource language translation is a challenging but socially valuable NLP task. Building on recent work adapting the Transformer's normalization to this setting, we propose QKNorm, a normalization technique that modifies the attention mechanism to make the softmax function less prone to arbitrary saturation without sacrificing expressivity. Specifically, we apply ℓ2 normalization along the head dimension of each query and key matrix prior to multiplying them and then scale up by a learnable parameter instead of dividing by the square root of the embedding dimension. We show improvements averaging 0.928 BLEU over state-of-the-art bilingual benchmarks for 5 low-resource translation pairs from the TED Talks corpus and IWSLT'15.
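The core idea can be illustrated with a short sketch. The following is a minimal PyTorch illustration of the normalization described above, not the authors' implementation; the function name, tensor shapes, and the initialization of the learnable scale are assumptions made for the example.

```python
# Minimal sketch of QKNorm-style attention (illustrative, not the authors' code):
# queries and keys are L2-normalized along the head dimension, and their dot
# product is rescaled by a learnable scalar g instead of dividing by sqrt(d_k).
import torch
import torch.nn.functional as F


def qknorm_attention(q, k, v, g):
    """q, k, v: (batch, heads, seq_len, head_dim); g: learnable scalar."""
    q = F.normalize(q, p=2, dim=-1)   # unit-norm queries along head_dim
    k = F.normalize(k, p=2, dim=-1)   # unit-norm keys along head_dim
    scores = g * torch.matmul(q, k.transpose(-2, -1))  # scaled cosine similarities
    attn = torch.softmax(scores, dim=-1)
    return torch.matmul(attn, v)


# Usage example (shapes and the initial value of g are arbitrary choices here):
q = torch.randn(2, 8, 16, 64)
k = torch.randn(2, 8, 16, 64)
v = torch.randn(2, 8, 16, 64)
g = torch.nn.Parameter(torch.tensor(10.0))
out = qknorm_attention(q, k, v, g)  # -> (2, 8, 16, 64)
```

Because the normalized dot products are bounded cosine similarities, the pre-softmax logits cannot grow arbitrarily large, while the learnable scale g lets the model recover whatever sharpness the softmax needs.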
CITATION STYLE
Henry, A., Dachapally, P. R., Pawar, S., & Chen, Y. (2020). Query-key normalization for transformers. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 4246–4253). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.findings-emnlp.379