More Parameters? No Thanks!

Abstract

This work studies the long-standing problems of model capacity and negative interference in multilingual neural machine translation (MNMT). Using network pruning techniques, we observe that pruning 50–70% of the parameters from a trained MNMT model results in only a 0.29–1.98 drop in BLEU score, suggesting that MNMT models contain large redundancies. These observations motivate us to repurpose the redundant parameters to counter the interference problem efficiently. We propose a novel adaptation strategy in which we iteratively prune and retrain the redundant parameters of an MNMT model to improve bilingual representations while retaining multilinguality. Negative interference severely affects high-resource languages, and our method alleviates it without any additional adapter modules; hence we call it a parameter-free adaptation strategy, paving the way for efficient adaptation of MNMT. We demonstrate the effectiveness of our method on a 9-language MNMT model trained on TED talks, reporting an average improvement of +1.36 BLEU on high-resource pairs. Code will be released here.
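The prune-and-retrain idea in the abstract can be illustrated with unstructured magnitude pruning, a common pruning criterion for this kind of redundancy analysis. The helper below is a hypothetical sketch for illustration, not the authors' released code; the paper's exact pruning criterion and schedule may differ.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float):
    """Zero out the smallest-magnitude `sparsity` fraction of weights.

    Returns the pruned weights and a boolean mask of surviving entries.
    (Illustrative assumption; not the paper's implementation.)
    """
    flat = np.abs(weights).ravel()
    k = int(flat.size * sparsity)
    if k == 0:
        return weights.copy(), np.ones(weights.shape, dtype=bool)
    # The k-th smallest magnitude becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

# Example: prune half of a toy weight matrix.
w = np.array([[1.0, -0.1],
              [0.05, 2.0]])
pruned, mask = magnitude_prune(w, sparsity=0.5)
# The two smallest-magnitude entries (0.05 and -0.1) are zeroed;
# pruned == [[1.0, 0.0], [0.0, 2.0]]
```

In an iterative scheme like the one described, the surviving (masked-in) parameters would then be retrained on a target language pair while the mask keeps pruned entries at zero, and the cycle repeats.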

Citation (APA)

Khan, Z., Akella, K., Namboodiri, V. P., & Jawahar, C. V. (2021). More Parameters? No Thanks! In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 96–102). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.9
