In settings with related prediction tasks, integrated multi-task learning models can often improve performance relative to independent single-task models. However, even when average task performance improves, individual tasks may experience negative transfer, in which the multi-task model's predictions are worse than the single-task model's. We show the prevalence of negative transfer in a computational chemistry case study with 128 tasks and introduce a framework that provides a foundation for reducing negative transfer in multi-task models. Our Loss-Balanced Task Weighting approach dynamically updates task weights during model training to control the influence of individual tasks.
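To make the approach concrete, the sketch below shows one plausible loss-balanced weighting scheme in PyTorch: each task's weight is the ratio of its current loss to its loss on the first batch of the epoch, raised to a power alpha, so tasks whose losses shrink fastest are down-weighted. The `lbtw_train_epoch` function, the binary cross-entropy loss, and the first-batch baseline are illustrative assumptions; the abstract itself only states that task weights are updated dynamically during training.

```python
# A minimal sketch of loss-balanced task weighting (an illustration,
# not necessarily the authors' exact implementation).
import torch
import torch.nn as nn

def lbtw_train_epoch(model, loader, optimizer, num_tasks, alpha=0.5):
    """Run one epoch, weighting task i by (L_i / L_i_first_batch) ** alpha."""
    loss_fn = nn.BCEWithLogitsLoss()
    initial_losses = None  # per-task losses recorded on the first batch
    for inputs, targets in loader:   # targets: (batch_size, num_tasks)
        outputs = model(inputs)      # outputs: (batch_size, num_tasks)
        task_losses = torch.stack(
            [loss_fn(outputs[:, i], targets[:, i]) for i in range(num_tasks)]
        )
        if initial_losses is None:
            initial_losses = task_losses.detach()  # baseline for the epoch
        # Weights are detached so gradients flow only through the losses;
        # tasks that have already improved a lot get smaller weights.
        weights = (task_losses.detach() / initial_losses).pow(alpha)
        total_loss = (weights * task_losses).sum()
        optimizer.zero_grad()
        total_loss.backward()
        optimizer.step()
```

In this formulation, alpha interpolates between uniform weighting (alpha = 0) and fully loss-proportional weighting (alpha = 1), which is the knob that controls how strongly individual tasks' influence is curbed.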
Citation:
Liu, S., Liang, Y., & Gitter, A. (2019). Loss-balanced task weighting to reduce negative transfer in multi-task learning. In Proceedings of the 33rd AAAI Conference on Artificial Intelligence (AAAI 2019) (pp. 9977–9978). AAAI Press. https://doi.org/10.1609/aaai.v33i01.33019977