Learning multi-task communication with message passing for sequence learning

Abstract

We present two architectures for multi-task learning with neural sequence models. Our approach allows the relationships between different tasks to be learned dynamically, rather than using an ad hoc, pre-defined structure as in previous work. We adopt ideas from message-passing graph neural networks and propose a general graph multi-task learning framework in which different tasks can communicate with each other in an effective and interpretable way. We conduct extensive experiments in text classification and sequence labelling to evaluate our approach on multi-task learning and transfer learning. The empirical results show that our models not only outperform competitive baselines, but also learn interpretable and transferable patterns across tasks.
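To make the core idea concrete, the sketch below shows one generic message-passing round between task nodes, in the spirit the abstract describes: each task holds a hidden state, and states are updated by aggregating weighted messages from the other tasks. This is a minimal illustration, not the authors' architecture; the function name, the softmax-normalized edge weights, and the residual `tanh` update are all illustrative assumptions.

```python
import numpy as np

def message_passing_step(task_states, adjacency):
    """One round of communication between task nodes (illustrative sketch).

    task_states: (num_tasks, dim) array of per-task hidden states.
    adjacency:   (num_tasks, num_tasks) edge weights; in a trained model
                 these would be learned, here they are fixed inputs.
    """
    # Normalize edge weights per receiving task (softmax over senders),
    # so each task forms a convex combination of incoming states.
    weights = np.exp(adjacency)
    weights /= weights.sum(axis=1, keepdims=True)
    messages = weights @ task_states           # aggregate neighbour states
    return np.tanh(task_states + messages)     # simple residual update

# Toy example: three tasks with 4-dimensional states.
rng = np.random.default_rng(0)
states = rng.standard_normal((3, 4))
adj = rng.standard_normal((3, 3))
updated = message_passing_step(states, adj)
```

Stacking several such rounds lets information propagate between all tasks, and inspecting the learned edge weights is what makes the communication pattern interpretable.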

Citation (APA)

Liu, P., Fu, J., Dong, Y., Qiu, X., & Cheung, J. C. K. (2019). Learning multi-task communication with message passing for sequence learning. In 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019 (pp. 4360–4367). AAAI Press. https://doi.org/10.1609/aaai.v33i01.33014360
