Abstract
Multi-task learning methods have achieved significant progress in text classification. However, existing methods assume that multi-task text classification problems are convex multi-objective optimization problems, an assumption that rarely holds in real-world applications. To address this issue, this paper presents a novel Tchebycheff procedure that optimizes multi-task classification problems without any convexity assumption. Extensive experiments back up our theoretical analysis and validate the superiority of our proposal.
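The key idea behind the Tchebycheff procedure is Tchebycheff scalarization: instead of a weighted sum of per-task losses (which can only reach convex parts of the Pareto front), the tasks are combined through a weighted max against a reference point. The sketch below is a minimal illustration of that scalarization, not the paper's implementation; the function name, weights, and toy loss values are our own assumptions:

```python
import numpy as np

def tchebycheff_scalarize(losses, weights, ideal_point):
    """Tchebycheff scalarization of a vector of task losses.

    Reduces the losses to a single objective  max_i w_i * (f_i - z_i*),
    where z* is a reference (ideal) point. Unlike a weighted sum, minimizing
    this objective requires no convexity assumption on the Pareto front.
    """
    losses = np.asarray(losses, dtype=float)
    weights = np.asarray(weights, dtype=float)
    ideal_point = np.asarray(ideal_point, dtype=float)
    return float(np.max(weights * (losses - ideal_point)))

# Toy usage: two task losses, equal weights, zero ideal point.
value = tchebycheff_scalarize([0.8, 0.3], [0.5, 0.5], [0.0, 0.0])
# The scalarized objective is max(0.5 * 0.8, 0.5 * 0.3) = 0.4.
```

In a multi-task training loop, this scalar value would replace the usual summed loss before backpropagation, so each step minimizes the worst weighted task deviation.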
Citation
Mao, Y., Yun, S., Liu, W., & Du, B. (2020). Tchebycheff procedure for multi-task text classification. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 4217–4226). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.388