Multi-Task Attentive Residual Networks for Argument Mining

Abstract

We explore the use of residual networks and neural attention for multiple argument mining tasks. We propose a residual architecture that exploits attention, multi-task learning, and ensembling, without making any assumption about document or argument structure. We present an extensive experimental evaluation on five different corpora of user-generated comments, scientific publications, and persuasive essays. Our results show that our approach is a strong competitor against state-of-the-art architectures with a higher computational footprint or corpus-specific design, representing an interesting compromise between generality, accuracy, and reduced model size.
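The abstract names the building blocks but not the architecture itself. As a purely illustrative sketch of how those pieces combine (the layer sizes, the `residual_attention_block` helper, and the two task heads below are assumptions for illustration, not the authors' actual model), a residual self-attention block feeding multiple task-specific heads might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # Scaled dot-product self-attention over the token axis.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)          # (tokens, tokens)
    return softmax(scores, axis=-1) @ x    # (tokens, d)

def residual_attention_block(x):
    # Residual connection: the attention output is added back to the input,
    # so the block refines rather than replaces the representation.
    return x + self_attention(x)

rng = np.random.default_rng(0)
tokens = rng.standard_normal((7, 16))      # 7 tokens, 16-dim embeddings

h = residual_attention_block(tokens)       # shared representation

# Multi-task learning: one shared encoder, separate heads per task,
# e.g. component classification and link prediction (label counts hypothetical).
W_comp = rng.standard_normal((16, 4))
W_link = rng.standard_normal((16, 2))
comp_probs = softmax(h @ W_comp)           # (7, 4) per-token component labels
link_probs = softmax(h @ W_link)           # (7, 2) per-token link labels
```

Training such heads jointly on a shared representation is what lets the model transfer signal across argument mining tasks without corpus-specific design.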

Citation (APA)

Galassi, A., Lippi, M., & Torroni, P. (2023). Multi-Task Attentive Residual Networks for Argument Mining. IEEE/ACM Transactions on Audio Speech and Language Processing, 31, 1877–1892. https://doi.org/10.1109/TASLP.2023.3275040
