Abstract
State-of-the-art argument mining studies have advanced the techniques for predicting argument structures. However, the technology for capturing non-tree-structured arguments is still in its infancy. In this paper, we focus on non-tree argument mining with a neural model. We jointly predict proposition types and edges between propositions. Our proposed model incorporates (i) task-specific parameterization (TSP), which effectively encodes a sequence of propositions, and (ii) proposition-level biaffine attention (PLBA), which can predict non-tree arguments consisting of edges between propositions. Experimental results show that both TSP and PLBA boost edge prediction performance compared to baselines.
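For orientation, the sketch below shows one way a proposition-level biaffine edge scorer could look, following the standard biaffine attention formulation used in dependency parsing. Unlike tree parsing, where each dependent picks exactly one head via a softmax, scoring every directed pair independently and thresholding with a sigmoid permits zero or many outgoing edges per proposition, i.e., non-tree structures. The class name, dimensions, and threshold here are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn


class BiaffineEdgeScorer(nn.Module):
    """Illustrative biaffine scorer over proposition encodings (an assumption,
    not the paper's exact PLBA module)."""

    def __init__(self, hidden_dim: int, arc_dim: int = 128):
        super().__init__()
        # Separate projections for a proposition acting as edge source vs. target.
        self.src_mlp = nn.Sequential(nn.Linear(hidden_dim, arc_dim), nn.ReLU())
        self.tgt_mlp = nn.Sequential(nn.Linear(hidden_dim, arc_dim), nn.ReLU())
        # Bilinear term plus linear bias terms for source and target.
        self.U = nn.Parameter(torch.empty(arc_dim, arc_dim))
        self.u_src = nn.Parameter(torch.zeros(arc_dim))
        self.u_tgt = nn.Parameter(torch.zeros(arc_dim))
        self.bias = nn.Parameter(torch.zeros(1))
        nn.init.xavier_uniform_(self.U)

    def forward(self, props: torch.Tensor) -> torch.Tensor:
        # props: (n, hidden_dim) encodings, one vector per proposition.
        s = self.src_mlp(props)  # (n, arc_dim)
        t = self.tgt_mlp(props)  # (n, arc_dim)
        bilinear = s @ self.U @ t.T                                # (n, n)
        linear = s @ self.u_src[:, None] + (t @ self.u_tgt)[None, :]
        return bilinear + linear + self.bias                       # edge logits


# Usage: independent sigmoids per pair allow non-tree argument graphs.
scores = BiaffineEdgeScorer(hidden_dim=256)(torch.randn(5, 256))
edges = torch.sigmoid(scores) > 0.5  # (5, 5) boolean adjacency matrix
```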
Citation
Morio, G., Ozaki, H., Morishita, T., Koreeda, Y., & Yanai, K. (2020). Towards better non-tree argument mining: Proposition-level biaffine parsing with task-specific parameterization. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 3259–3266). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.298