Abstract
In this paper, we describe our approaches and systems for SemEval-2020 Task 11 on propaganda technique detection. We fine-tuned pre-trained BERT and RoBERTa models and then merged them with an average ensemble. We conducted several experiments on input representations for handling long texts and preserving context, as well as on the class-imbalance problem. Our system ranked 20th out of 36 teams with 0.398 F1 in the SI task and 14th out of 31 teams with 0.556 F1 in the TC task. Our code is available at https://github.com/amenra99/SemEval2020_Task11.
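The averaging step can be illustrated with a minimal sketch. The paper does not specify implementation details, so the function and variable names below are hypothetical; the sketch simply averages per-class probabilities from two fine-tuned models and takes the argmax.

```python
import numpy as np

def average_ensemble(prob_sets):
    """Average class-probability matrices from several models.

    prob_sets: list of (n_examples, n_classes) arrays, one per model.
    Returns the element-wise mean; argmax over classes gives the
    ensembled prediction.
    """
    return np.mean(np.stack(prob_sets, axis=0), axis=0)

# Hypothetical softmax outputs from two fine-tuned models
# (e.g., BERT and RoBERTa) on two examples with two classes.
bert_probs = np.array([[0.6, 0.4], [0.2, 0.8]])
roberta_probs = np.array([[0.8, 0.2], [0.4, 0.6]])

avg = average_ensemble([bert_probs, roberta_probs])
labels = avg.argmax(axis=1)
```

Averaging probabilities (rather than hard labels) lets a confident model outvote an uncertain one on borderline examples.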
Citation
Kim, M., & Bethard, S. (2020). TTUI at SemEval-2020 Task 11: Propaganda Detection with Transfer Learning and Ensembles. In Proceedings of the 14th International Workshop on Semantic Evaluation, SemEval 2020, co-located with the 28th International Conference on Computational Linguistics, COLING 2020 (pp. 1829–1834). International Committee for Computational Linguistics. https://doi.org/10.18653/v1/2020.semeval-1.240