Abstract
With offensive content moderation on social media now regulated by European law, it is important not only to detect sexist content automatically but also to determine whether a message containing sexist content is itself sexist or instead reports an act of sexism experienced by a woman. We propose: (1) a new characterization of sexist content inspired by speech act theory and discourse analysis studies, (2) the first French dataset annotated for sexism detection, and (3) a set of deep learning experiments trained on a combination of several vector representations of tweets (word embeddings, linguistic features, and various generalization strategies). Our results are encouraging and constitute a first step towards offensive content moderation.
Citation
Chiril, P., Moriceau, V., Benamara, F., Mari, A., Origgi, G., & Coulomb-Gully, M. (2020). He said “who’s gonna take care of your children when you are at ACL?”: Reported Sexist Acts are Not Sexist. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 4055–4066). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.373