Stubborn Lexical Bias in Data and Models

Abstract

In NLP, recent work has seen increased focus on spurious correlations between various features and labels in training data, and how these influence model behavior. However, the presence and effect of such correlations are typically examined feature by feature. We investigate the cumulative impact on a model of many such intersecting features. Using a new statistical method, we examine whether such spurious patterns in data appear in models trained on the data. We select two tasks, natural language inference and duplicate-question detection, for which any unigram feature on its own should ideally be uninformative, which gives us a large pool of automatically extracted features with which to experiment. The large size of this pool allows us to investigate the intersection of features spuriously associated with (potentially different) labels. We then apply an optimization approach to reweight the training data, reducing thousands of spurious correlations, and examine how doing so affects models trained on the reweighted data. Surprisingly, though this method can successfully reduce lexical biases in the training data, we still find strong evidence of corresponding bias in the trained models, including worsened bias for slightly more complex features (bigrams). We close with discussion about the implications of our results on what it means to “debias” training data, and how issues of data quality can affect model bias.
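
To make the setup concrete, the sketch below shows one simple way to quantify how strongly a single unigram is associated with a label in a classification dataset, using a two-proportion z-statistic over example counts. This only illustrates the kind of feature-label correlation the paper studies; it is not the authors' exact statistical method or reweighting objective, and the toy examples and function names are hypothetical.

    from collections import Counter, defaultdict
    import math

    # Toy, hypothetical examples: token lists paired with labels. In the paper's
    # setting these would be NLI premise-hypothesis pairs or question pairs.
    examples = [
        (["a", "dog", "runs", "outside"], "entailment"),
        (["nobody", "runs", "outside"], "contradiction"),
        (["a", "cat", "sleeps", "outside"], "neutral"),
        (["nobody", "sleeps"], "contradiction"),
    ]

    def unigram_label_zscores(examples):
        """For each (unigram, label) pair, compare the label's rate among examples
        containing the unigram to its overall rate, via a two-proportion z-statistic.
        Large magnitudes flag unigrams that are spuriously predictive of a label."""
        n = len(examples)
        overall = Counter(label for _, label in examples)
        per_word = defaultdict(Counter)           # word -> Counter of labels
        for tokens, label in examples:
            for word in set(tokens):              # presence, not count
                per_word[word][label] += 1
        scores = {}
        for word, counts in per_word.items():
            m = sum(counts.values())              # number of examples containing word
            for label in overall:
                p0 = overall[label] / n           # base rate of the label
                p1 = counts[label] / m            # rate among examples with the word
                se = math.sqrt(p0 * (1 - p0) / m) or 1e-9
                scores[(word, label)] = (p1 - p0) / se
        return scores

    # Print the most strongly associated (word, label) pairs.
    for (word, label), z in sorted(unigram_label_zscores(examples).items(),
                                   key=lambda kv: -abs(kv[1]))[:5]:
        print(f"{word!r} vs. {label}: z = {z:.2f}")

The paper's procedure goes further than this: it solves an optimization over per-example training weights so that thousands of such feature-label correlations are reduced simultaneously, and then asks whether models trained on the reweighted data still exhibit the corresponding biases. The statistic above only shows how an individual lexical bias can be detected in the first place.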

Citation (APA)

Serrano, S., Dodge, J., & Smith, N. A. (2023). Stubborn Lexical Bias in Data and Models. In Findings of the Association for Computational Linguistics: ACL 2023 (pp. 8131–8146). Association for Computational Linguistics. https://doi.org/10.18653/v1/2023.findings-acl.516
