De-biased Attention Supervision for Text Classification with Causality

Abstract

In text classification models, the unsupervised attention mechanism can enhance performance, but it often produces attention distributions that are puzzling to humans, such as assigning high weights to seemingly insignificant conjunctions. Recently, numerous studies have explored Attention Supervision (AS) to guide models toward more interpretable attention distributions. However, AS can degrade classification performance, especially in specialized domains. In this paper, we address this issue from a causal perspective. First, we use a causal graph to reveal two biases in AS: 1) bias caused by the label distribution of the dataset, and 2) bias caused by words' differing occurrence ranges, i.e., some words occur across multiple labels while others occur only under a particular label. We then propose a novel De-biased Attention Supervision (DAS) method that eliminates these biases with causal techniques. Specifically, we apply backdoor adjustment to the label-caused bias and reduce the word-caused bias by subtracting the direct causal effect of the word. Through extensive experiments on two professional text classification datasets (medicine and law), we demonstrate that our method achieves improved classification accuracy along with more coherent attention distributions.
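To make the abstract's two de-biasing steps concrete, below is a minimal illustrative sketch in PyTorch. It assumes the supervision target is built from per-label word-importance scores: backdoor adjustment marginalizes those scores under the empirical label prior P(y), and the word-caused bias is handled by subtracting a label-agnostic direct-effect term. This is a reading of the abstract, not the authors' code; every name here (debiased_attention_target, direct_effect, and so on) is hypothetical, and the exact formulation is given in the paper.

import torch
import torch.nn.functional as F

def debiased_attention_target(importance, label_prior, direct_effect):
    """Build a de-biased attention-supervision target (illustrative only).

    importance:    (num_labels, seq_len) per-label word-importance scores,
                   e.g. attribution scores conditioned on each label.
    label_prior:   (num_labels,) empirical label distribution P(y).
    direct_effect: (seq_len,) label-agnostic importance of each word on its
                   own, standing in for the word's direct causal effect.
    """
    # Backdoor adjustment for the label-caused bias: marginalize the
    # per-label importance under P(y) instead of conditioning on the
    # (possibly skewed) observed label.
    adjusted = (label_prior.unsqueeze(1) * importance).sum(dim=0)  # (seq_len,)

    # Word-caused bias: subtract each word's direct effect so that only the
    # label-relevant contribution supervises the attention.
    debiased = adjusted - direct_effect

    # Normalize into a valid attention distribution.
    return F.softmax(debiased, dim=-1)

def attention_supervision_loss(model_attention, target):
    # KL divergence between the model's attention and the de-biased target;
    # one common choice for AS losses, though the paper may use another.
    return F.kl_div(model_attention.log(), target, reduction="batchmean")

# Toy usage: 3 labels, a 5-token sentence, random scores.
num_labels, seq_len = 3, 5
importance = torch.rand(num_labels, seq_len)
label_prior = torch.tensor([0.5, 0.3, 0.2])
direct_effect = torch.rand(seq_len)
target = debiased_attention_target(importance, label_prior, direct_effect)

Normalizing the de-biased scores with a softmax and aligning the model's attention to them via KL divergence are conventional choices for attention supervision; they are stated here as assumptions rather than as the paper's method.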

Cite

APA: Wu, Y., Liu, Y., Zhao, Z., Lu, W., Zhang, Y., Sun, C., … Kuang, K. (2024). De-biased Attention Supervision for Text Classification with Causality. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 38, pp. 19279–19287). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v38i17.29897
