Class Is Invariant to Context and Vice Versa: On Learning Invariance for Out-Of-Distribution Generalization

Abstract

Out-Of-Distribution (OOD) generalization is all about learning invariance against environmental changes. If the context (in this paper, "context" denotes any class-agnostic attribute such as color, texture, or background; the formal definition is in Appendix A.2) were evenly distributed within every class, OOD would be trivial, because the context could be easily removed thanks to an underlying principle: class is invariant to context. However, collecting such a balanced dataset is impractical, and learning on imbalanced data biases the model toward context, which hurts OOD. Therefore, the key to OOD is context balance. We argue that the assumption widely adopted in prior work—that the context bias can be directly annotated or estimated from biased class predictions—renders the estimated context incomplete or even incorrect. In contrast, we point out the long-overlooked other side of the above principle: context is also invariant to class. This motivates us to treat the classes (which are already labeled) as the varying environments ("environments" [2] denotes subsets of the training data built by some criterion; taking each class as an environment is our key idea) and thereby resolve context bias without context labels. We implement this idea by minimizing a contrastive loss on intra-class sample similarity while enforcing that this similarity is invariant across all classes. On benchmarks with various context biases and domain gaps, we show that a simple re-weighting-based classifier equipped with our context estimation achieves state-of-the-art performance. Theoretical justifications are provided in the Appendix, and code is available on GitHub: https://github.com/simpleshinobu/IRMCon.
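The core objective—an intra-class contrastive loss whose value is kept invariant across classes-as-environments—can be sketched as follows. This is a toy NumPy illustration, not the paper's implementation (see the linked repository for that): the function names are hypothetical, and the invariance constraint is approximated here by a simple variance penalty over per-class losses rather than the IRM penalty used in the paper.

```python
import numpy as np

def intra_class_contrastive_loss(feats):
    """Loss encouraging high pairwise similarity within one class.

    feats: (n, d) array of features for one class (one "environment").
    Returns 1 - mean pairwise cosine similarity, so 0 means perfectly
    similar samples and larger values mean more intra-class variation.
    """
    f = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    sim = f @ f.T                              # pairwise cosine similarities
    off_diag = ~np.eye(len(f), dtype=bool)     # ignore self-similarity
    return 1.0 - sim[off_diag].mean()

def invariance_objective(class_feats, lam=1.0):
    """Mean per-class loss plus a penalty forcing the loss to be
    invariant across classes (variance penalty as a stand-in for IRM).

    class_feats: list of (n_c, d) arrays, one per class.
    """
    losses = np.array([intra_class_contrastive_loss(f) for f in class_feats])
    return losses.mean() + lam * losses.var()
```

Minimizing the variance term pushes the contrastive loss to take the same value in every class, which is the "context is invariant to class" constraint; the mean term drives the features to capture what varies within a class, i.e., the context.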

Citation (APA)
Qi, J., Tang, K., Sun, Q., Hua, X. S., & Zhang, H. (2022). Class Is Invariant to Context and Vice Versa: On Learning Invariance for Out-Of-Distribution Generalization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13685 LNCS, pp. 92–109). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-19806-9_6
