Ordinal Unsupervised Domain Adaptation With Recursively Conditional Gaussian Imposed Variational Disentanglement

Abstract

There has been growing interest in unsupervised domain adaptation (UDA) to alleviate the data scalability issue, but existing works usually focus on classifying independent discrete labels. However, in many tasks (e.g., medical diagnosis), the labels are discrete and follow an intrinsic order. UDA for ordinal classification requires inducing a non-trivial ordinal distribution prior on the latent space. To this end, a partially ordered set (poset) is defined to constrain the latent vector. Instead of the typical i.i.d. Gaussian latent prior, in this work we propose a recursively conditional Gaussian (RCG) set for ordered-constraint modeling, which admits a tractable joint distribution prior. Furthermore, we are able to control the density of content vectors that violate the poset constraint with a simple "three-sigma rule." We explicitly disentangle the cross-domain images into a shared content space induced by the ordinal prior and two separate source/target ordinal-unrelated spaces, and self-training operates exclusively on the shared space for ordinal-aware domain alignment. Extensive experiments on UDA medical diagnosis and facial age estimation demonstrate the effectiveness of the approach.
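The recursively conditional structure and the "three-sigma rule" can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, the shift parameter `delta`, and the choice `delta = 3 * sigma` are illustrative assumptions. The idea is that each ordinal level's latent coordinate is a Gaussian conditioned on the previous one, shifted by `delta`, so the difference between adjacent levels is N(delta, sigma^2) and the probability of an order violation is about Phi(-delta/sigma) (roughly 0.13% when delta equals three sigma).

```python
import numpy as np

def sample_rcg(num_levels, delta=3.0, sigma=1.0, rng=None):
    """Sample one chain from a recursively conditional Gaussian (RCG) prior.

    Each coordinate is drawn conditioned on the previous one:
        z[0] ~ N(0, sigma^2)
        z[k] ~ N(z[k-1] + delta, sigma^2)  for k >= 1
    With delta = 3 * sigma (the "three-sigma rule"), the density of samples
    violating the ordering z[0] < z[1] < ... is kept very small.
    """
    rng = np.random.default_rng(rng)
    z = [rng.normal(0.0, sigma)]
    for _ in range(num_levels - 1):
        # Mean of the next level is conditioned on the previous sample.
        z.append(rng.normal(z[-1] + delta, sigma))
    return np.array(z)

# A sampled chain is monotonically increasing with high probability.
z = sample_rcg(5, delta=3.0, sigma=1.0, rng=0)
```

Because adjacent means sit three standard deviations apart, the joint prior remains a tractable product of conditional Gaussians while still encoding the poset (ordering) constraint on the content vectors.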

Citation (APA)

Liu, X., Li, S., Ge, Y., Ye, P., You, J., & Lu, J. (2025). Ordinal Unsupervised Domain Adaptation With Recursively Conditional Gaussian Imposed Variational Disentanglement. IEEE Transactions on Pattern Analysis and Machine Intelligence, 47(5), 3219–3232. https://doi.org/10.1109/TPAMI.2022.3183115
