Metric learning in optimal transport for domain adaptation

Abstract

Domain Adaptation aims at benefiting from a labeled dataset drawn from a source distribution to learn a model from examples generated according to a different but related target distribution. Creating a domain-invariant representation of the source and target domains is the most widely used technique. A simple and robust way to perform this task consists in (i) representing the two domains by subspaces described by their respective eigenvectors and (ii) seeking a mapping function that aligns them. In this paper, we propose to use Optimal Transport (OT) and its associated Wasserstein distance to perform this alignment. While the idea of using OT in domain adaptation is not new, the original contribution of this paper is two-fold: (i) we derive a generalization bound on the target error involving several Wasserstein distances. This prompts us to optimize the ground metric of OT to reduce the target risk. (ii) From this theoretical analysis, we design an algorithm (MLOT) which optimizes a Mahalanobis distance leading to a transportation plan that adapts better. Experiments demonstrate the effectiveness of this original approach.
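
As a rough illustration of the transport step described in the abstract, the sketch below uses the POT library to compute an entropic OT plan under a Mahalanobis ground metric and to map source samples onto the target domain by barycentric projection. This is not the authors' MLOT implementation: the linear map L defining the metric and the helper name ot_adapt are illustrative assumptions, and the core of MLOT (learning L to tighten the target-error bound) is not shown.

```python
# Minimal sketch of OT-based domain adaptation with a Mahalanobis ground
# metric, using the POT library (https://pythonot.github.io/).
# Assumption: the linear map L is given; MLOT would instead learn it.
import numpy as np
import ot  # POT: Python Optimal Transport


def ot_adapt(Xs, Xt, L=None, reg=1e-1):
    """Transport source samples Xs onto the target samples Xt.

    Xs:  (n_s, d) source features
    Xt:  (n_t, d) target features
    L:   (d, d) linear map defining the Mahalanobis cost ||L(x - y)||^2
         (identity -> plain squared Euclidean cost)
    reg: entropic regularization strength for Sinkhorn
    """
    if L is None:
        L = np.eye(Xs.shape[1])

    # Mahalanobis cost ||L(x_i - y_j)||^2 equals the squared Euclidean
    # cost between the linearly transformed samples.
    M = ot.dist(Xs @ L.T, Xt @ L.T, metric="sqeuclidean")
    M /= M.max()  # rescale the cost matrix for numerical stability

    # Uniform empirical distributions on both samples.
    a = np.full(Xs.shape[0], 1.0 / Xs.shape[0])
    b = np.full(Xt.shape[0], 1.0 / Xt.shape[0])

    # Entropic OT plan (use ot.emd(a, b, M) for the exact, unregularized plan).
    G = ot.sinkhorn(a, b, M, reg)

    # Barycentric mapping: each source point is sent to the weighted mean
    # of the target points it is transported to.
    Xs_adapted = (G / G.sum(axis=1, keepdims=True)) @ Xt
    return Xs_adapted, G
```

In the paper, the metric (i.e., L) would be optimized so that the resulting transportation plan reduces the derived bound on the target risk; a downstream classifier is then trained on the adapted source samples and evaluated on the target domain.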

Citation (APA)

Kerdoncuff, T., Emonet, R., & Sebban, M. (2020). Metric learning in optimal transport for domain adaptation. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2021-January, pp. 2162–2168). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2020/299
