Transfer Learning under Conditional Shift Based on Fuzzy Residual

Citations: 10 · Mendeley readers: 13

Abstract

Transfer learning has received much attention recently and has proven effective in a wide range of applications, yet studies on regression problems remain scarce. In this article, we focus on transfer learning for regression under conditional shift, where the source and target domains share the same marginal distribution but have different conditional probability distributions. We propose a new framework, transfer learning based on fuzzy residual (ResTL), which learns the target model while preserving the distribution properties of the source data in a model-agnostic way. First, we formulate the target model by adding a fuzzy residual to a model-agnostic source model and reuse the antecedent parameters of the source fuzzy system. Then, two bias-computation methods are provided for different considerations, yielding two ResTL variants, ResTL-LS and ResTL-RD. Finally, we conduct a series of experiments on both a toy example and several real-world datasets to verify the effectiveness of the proposed method.
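To make the general idea concrete, the sketch below shows one way such a residual correction could look: a frozen, model-agnostic source model is combined with a zero-order TSK fuzzy residual whose Gaussian antecedent parameters are reused from the source fuzzy system, while the residual's consequent offsets are fitted by regularized least squares on target data (roughly in the spirit of the least-squares variant). This is a minimal sketch under those assumptions, not the authors' implementation; the function names, the Gaussian membership form, and the ridge regularizer are illustrative choices.

```python
import numpy as np

def gaussian_memberships(X, centers, sigmas):
    """Normalized firing strengths of each fuzzy rule for inputs X (n_samples, n_features).

    centers, sigmas: (n_rules, n_features) antecedent parameters, assumed to be
    taken unchanged from the source-domain fuzzy system.
    """
    diff = X[:, None, :] - centers[None, :, :]             # (n, r, d)
    mu = np.exp(-0.5 * (diff / sigmas[None, :, :]) ** 2)   # per-feature memberships
    firing = mu.prod(axis=2)                               # product t-norm -> (n, r)
    return firing / (firing.sum(axis=1, keepdims=True) + 1e-12)

def fit_fuzzy_residual(source_model, X_tgt, y_tgt, centers, sigmas, reg=1e-3):
    """Fit consequent offsets of a zero-order TSK residual by ridge-regularized
    least squares on target data; the source model itself is left untouched."""
    residual = y_tgt - source_model(X_tgt)                  # what the source model misses
    W = gaussian_memberships(X_tgt, centers, sigmas)
    # Solve (W^T W + reg * I) b = W^T residual for the per-rule offsets b.
    b = np.linalg.solve(W.T @ W + reg * np.eye(W.shape[1]), W.T @ residual)

    def target_model(X_new):
        # Target prediction = source prediction + fuzzy-weighted residual correction.
        return source_model(X_new) + gaussian_memberships(X_new, centers, sigmas) @ b

    return target_model
```

Keeping the source model fixed and only learning the additive fuzzy residual is what makes the construction model-agnostic: the source predictor can be any black-box regressor, and only the residual's consequent parameters are estimated from the target domain.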

Cite (APA)

Chen, G., Li, Y., & Liu, X. (2022). Transfer Learning under Conditional Shift Based on Fuzzy Residual. IEEE Transactions on Cybernetics, 52(2), 960–970. https://doi.org/10.1109/TCYB.2020.2988277
