Abstract

Transfer learning is an established and effective technique in computer vision for leveraging rich labeled data in a source domain to build an accurate classifier for a target domain. However, most prior methods do not simultaneously reduce the difference in both the marginal distribution and the conditional distribution between domains. In this paper, we put forward a novel transfer learning approach, referred to as Joint Distribution Adaptation (JDA). Specifically, JDA jointly adapts both the marginal distribution and the conditional distribution in a principled dimensionality reduction procedure, constructing a new feature representation that is effective and robust to substantial distribution differences. Extensive experiments verify that JDA significantly outperforms several state-of-the-art methods on four types of cross-domain image classification problems.
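The core idea described above — minimizing both the marginal and class-conditional Maximum Mean Discrepancy (MMD) between domains inside a dimensionality reduction step — can be sketched as follows. This is an illustrative reconstruction from the abstract, not the authors' reference implementation; all function and variable names (`jda_transform`, `lam`, `yt_pseudo`, etc.) are our own, and target pseudo-labels stand in for the unknown target labels as the paper's iterative refinement suggests.

```python
import numpy as np

def jda_transform(Xs, ys, Xt, yt_pseudo, k=2, lam=1.0):
    """One JDA-style iteration (illustrative sketch).

    Learns a projection W that minimizes the marginal plus per-class
    conditional MMD between source (Xs, ys) and target (Xt, yt_pseudo)
    while preserving data variance, via a generalized eigenproblem.
    Rows of Xs/Xt are samples; returns the k-dimensional embeddings.
    """
    X = np.hstack([Xs.T, Xt.T])               # features as columns, d x n
    ns, nt = Xs.shape[0], Xt.shape[0]
    n = ns + nt
    X = X / np.linalg.norm(X, axis=0)         # unit-normalize each sample

    # Marginal-distribution MMD matrix M0
    e = np.vstack([np.ones((ns, 1)) / ns, -np.ones((nt, 1)) / nt])
    M = e @ e.T
    # Conditional-distribution MMD matrices, one per shared class,
    # built with target pseudo-labels in place of true target labels
    for c in np.unique(ys):
        src_idx = np.where(ys == c)[0]
        tgt_idx = ns + np.where(yt_pseudo == c)[0]
        if len(src_idx) and len(tgt_idx):
            ec = np.zeros((n, 1))
            ec[src_idx] = 1.0 / len(src_idx)
            ec[tgt_idx] = -1.0 / len(tgt_idx)
            M += ec @ ec.T
    M = M / np.linalg.norm(M, 'fro')

    # Minimize tr(W^T X M X^T W) subject to W^T X H X^T W = I,
    # where H is the centering matrix (variance preservation)
    H = np.eye(n) - np.ones((n, n)) / n
    d = X.shape[0]
    A = X @ M @ X.T + lam * np.eye(d)         # regularized MMD scatter
    B = X @ H @ X.T                           # total scatter
    w, V = np.linalg.eig(np.linalg.solve(A, B))
    idx = np.argsort(-w.real)                 # top eigenvectors of A^{-1} B
    W = V[:, idx[:k]].real                    # d x k projection matrix
    Z = W.T @ X                               # k x n embedded features
    return Z[:, :ns].T, Z[:, ns:].T
```

In the full method the target pseudo-labels would be re-estimated with a classifier trained on the projected source data, and this step repeated until the labels stabilize.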

Keywords

Transfer learning, Domain adaptation, Joint probability distribution, Marginal distribution, Conditional probability distribution, Dimensionality reduction, Feature learning, Feature vector, Curse of dimensionality, Pattern recognition, Classifier, Computer science, Machine learning, Artificial intelligence, Mathematics, Statistics

Publication Info

Year: 2013
Type: article
Pages: 2200-2207
Citations: 1984
Access: Closed

Citation Metrics

Citations (OpenAlex): 1984

Cite This

Mingsheng Long, Jianmin Wang, Guiguang Ding et al. (2013). Transfer Feature Learning with Joint Distribution Adaptation. Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2200-2207. https://doi.org/10.1109/iccv.2013.274

Identifiers

DOI: 10.1109/iccv.2013.274