Abstract

Cross-domain learning methods have shown promising results by leveraging labeled patterns from an auxiliary domain to learn a robust classifier for the target domain, which has only a limited number of labeled samples. To cope with the considerable change between the feature distributions of different domains, we propose a new cross-domain kernel learning framework into which many existing kernel methods can be readily incorporated. Our framework, referred to as Domain Transfer Multiple Kernel Learning (DTMKL), simultaneously learns a kernel function and a robust classifier by minimizing both the structural risk functional and the distribution mismatch between the labeled and unlabeled samples from the auxiliary and target domains. Under the DTMKL framework, we also propose two novel methods by using SVM and prelearned classifiers, respectively. Comprehensive experiments on three domain adaptation data sets (i.e., TRECVID, 20 Newsgroups, and email spam) demonstrate that the DTMKL-based methods outperform existing cross-domain learning and multiple kernel learning methods.
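
The abstract compresses the optimization problem: DTMKL learns the coefficients of a convex combination of base kernels, K = sum_m d_m K_m, by jointly minimizing a structural risk functional (e.g., the SVM objective) and the distribution mismatch between the auxiliary and target domains, where the mismatch criterion is the Maximum Mean Discrepancy (MMD), which is conveniently linear in the kernel weights d. The following Python sketch illustrates that idea only; it is not the authors' solver. The toy data, RBF bandwidths, trade-off theta, step size, and the crude projected-gradient alternation with scikit-learn's SVC are all illustrative assumptions.

    import numpy as np
    from sklearn.svm import SVC

    def rbf_kernel(X, Y, gamma):
        """RBF base kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
        d2 = (X ** 2).sum(1)[:, None] + (Y ** 2).sum(1)[None, :] - 2.0 * X @ Y.T
        return np.exp(-gamma * np.maximum(d2, 0.0))

    def mmd2(K, n_src, n_tgt):
        """Empirical squared MMD written as s^T K s, with s_i = 1/n_src on
        source (auxiliary) rows and -1/n_tgt on target rows."""
        s = np.concatenate([np.full(n_src, 1.0 / n_src),
                            np.full(n_tgt, -1.0 / n_tgt)])
        return float(s @ K @ s)

    # Toy data (assumption): the auxiliary domain is labeled, the target is shifted.
    rng = np.random.default_rng(0)
    Xs = rng.normal(0.0, 1.0, (60, 5))          # labeled auxiliary samples
    Xt = rng.normal(0.8, 1.0, (40, 5))          # unlabeled target samples
    X = np.vstack([Xs, Xt])
    ys = (Xs[:, 0] > 0).astype(int)             # labels on the auxiliary part only

    gammas = [0.01, 0.1, 1.0]                   # base-kernel bandwidths (assumption)
    Ks = np.stack([rbf_kernel(X, X, g) for g in gammas])
    ns, nt = len(Xs), len(Xt)

    # MMD^2 of the combined kernel K(d) = sum_m d_m K_m is linear in d:
    h = np.array([mmd2(K, ns, nt) for K in Ks])

    d = np.full(len(gammas), 1.0 / len(gammas))  # kernel weights on the simplex
    theta, step = 1.0, 0.1                       # trade-off and step size (assumptions)
    for _ in range(20):
        K = np.tensordot(d, Ks, axes=1)
        clf = SVC(C=1.0, kernel="precomputed").fit(K[:ns, :ns], ys)
        # SVM dual objective J(d); by Danskin, dJ/dd_m = -0.5 * beta^T K_m beta
        # with beta_i = alpha_i * y_i read off the trained SVM.
        beta = np.zeros(ns)
        beta[clf.support_] = clf.dual_coef_.ravel()
        grad_J = np.array([-0.5 * beta @ Km[:ns, :ns] @ beta for Km in Ks])
        # Simplified joint objective: 0.5 * (d . h)^2 + theta * J(d)
        grad = (d @ h) * h + theta * grad_J
        d = np.clip(d - step * grad, 1e-8, None)
        d /= d.sum()                             # project back onto the simplex

    print("kernel weights:", np.round(d, 3))
    print("combined MMD^2:", round(float(d @ h), 5))

Because MMD^2 of the combined kernel decomposes as sum_m d_m * MMD^2(K_m), the mismatch term contributes the constant gradient h, while the SVM term contributes -0.5 * beta^T K_m beta per base kernel; the paper solves the joint problem with a principled reduced-gradient method rather than this fixed-step loop.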

Keywords

Artificial intelligence, Computer science, Multiple kernel learning, Domain adaptation, Kernel methods, Classifiers, Tree kernel, Transfer learning, Pattern recognition, Machine learning, Radial basis function kernel, Support vector machine, Graph kernel, Kernel embedding of distributions, Labeled data, Mathematics

Publication Info

Year: 2011
Type: Article
Volume: 34
Issue: 3
Pages: 465-479
Citations: 568
Access: Closed

Citation Metrics

OpenAlex: 568
Influential: 43
CrossRef: 420

Cite This

Lixin Duan, Ivor W. Tsang, Dong Xu (2011). Domain Transfer Multiple Kernel Learning. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(3), 465-479. https://doi.org/10.1109/tpami.2011.114

Identifiers

DOI: 10.1109/tpami.2011.114
PMID: 21646679
