Deep into The Domain Shift: Transfer Learning through Dependence Regularization

IEEE Transactions on Neural Networks and Learning Systems

15 Pages Posted: 14 Jun 2023

Shumin Ma

BNU-HKBU United International College

Zhiri Yuan

City University of Hong Kong (CityU)

Qi Wu

City University of Hong Kong, School of Data Science

Yiyan Huang

City University of Hong Kong (CityU) - School of Data Science

Xixu Hu

City University of Hong Kong (CityU)

Cheuk Hang Leung

City University of Hong Kong (CityU) - School of Data Science

Dongdong Wang

JD Digits

Zhixiang Huang

JD Digits

Date Written: June 3, 2023

Abstract

Classical domain adaptation methods acquire transferability by regularizing the overall distributional discrepancy between features in the source domain (labeled) and features in the target domain (unlabeled). They often do not differentiate whether the domain differences come from the marginals or from the dependence structures. In many business and financial applications, however, the labeling function has different sensitivities to changes in the marginals versus changes in the dependence structures, so measuring the overall distributional difference is not discriminative enough for acquiring transferability; without the needed structural resolution, the learned transfer is suboptimal. This paper proposes a new domain adaptation approach in which the differences in the internal dependence structure are measured separately from those in the marginals. By optimizing the relative weights between the two, the new regularization strategy greatly relaxes the rigidity of existing approaches and allows a learning machine to pay special attention to the places where the differences matter most. Experiments on three real-world datasets show that the improvements are notable and robust compared with various benchmark domain adaptation models.
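The paper's exact regularizer is not reproduced here, but the core idea of the abstract — penalizing the discrepancy in the marginals and the discrepancy in the dependence structure (copula) with separately tunable weights — can be sketched in NumPy. The rank-based `pseudo_obs` transform, the RBF-kernel MMD discrepancy, and the weighting parameter `alpha` are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def rbf_mmd(x, y, sigma=1.0):
    """Squared MMD between samples x and y with an RBF kernel (biased estimator)."""
    def k(a, b):
        d = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
        return np.exp(-d / (2 * sigma**2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

def pseudo_obs(x):
    """Empirical-copula (rank) transform: maps each marginal into (0, 1)."""
    n = x.shape[0]
    return (np.argsort(np.argsort(x, axis=0), axis=0) + 1) / (n + 1)

def dependence_regularizer(xs, xt, alpha=0.5, sigma=1.0):
    """Weighted sum of a marginal discrepancy and a copula (dependence) discrepancy.

    alpha close to 1 emphasizes differences in the marginals; alpha close to 0
    emphasizes differences in the dependence structure.
    """
    # Marginal term: per-feature MMD, averaged across dimensions.
    marg = np.mean([rbf_mmd(xs[:, [j]], xt[:, [j]], sigma)
                    for j in range(xs.shape[1])])
    # Dependence term: MMD between rank-transformed (empirical copula) samples,
    # which is invariant to monotone changes of the marginals.
    dep = rbf_mmd(pseudo_obs(xs), pseudo_obs(xt), sigma)
    return alpha * marg + (1 - alpha) * dep
```

In a deep domain adaptation pipeline, a term like `dependence_regularizer(features_source, features_target)` would be added to the task loss, with `alpha` tuned (or learned) to reflect how sensitive the labeling function is to marginal versus dependence shifts.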

Keywords: domain adaptation, regularization, domain divergence, copula

JEL Classification: C45, C55, G1

Suggested Citation

Ma, Shumin and Yuan, Zhiri and Wu, Qi and Huang, Yiyan and Hu, Xixu and Leung, Cheuk Hang and Wang, Dongdong and Huang, Zhixiang, Deep into The Domain Shift: Transfer Learning through Dependence Regularization (June 3, 2023). IEEE Transactions on Neural Networks and Learning Systems, Available at SSRN: https://ssrn.com/abstract=4468110

Shumin Ma (Contact Author)

BNU-HKBU United International College, China
