Contrastive Adaptation Network for Unsupervised Domain Adaptation

Guoliang Kang
Lu Jiang
Yi Yang
Alexander Hauptmann
CVPR (2019)

Abstract

Unsupervised Domain Adaptation (UDA) makes predictions for the target domain data while manual annotations are only available in the source domain. Previous methods minimize the domain discrepancy while neglecting the class information, and such misalignment may lead to poor generalization performance. To address this issue, this paper proposes the Contrastive Adaptation Network (CAN), which optimizes a new metric that explicitly models the intra-class domain discrepancy and the inter-class domain discrepancy. We design an alternating update strategy for training CAN in an end-to-end manner. Experiments on two real-world benchmarks, Office-31 and VisDA-2017, demonstrate that CAN performs favorably against the state-of-the-art methods and produces more discriminative features.
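To make the idea concrete, the following is a minimal PyTorch sketch of a class-aware discrepancy in the spirit described above: a kernel MMD is computed between source and target features of the same class (intra-class term, to be minimized) and of different classes (inter-class term, to be maximized). It is not the paper's exact estimator; the function names, the Gaussian-kernel choice, and the assumption that target pseudo-labels `y_t_hat` are available (e.g., from clustering) are all illustrative assumptions.

```python
import torch

def gaussian_mmd(x, y, sigma=1.0):
    """Squared MMD between two feature sets under a Gaussian kernel (illustrative choice)."""
    def kernel(a, b):
        d2 = torch.cdist(a, b) ** 2
        return torch.exp(-d2 / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()

def contrastive_domain_discrepancy(feat_s, y_s, feat_t, y_t_hat, num_classes):
    """Sketch of a contrastive-style discrepancy: intra-class MMD minus inter-class MMD.

    feat_s, feat_t : source / target features, shape (N, D)
    y_s            : ground-truth source labels
    y_t_hat        : target pseudo-labels (assumed given, e.g. from clustering)
    """
    intra, inter, n_intra, n_inter = 0.0, 0.0, 0, 0
    for c_s in range(num_classes):
        s_c = feat_s[y_s == c_s]
        if len(s_c) == 0:
            continue
        for c_t in range(num_classes):
            t_c = feat_t[y_t_hat == c_t]
            if len(t_c) == 0:
                continue
            d = gaussian_mmd(s_c, t_c)
            if c_s == c_t:
                intra, n_intra = intra + d, n_intra + 1
            else:
                inter, n_inter = inter + d, n_inter + 1
    intra = intra / max(n_intra, 1)
    inter = inter / max(n_inter, 1)
    # Minimizing this value pulls same-class source/target features together
    # while pushing different-class features apart.
    return intra - inter
```

In a training loop, such a term would be added to the standard cross-entropy loss on labeled source data, with target pseudo-labels refreshed periodically, which is one way to realize the alternating update mentioned in the abstract.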