Category Prior-Guided Unsupervised Domain Adaptation
Access status: USyd Access
Type: Thesis
Thesis type: Masters by Research
Author/s: Zhang, Qiming
Abstract:
Unsupervised domain adaptation (UDA) aims to transfer a model trained on a fully annotated source domain to a target domain without any labels. The importance of UDA lies in the difficulty of obtaining enough fully annotated labels for a specific scenario within a short time, especially in deep learning, where large amounts of data are required to train neural networks. Although considerable progress has been made in matching data distributions and reducing domain gaps, the UDA problem remains challenging because of the large difference in data distributions between the two domains. Specifically, after the category-agnostic alignment used in previous works, the encoded feature distributions of each category from the target domain cannot be entirely overlapped by those from the source domain, and the classifier layers in the network still favor source-domain features. The classifier is thus unable to distinguish target-domain features well and makes incorrect predictions for each category on target-domain data. To avoid the adverse effect of biased classifiers, we propose a novel category prior-guided (CPG) unsupervised domain adaptation method, which explicitly extracts category-prior knowledge from source-domain features and enforces category-aware feature alignment. The category-discriminative features are then captured by our model, and the classifier layers are corrected under the guidance of the category-prior information. We also conducted comprehensive experiments, and the results demonstrate the superiority of the proposed CPG model over state-of-the-art methods.
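The abstract does not detail how category-prior knowledge is extracted or how category-aware alignment is enforced; one common realization of this idea, sketched below purely as an illustrative assumption (not the thesis's actual CPG formulation), is to compute per-category mean features (prototypes) from the labeled source domain and pull each unlabeled target feature toward its nearest prototype:

```python
import numpy as np

def class_prototypes(feats, labels, num_classes):
    """Per-category mean feature vectors (prototypes) computed from
    labeled source-domain features. Shape: (num_classes, feat_dim)."""
    return np.stack([feats[labels == c].mean(axis=0)
                     for c in range(num_classes)])

def category_align_loss(target_feats, prototypes):
    """A simple category-aware alignment term: assign each target
    feature a pseudo-label via its nearest source prototype, then
    penalize the squared distance to that prototype."""
    # Pairwise squared distances, shape (n_target, num_classes).
    d = ((target_feats[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    nearest = d.argmin(axis=1)  # hypothetical pseudo-labels
    return d[np.arange(len(d)), nearest].mean()
```

Minimizing such a loss (alongside the usual source-domain classification loss) encourages target features of each category to cluster around the corresponding source prototype, which is one way a classifier biased toward source features could be corrected.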
Date: 2020-01-01
Licence: The author retains copyright of this thesis. It may only be used for the purposes of research and study. It must not be used for any other purposes and may not be transmitted or shared with others without prior permission.
Faculty/School: Faculty of Engineering, School of Computer Science
Awarding institution: The University of Sydney