Multi-source Attention for Unsupervised Domain Adaptation

04/14/2020
by Xia Cui, et al.

Domain adaptation considers the problem of generalising a model learnt using data from a particular source domain to a different target domain. Often it is difficult to find a suitable single source to adapt from, and one must consider multiple sources. Using an unrelated source can result in sub-optimal performance, known as negative transfer. However, it is challenging to select the appropriate source(s) for classifying a given target instance in multi-source unsupervised domain adaptation (UDA). We model source selection as an attention-learning problem, where we learn attention over the sources for a given target instance. For this purpose, we first independently learn source-specific classification models, and a relatedness map between the source and target domains using pseudo-labelled target domain instances. Next, we learn attention weights over the sources for aggregating the predictions of the source-specific models. Experimental results on cross-domain sentiment classification benchmarks show that the proposed method outperforms prior proposals in multi-source UDA.
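
The following is a minimal sketch of the overall idea described in the abstract, not the authors' implementation: train one classifier per source domain, score how related each source is to a given target instance, and aggregate the per-source predictions with attention weights. The toy data, the distance-based relatedness proxy (the paper instead learns a relatedness map from pseudo-labelled target instances), and the softmax temperature are all illustrative assumptions.

```python
# Hypothetical sketch of attention-weighted aggregation over source-specific
# classifiers for multi-source UDA. Data and relatedness scoring are assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy labelled data for three source domains (illustrative only).
sources = []
for shift in (0.0, 1.0, 3.0):
    X = rng.normal(loc=shift, scale=1.0, size=(200, 5))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    sources.append((X, y))

# Unlabelled target-domain instances (illustrative only).
X_target = rng.normal(loc=0.5, scale=1.0, size=(50, 5))

# Step 1: independently learn a source-specific classifier per source domain.
classifiers = [LogisticRegression(max_iter=1000).fit(X, y) for X, y in sources]

# Step 2: score source-target relatedness per target instance. Here a simple
# proxy (negative distance to each source mean) stands in for the learned
# relatedness map built from pseudo-labelled target instances.
source_means = np.stack([X.mean(axis=0) for X, _ in sources])

def attention_weights(x, temperature=1.0):
    """Softmax over per-source relatedness scores for one target instance."""
    scores = -np.linalg.norm(source_means - x, axis=1) / temperature
    e = np.exp(scores - scores.max())
    return e / e.sum()

# Step 3: aggregate the source-specific predictions with the attention weights.
def predict(x):
    w = attention_weights(x)
    probs = np.stack([clf.predict_proba(x[None, :])[0] for clf in classifiers])
    return (w[:, None] * probs).sum(axis=0)  # attention-weighted class probabilities

print(predict(X_target[0]))
```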
