Domain Divergences: a Survey and Empirical Analysis

10/23/2020
by Abhinav Ramesh Kashyap, et al.

Domain divergence plays a significant role in estimating the performance of a model when applied to new domains. While there is significant literature on divergence measures, choosing an appropriate divergence measure remains difficult for researchers. We address this shortcoming through both a survey of the literature and an empirical study. We contribute a taxonomy of divergence measures consisting of three groups (Information-theoretic, Geometric, and Higher-order measures) and identify the relationships between them. We then ground the use of divergence measures in three different application groups: 1) Data Selection, 2) Learning Representations, and 3) Decisions in the Wild. From this, we identify that Information-theoretic measures are prevalent for 1) and 3), while Higher-order measures are common for 2). To further help researchers, we validate these uses empirically through a correlation analysis of performance drops. For this analysis, we contrast current contextual word representations (CWR) with older word-distribution-based representations. We find that traditional measures over word distributions still serve as strong baselines, while Higher-order measures with CWR are effective.
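As an illustrative sketch (not taken from the paper), one of the traditional information-theoretic measures over word distributions mentioned above, the Jensen-Shannon divergence, can be computed between two corpora as follows. The toy corpora and the helper names here are hypothetical, chosen only to show the shape of the computation.

```python
import math
from collections import Counter

def word_distribution(tokens, vocab):
    """Relative frequency of each vocabulary word in a token list."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return [counts[w] / total for w in vocab]

def kl(p, q):
    """Kullback-Leibler divergence in bits; terms with p_i = 0 contribute 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: symmetric, bounded in [0, 1] (base-2 logs)."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Two toy "domains": movie reviews vs. clinical notes (hypothetical examples).
source = "the movie was great the plot was great".split()
target = "the patient was stable the dosage was low".split()
vocab = sorted(set(source) | set(target))

p = word_distribution(source, vocab)
q = word_distribution(target, vocab)
print(f"JS divergence between domains: {js_divergence(p, q):.3f}")
```

A larger value indicates more divergent word distributions; in the data-selection setting surveyed by the paper, such scores are typically used to rank candidate source domains by their similarity to a target domain.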
