From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications
Statistical distances, i.e., discrepancy measures between probability distributions, are ubiquitous in probability theory, statistics, and machine learning. To combat the curse of dimensionality when estimating these distances from data, recent work has proposed smoothing out local irregularities in the measured distributions via convolution with a Gaussian kernel. Motivated by the scalability of the smooth framework to high dimensions, we conduct an in-depth study of the structural and statistical behavior of the Gaussian-smoothed p-Wasserstein distance W_p^{(σ)}, for arbitrary p ≥ 1. We start by showing that W_p^{(σ)} admits a metric structure that is topologically equivalent to classic W_p and is stable with respect to perturbations in σ. Moving to statistical questions, we explore the asymptotic properties of W_p^{(σ)}(μ̂_n, μ), where μ̂_n is the empirical distribution of n i.i.d. samples from μ. To that end, we prove that W_p^{(σ)} is controlled by a pth order smooth dual Sobolev norm d_p^{(σ)}. Since d_p^{(σ)}(μ̂_n, μ) coincides with the supremum of an empirical process indexed by Gaussian-smoothed Sobolev functions, it lends itself well to analysis via empirical process theory. We derive the limit distribution of √n d_p^{(σ)}(μ̂_n, μ) in all dimensions d, when μ is sub-Gaussian. Through the aforementioned bound, this implies a parametric empirical convergence rate of n^{-1/2} for W_p^{(σ)}, contrasting the n^{-1/d} rate for unsmoothed W_p when d ≥ 3. As applications, we provide asymptotic guarantees for two-sample testing and minimum distance estimation. When p = 2, we further show that d_2^{(σ)} can be expressed as a maximum mean discrepancy.
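For concreteness, the smoothed distance compares the two measures after convolving each with an isotropic Gaussian; this first display is the standard definition in the Gaussian-smoothed optimal transport literature. The second display is only a schematic rendering of the abstract's description of d_p^{(σ)} as an empirical process indexed by Gaussian-smoothed Sobolev functions; the precise function class F is specified in the paper itself.

```latex
% Smooth p-Wasserstein distance: compare the measures after Gaussian smoothing.
\[
  \mathsf{W}_p^{(\sigma)}(\mu,\nu)
    := \mathsf{W}_p\!\bigl(\mu * \mathcal{N}_\sigma,\; \nu * \mathcal{N}_\sigma\bigr),
  \qquad
  \mathcal{N}_\sigma := \mathcal{N}(0, \sigma^2 \mathrm{I}_d).
\]
% Schematic form of the smooth dual Sobolev norm: the supremum of an empirical
% process over Gaussian-smoothed Sobolev functions (phi_sigma denotes the
% N(0, sigma^2 I_d) density; the exact unit ball F is defined in the paper).
\[
  \mathsf{d}_p^{(\sigma)}(\hat{\mu}_n, \mu)
    = \sup_{f \in \mathcal{F}} \int_{\mathbb{R}^d} \varphi_\sigma * f \,
      \mathrm{d}(\hat{\mu}_n - \mu).
\]
```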
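The parametric rate can also be eyeballed numerically. Below is a minimal sketch, assuming the POT library (`pip install pot`): it approximates each Gaussian-smoothed empirical measure by jittering the sample points with N(0, σ²I_d) noise and then solves the resulting discrete optimal transport problem exactly. The helper name `smooth_w1` and the choice μ = N(0, I_d) are illustrative assumptions, not the paper's estimator or analysis.

```python
import numpy as np
import ot  # POT: Python Optimal Transport


def smooth_w1(x, y, sigma, rng):
    """Monte Carlo plug-in for W_1^{(sigma)} between two empirical measures:
    adding N(0, sigma^2 I) noise to the sample points draws from the
    Gaussian-smoothed measures, then exact discrete OT gives W_1."""
    xs = x + sigma * rng.standard_normal(x.shape)
    ys = y + sigma * rng.standard_normal(y.shape)
    M = ot.dist(xs, ys, metric="euclidean")    # pairwise Euclidean costs
    a, b = ot.unif(len(xs)), ot.unif(len(ys))  # uniform weights
    return ot.emd2(a, b, M)                    # optimal transport cost


rng = np.random.default_rng(0)
d, sigma = 5, 1.0
for n in [100, 400, 1600]:
    x = rng.standard_normal((n, d))  # n i.i.d. samples from mu = N(0, I_d)
    y = rng.standard_normal((n, d))  # an independent sample from the same mu
    # With smoothing, the estimate should decay roughly like n^{-1/2},
    # in contrast to the unsmoothed n^{-1/d} rate in dimension d = 5.
    print(n, smooth_w1(x, y, sigma, rng))
```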