From Soft-Minoration to Information-Constrained Optimal Transport and Spiked Tensor Models
Let P_Z be a given distribution on ℝ^n. For any y ∈ ℝ^n, we may interpret ρ(y) := ln 𝔼[e^⟨y,Z⟩] as a soft-max of ⟨y,Z⟩. We explore lower bounds on 𝔼[ρ(Y)] in terms of the minimum mutual information I(Z;Z̅) over couplings P_{ZZ̅} of P_Z with itself under which Z − Z̅ is bounded in a certain sense. This may be viewed as a soft version of Sudakov's minoration, which lower bounds the expected supremum of a stochastic process in terms of the packing number. Our method is based on convex geometry (thrifty approximation of convex bodies) and works for general non-Gaussian Y. When Y is Gaussian and Z̅ converges to Z, this recovers a recent inequality of Bai-Wu-Ozgur on information-constrained optimal transport, previously established using Gaussian-specific techniques. We also use soft-minoration to obtain asymptotically (in tensor order) tight bounds on the free energy in the Sherrington-Kirkpatrick model with spins uniformly distributed on a type class, implying asymptotically tight bounds for the type II error exponent in spiked tensor detection.
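The "soft-max" reading of ρ can be made concrete with an elementary squeeze: for any finite set of values v_i = ⟨y, z_i⟩, the log-mean-exp satisfies max_i v_i − ln m ≤ ln((1/m)Σ_i e^{v_i}) ≤ max_i v_i. The sketch below (not from the paper; the sample size m, dimension n, and Gaussian stand-in for P_Z are illustrative assumptions) checks this numerically:

```python
import numpy as np

# Illustrative sketch: estimate rho(y) = ln E[exp(<y, Z>)] by a log-mean-exp
# over i.i.d. samples, and verify it is squeezed between the hard max of
# <y, z_i> minus ln(m) and the hard max itself -- the sense in which rho
# behaves like a smoothed supremum of the process <y, Z>.

rng = np.random.default_rng(0)
n, m = 5, 1000                         # dimension and sample size (assumed)
Z = rng.standard_normal((m, n))        # samples standing in for P_Z (assumed Gaussian)
y = rng.standard_normal(n)

v = Z @ y                              # the values <y, z_i>
hard_max = v.max()
soft_max = np.log(np.mean(np.exp(v)))  # log-mean-exp estimate of rho(y)

# Elementary bound: max - ln(m) <= log-mean-exp <= max.
assert hard_max - np.log(m) <= soft_max <= hard_max
print(f"hard max = {hard_max:.3f}, soft max = {soft_max:.3f}")
```

As m grows the gap ln m is negligible next to a supremum that scales with the process, which is why expected soft-max bounds of the form studied here parallel Sudakov-type minoration of expected suprema.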