Scalar and Matrix Chernoff Bounds from ℓ_∞-Independence
We present new scalar and matrix Chernoff-style concentration bounds for a broad class of probability distributions over the binary hypercube {0,1}^n. Motivated by recent tools developed for the study of mixing times of Markov chains on discrete distributions, we say that a distribution is ℓ_∞-independent when the infinity norm of its influence matrix ℐ is bounded by a constant. We show that any ℓ_∞-independent distribution satisfies a matrix Chernoff bound that matches the matrix Chernoff bound for independent random variables due to Tropp. Our matrix Chernoff bound is a broad generalization and strengthening of the matrix Chernoff bound of Kyng and Song (FOCS'18). As a corollary of our bound, a union of O(log|V|) random spanning trees gives a spectral sparsifier of a graph with |V| vertices with high probability, matching results for independent edge sampling and matching lower bounds from Kyng and Song.
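For concreteness, the following display sketches the influence-matrix formulation commonly used in the ℓ_∞-independence literature, which we believe matches the setting above; the notation ℐ_μ and the constant D are ours and may differ slightly from the paper's conventions.

\[
  \mathcal{I}_\mu(i,j) \;=\; \Pr_{x \sim \mu}\bigl[x_j = 1 \mid x_i = 1\bigr] \;-\; \Pr_{x \sim \mu}\bigl[x_j = 1 \mid x_i = 0\bigr], \qquad i \neq j,
\]
\[
  \|\mathcal{I}_\mu\|_\infty \;=\; \max_{i \in [n]} \sum_{j=1}^{n} \bigl|\mathcal{I}_\mu(i,j)\bigr| \;\le\; D \quad \text{for a constant } D \;\;\Longrightarrow\;\; \mu \text{ is } \ell_\infty\text{-independent.}
\]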