Sparse random tensors: concentration, regularization and applications
We prove a non-asymptotic concentration inequality for sparse inhomogeneous random tensors under the spectral norm. For an order-k inhomogeneous random tensor T with sparsity p_max ≥ c log(n)/n^{k-1}, we show that ‖T − E T‖ = O(√(n p_max)) with high probability. We also provide a simple way to regularize T such that O(√(n p_max)) concentration still holds down to sparsity p_max > c/n^{k-1}. Our proofs are based on the techniques of Friedman, Kahn and Szemerédi (1989) and Feige and Ofek (2005), combined with the discrepancy theory of random hypergraphs. We also show that our concentration inequality is rate optimal in the minimax sense. We illustrate our concentration and regularization results with three applications: (i) a randomized construction of hypergraphs of bounded degree with good expander mixing properties, (ii) concentration of the adjacency matrices of sparse random hypergraphs, and (iii) concentration of sparsified tensors under uniform sampling.
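As a rough numerical illustration (not part of the paper), the following Python sketch generates an order-3 inhomogeneous Bernoulli tensor with maximum entrywise probability p_max, and compares an approximation of the spectral-norm deviation ‖T − E T‖ against √(n p_max). The spectral norm is only approximated here via a simple alternating power iteration; the function and parameter names are illustrative assumptions.

```python
# Hedged sketch: empirical check of the sqrt(n * p_max) scaling for an
# order-3 sparse inhomogeneous random tensor. Not the paper's method.
import numpy as np

def spectral_norm_order3(T, iters=200, seed=0):
    """Approximate sup_{|x|=|y|=|z|=1} |T(x, y, z)| by alternating power iteration."""
    rng = np.random.default_rng(seed)
    n = T.shape[0]
    x, y, z = (rng.standard_normal(n) for _ in range(3))
    x, y, z = (v / np.linalg.norm(v) for v in (x, y, z))
    for _ in range(iters):
        x = np.einsum('ijk,j,k->i', T, y, z); x /= np.linalg.norm(x)
        y = np.einsum('ijk,i,k->j', T, x, z); y /= np.linalg.norm(y)
        z = np.einsum('ijk,i,j->k', T, x, y); z /= np.linalg.norm(z)
    return abs(np.einsum('ijk,i,j,k->', T, x, y, z))

n, p_max = 60, 0.05                                   # illustrative choices
rng = np.random.default_rng(1)
P = p_max * rng.uniform(0.5, 1.0, size=(n, n, n))     # inhomogeneous probabilities <= p_max
T = (rng.uniform(size=(n, n, n)) < P).astype(float)   # Bernoulli(P) entries
deviation = spectral_norm_order3(T - P)
print(f"||T - E T|| ~ {deviation:.3f},  sqrt(n * p_max) = {np.sqrt(n * p_max):.3f}")
```

With these parameters the sparsity is well above the c log(n)/n^{k-1} threshold stated in the abstract, so the measured deviation should be of the same order as √(n p_max), up to constants.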