Approximate Euclidean lengths and distances beyond Johnson-Lindenstrauss

05/24/2022
by Aleksandros Sobczyk, et al.

A classical result of Johnson and Lindenstrauss (JL) states that a set of n high-dimensional data points can be projected down to O(log n / ϵ^2) dimensions such that the squares of their pairwise distances are preserved up to a small distortion ϵ ∈ (0,1). The JL lemma has been proved optimal in the general case, so improvements can only be sought for special cases. This work aims to improve the ϵ^-2 dependency using techniques inspired by the Hutch++ algorithm, which reduces ϵ^-2 to ϵ^-1 for the related problem of implicit matrix trace estimation. For ϵ = 0.01, for example, this translates to 100 times fewer matrix-vector products in the matrix-vector query model to achieve the same accuracy as previous estimators. We first present an algorithm to estimate the Euclidean lengths of the rows of a matrix. We prove element-wise probabilistic bounds that are at least as good as standard JL approximations in the worst case, and asymptotically better for matrices with decaying spectrum. Moreover, for any matrix, regardless of its spectrum, the algorithm achieves ϵ-accuracy for the total, Frobenius-norm-wise relative error using only O(ϵ^-1) queries, a quadratic improvement over the norm-wise error of standard JL approximations. We finally show how these results extend to estimating the Euclidean distances between data points and to approximating the statistical leverage scores of a tall-and-skinny data matrix, which are ubiquitous in many applications. Proof-of-concept numerical experiments validate the theoretical analysis.
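The abstract does not spell out the algorithm, but the Hutch++-style variance-reduction idea it alludes to can be illustrated in a few lines of NumPy. The sketch below is a minimal illustration, not the authors' exact method: the function name row_norms_deflated, the rank parameter k, and the sketch size m are illustrative assumptions. The component of each row lying in an approximate dominant right-singular subspace is computed exactly, and only the residual is estimated with a small Gaussian (JL-type) sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def row_norms_deflated(A, k=20, m=20):
    """Hutch++-style estimate of the squared Euclidean row lengths of A.

    The component of each row inside an approximate rank-k right-singular
    subspace is computed exactly; only the residual is estimated with an
    m-column Gaussian sketch. (Illustrative sketch, not the paper's exact
    algorithm or notation.)
    """
    n, d = A.shape
    # Randomized range finder for the row space of A (column space of A^T).
    Omega = rng.standard_normal((n, k))
    Q, _ = np.linalg.qr(A.T @ Omega)           # Q: d x k, orthonormal columns
    # Exact part: squared lengths of the rows projected onto span(Q).
    P = A @ Q                                  # n x k
    exact = np.sum(P**2, axis=1)
    # Residual part: rows orthogonal to span(Q), estimated by sketching.
    R = A - P @ Q.T                            # n x d
    S = rng.standard_normal((d, m))
    residual = np.sum((R @ S)**2, axis=1) / m  # unbiased for ||r_i||^2
    return exact + residual

# Quick check on a synthetic matrix with polynomially decaying spectrum,
# where the deflation captures most of the mass and the sketch only has
# to handle a small residual.
n, d = 1000, 500
U, _ = np.linalg.qr(rng.standard_normal((n, d)))
V, _ = np.linalg.qr(rng.standard_normal((d, d)))
s = 1.0 / (1 + np.arange(d))**1.5
A = (U * s) @ V.T
est = row_norms_delated if False else row_norms_deflated(A, k=30, m=30)
true = np.sum(A**2, axis=1)
print("max relative error:", np.max(np.abs(est - true) / true))
```

When the spectrum decays, the residual matrix R carries only a small fraction of the Frobenius norm, so the sketched term contributes little variance; this deflation is the mechanism behind the improved O(ϵ^-1) query complexity claimed in the abstract.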
