An Equivalence Class for Orthogonal Vectors
The Orthogonal Vectors problem (OV) asks: given n vectors in {0,1}^{O(log n)}, are two of them orthogonal? OV is easily solved in O(n^2 log n) time, and it is a central problem in fine-grained complexity: dozens of conditional lower bounds are based on the popular hypothesis that OV cannot be solved in (say) n^{1.99} time. However, unlike the APSP problem, few other problems are known to be non-trivially equivalent to OV.

We show OV is truly-subquadratic equivalent to several fundamental problems, all of which (a priori) look harder than OV. A partial list is given below:

(Min-IP/Max-IP) Find a red-blue pair of vectors with minimum (respectively, maximum) inner product, among n vectors in {0,1}^{O(log n)}.

(Exact-IP) Find a red-blue pair of vectors with inner product equal to a given target integer, among n vectors in {0,1}^{O(log n)}.

(Apx-Min-IP/Apx-Max-IP) Find a red-blue pair of vectors that is a 100-approximation to the minimum (resp. maximum) inner product, among n vectors in {0,1}^{O(log n)}.

(Approx. Bichrom. ℓ_p-Closest-Pair) Compute a (1 + Ω(1))-approximation to the ℓ_p-closest red-blue pair (for a constant p ∈ [1,2]), among n points in R^d, d < n^{o(1)}.

(Approx. ℓ_p-Furthest-Pair) Compute a (1 + Ω(1))-approximation to the ℓ_p-furthest pair (for a constant p ∈ [1,2]), among n points in R^d, d < n^{o(1)}.

We also show that there is a poly(n)-space, n^{1-ε}-query-time data structure for Partial Match with vectors from {0,1}^{O(log n)} if and only if such a data structure exists for (1 + Ω(1))-Approximate Nearest Neighbor Search in Euclidean space.
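The first paragraph notes that OV is easily solved in O(n^2 log n) time. Below is a minimal brute-force sketch of that baseline, assuming the input is given as n Python lists of 0/1 integers of dimension d = O(log n); the function name and example data are ours, not the paper's, and this illustrates only the trivial quadratic algorithm, not the paper's reductions.

```python
# Brute-force OV check: test every pair of vectors directly, O(n^2 * d) time.
from itertools import combinations
from typing import List, Optional, Tuple

def has_orthogonal_pair(vecs: List[List[int]]) -> Optional[Tuple[int, int]]:
    """Return indices (i, j) of an orthogonal pair, or None if no such pair exists."""
    for (i, u), (j, v) in combinations(enumerate(vecs), 2):
        # u and v are orthogonal iff their inner product over the integers is 0,
        # i.e., they share no coordinate where both are 1.
        if sum(a * b for a, b in zip(u, v)) == 0:
            return (i, j)
    return None

# Example: the first and third vectors have no common 1-coordinate.
print(has_orthogonal_pair([[1, 0, 1], [1, 1, 0], [0, 1, 0]]))  # -> (0, 2)
```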
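For the bichromatic inner-product problems in the list above (Min-IP/Max-IP and Exact-IP), the obvious baselines likewise scan all n^2 red-blue pairs; the equivalences in the paper say that beating this quadratic bound for any of them is equivalent to beating it for OV. The sketch below is only such a baseline, under the same assumptions as before (0/1 vectors as Python lists; all names are ours).

```python
# Brute-force baselines for bichromatic Max-IP and Exact-IP, O(n^2 * d) time.
# Min-IP is identical to max_ip with `max` replaced by `min`.
from typing import List, Optional, Tuple

def inner(u: List[int], v: List[int]) -> int:
    """Integer inner product of two 0/1 vectors."""
    return sum(a * b for a, b in zip(u, v))

def max_ip(reds: List[List[int]], blues: List[List[int]]) -> Tuple[int, int, int]:
    """Return (maximum inner product, red index, blue index) over all red-blue pairs."""
    return max((inner(u, v), i, j)
               for i, u in enumerate(reds)
               for j, v in enumerate(blues))

def exact_ip(reds: List[List[int]], blues: List[List[int]],
             target: int) -> Optional[Tuple[int, int]]:
    """Return a red-blue pair whose inner product equals `target`, or None."""
    for i, u in enumerate(reds):
        for j, v in enumerate(blues):
            if inner(u, v) == target:
                return (i, j)
    return None
```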