Space lower bounds for linear prediction

02/09/2019
by Yuval Dagan, et al.

We show that fundamental learning tasks, such as finding an approximate linear separator or performing linear regression, require memory at least quadratic in the dimension in a natural streaming setting. This implies that such problems cannot be solved (at least in this setting) by scalable memory-efficient streaming algorithms. Our results build on a memory lower bound for a simple linear-algebraic problem -- finding orthogonal vectors -- and utilize estimates on packings of the Grassmannian, the manifold of all linear subspaces of a fixed dimension.
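For context, quadratic memory is also sufficient for streaming linear regression: the classical one-pass algorithm maintains the sufficient statistics X^T X and X^T y, which occupy O(d^2) space. The sketch below (not from the paper; function name and setup are illustrative) shows this standard upper bound, which the lower bound above matches in order.

```python
import numpy as np

def streaming_least_squares(stream, d):
    """Exact one-pass linear regression using O(d^2) memory.

    Maintains the sufficient statistics X^T X (a d x d matrix) and
    X^T y (a d-vector), so the memory used is quadratic in the
    dimension d -- the order shown to be necessary by the lower bound.
    """
    xtx = np.zeros((d, d))
    xty = np.zeros(d)
    for x, y in stream:  # one pass over (example, label) pairs
        xtx += np.outer(x, x)
        xty += y * x
    # Solve the normal equations; the pseudo-inverse handles rank deficiency.
    return np.linalg.pinv(xtx) @ xty
```

Note that storing X^T X, rather than the raw stream, is what keeps the memory independent of the number of examples; the result above says this quadratic dependence on d cannot be improved in this setting.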
