An homotopy method for ℓ_p regression provably beyond self-concordance and in input-sparsity time

11/03/2017
by Sébastien Bubeck, et al.

We consider the problem of linear regression where the ℓ_2^n norm loss (i.e., the usual least squares loss) is replaced by the ℓ_p^n norm. We show how to solve such problems up to machine precision in O^*(n^{|1/2 - 1/p|}) (dense) matrix-vector products and O^*(1) matrix inversions, or alternatively in O^*(n^{|1/2 - 1/p|}) calls to a (sparse) linear system solver. This improves the state of the art for any p ∉ {1, 2, +∞}. Furthermore, we also propose a randomized algorithm solving such problems in input-sparsity time, i.e., O^*(Z + poly(d)) where Z is the size of the input and d is the number of variables. Such a result was previously known only for p = 2. Finally, we prove that these results lie outside the scope of the Nesterov-Nemirovski theory of interior point methods by showing that any symmetric self-concordant barrier on the ℓ_p^n unit ball has self-concordance parameter Ω̃(n).
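
To make the problem statement concrete, the Python/NumPy sketch below is a minimal baseline for ℓ_p regression, min_x ‖Ax − b‖_p, using iteratively reweighted least squares warm-started along a crude homotopy in p from the ℓ_2 solution. It is not the paper's algorithm and does not achieve the O^*(n^{|1/2 - 1/p|}) rate; the function name, the clamping constant eps, and the number of homotopy stages are illustrative choices.

# Minimal, illustrative sketch (not the paper's method) of l_p regression
# min_x ||A x - b||_p via iteratively reweighted least squares (IRLS),
# warm-started along a simple homotopy in p from the l_2 solution.
import numpy as np

def lp_regression_irls(A, b, p, stages=10, inner_iters=20, eps=1e-10):
    """Approximately minimize ||A x - b||_p for p in (1, 2]; no guarantees."""
    n, d = A.shape
    x = np.linalg.lstsq(A, b, rcond=None)[0]          # exact p = 2 solution as a start
    for p_t in np.linspace(2.0, p, num=stages):       # homotopy path from 2 down to p
        for _ in range(inner_iters):
            r = A @ x - b
            w = np.maximum(np.abs(r), eps) ** (p_t - 2.0)   # IRLS weights |r_i|^{p-2}
            AwA = A.T @ (A * w[:, None])                    # normal matrix A^T W A
            x = np.linalg.solve(AwA + eps * np.eye(d), A.T @ (w * b))
    return x

# Toy usage on random data.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((500, 20)), rng.standard_normal(500)
x = lp_regression_irls(A, b, p=1.5)
print("l_1.5 loss:", np.sum(np.abs(A @ x - b) ** 1.5))

Each inner step solves a weighted least-squares problem, which is why the complexity of ℓ_p regression is naturally measured in calls to a linear system solver, as in the bounds stated above.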
