Fast Algorithms for ℓ_p-Regression
The ℓ_p-norm regression problem is a classic problem in optimization with wide-ranging applications in machine learning and theoretical computer science. The goal is to compute x^⋆ = argmin_{Ax=b} ‖x‖_p^p, where x^⋆ ∈ ℝ^n, A ∈ ℝ^{d×n}, b ∈ ℝ^d, and d ≤ n. Efficient high-accuracy algorithms for the problem have been challenging both in theory and practice, and the state-of-the-art algorithms require poly(p)·n^{1/2−1/p} linear system solves for p ≥ 2. In this paper, we provide new algorithms for ℓ_p-regression (and a more general formulation of the problem) that obtain a high-accuracy solution in O(p·n^{(p−2)/(3p−2)}) linear system solves. We further propose a new inverse maintenance procedure that speeds up our algorithm to O(n^ω) total runtime, where n^ω denotes the time required to multiply two n × n matrices. Additionally, we give the first Iteratively Reweighted Least Squares (IRLS) algorithm that is guaranteed to converge to an optimum in a few iterations. Our IRLS algorithm has shown exceptional practical performance, beating the currently available implementations in MATLAB/CVX by 10–50x.
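To give intuition for the IRLS template the abstract refers to, here is a minimal sketch in Python/NumPy of the classical (undamped) IRLS heuristic for min_{Ax=b} ‖x‖_p^p with p ≥ 2. This is illustrative only: it is not the paper's guaranteed-convergence variant, the function name irls_lp and its parameters are hypothetical, and it assumes A has full row rank. Each iteration reweights by w_i = |x_i|^{p−2} and solves one d × d linear system, which is exactly the "linear system solve" unit in which the iteration counts above are stated.

```python
import numpy as np

def irls_lp(A, b, p=4, iters=100, eps=1e-8):
    """Plain IRLS sketch for min ||x||_p^p subject to Ax = b, p >= 2.

    Classical heuristic without the damping/regularization needed for
    convergence guarantees; assumes A (shape d x n, d <= n) has full row rank.
    """
    d, n = A.shape
    # Feasible start: minimum l2-norm solution of Ax = b.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        # Reweight: w_i = |x_i|^{p-2}, floored at eps for numerical stability.
        w = np.maximum(np.abs(x), eps) ** (p - 2)
        # Constrained weighted least squares: min x^T W x s.t. Ax = b has the
        # closed form x = W^{-1} A^T (A W^{-1} A^T)^{-1} b,
        # i.e. one linear system solve in the d x d matrix A W^{-1} A^T.
        Winv_At = A.T / w[:, None]            # W^{-1} A^T, shape (n, d)
        lam = np.linalg.solve(A @ Winv_At, b)  # dual variables
        x = Winv_At @ lam                      # stays feasible: Ax = b
    return x

# Usage on synthetic data (feasibility should hold to machine precision):
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 100))
b = A @ rng.standard_normal(100)
x = irls_lp(A, b, p=8)
print(np.linalg.norm(A @ x - b))
```

Since each step is a weighted least-squares solve, the per-iteration cost matches that of the high-accuracy algorithms above; the paper's contribution on the IRLS side is a modification of this template that provably converges in few such iterations, which the plain version here does not guarantee.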