Faster Convex Lipschitz Regression via 2-block ADMM

11/02/2021
by Ali Siahkamari, et al.

The task of approximating an arbitrary convex function arises in several learning problems, such as convex regression, learning with a difference of convex (DC) functions, and approximating Bregman divergences. In this paper, we show how a broad class of convex function learning problems can be solved via a 2-block ADMM approach, where the updates for each block can be computed in closed form. For the task of convex Lipschitz regression, we establish that our proposed algorithm converges at the rate of O(n^3 d^1.5 + n^2 d^2.5 + n d^3) for a dataset X ∈ R^{n×d}. This new rate improves on the state-of-the-art O(n^5 d^2) rate of interior point methods whenever d = o(n^4). Further, we provide similar solvers for DC regression and Bregman divergence learning. Unlike previous approaches, our method is amenable to the use of GPUs. We demonstrate on regression and metric learning experiments that our approach is up to 20 times faster than the existing method and produces results comparable to the state of the art.
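The convex Lipschitz regression task mentioned in the abstract is typically posed as a quadratic program over fitted values θ_i and subgradients g_i at the data points, with first-order convexity constraints. The sketch below is not the paper's 2-block ADMM solver; it is a minimal illustration of that underlying problem using a generic solver (cvxpy), and the squared loss, Euclidean Lipschitz bound L, and synthetic data are assumptions for the example rather than details taken from the abstract.

```python
import numpy as np
import cvxpy as cp

# Synthetic data (hypothetical): a noisy convex target f(x) = ||x||^2.
rng = np.random.default_rng(0)
n, d = 30, 2
X = rng.normal(size=(n, d))
y = np.sum(X ** 2, axis=1) + 0.1 * rng.normal(size=n)

# Decision variables: fitted values theta_i and subgradients g_i at each data point.
theta = cp.Variable(n)
G = cp.Variable((n, d))
L = 5.0  # assumed Lipschitz bound on the fitted function

# First-order convexity constraints: theta_j >= theta_i + g_i^T (x_j - x_i) for all i != j,
# plus a bound on subgradient norms to enforce Lipschitz continuity.
constraints = [theta[j] >= theta[i] + G[i] @ (X[j] - X[i])
               for i in range(n) for j in range(n) if i != j]
constraints.append(cp.norm(G, 2, axis=1) <= L)

# Squared-loss convex regression solved here by an off-the-shelf conic solver,
# not by the closed-form 2-block ADMM updates described in the paper.
prob = cp.Problem(cp.Minimize(cp.sum_squares(theta - y)), constraints)
prob.solve()

# The fitted convex function is the max-affine interpolant of the solution.
def f_hat(x):
    return np.max(theta.value + G.value @ x - np.sum(G.value * X, axis=1))

print("training MSE:", np.mean((theta.value - y) ** 2))
print("f_hat at origin:", f_hat(np.zeros(d)))
```

Note that this generic formulation has on the order of n^2 pairwise constraints, which is the kind of scaling bottleneck that motivates specialized solvers such as the one proposed in the paper.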
