Memory-Constrained Algorithms for Convex Optimization via Recursive Cutting-Planes

06/16/2023
by Moïse Blanchard, et al.

We propose a family of recursive cutting-plane algorithms to solve feasibility problems with constrained memory, which can also be used for first-order convex optimization. Precisely, in order to find a point within a ball of radius ϵ with a separation oracle in dimension d (or to minimize 1-Lipschitz convex functions to accuracy ϵ over the unit ball), our algorithms use 𝒪((d^2/p) ln(1/ϵ)) bits of memory and make 𝒪((C (d/p) ln(1/ϵ))^p) oracle calls, for some universal constant C ≥ 1. The family is parametrized by p ∈ [d] and provides an oracle-complexity/memory trade-off in the sub-polynomial regime ln(1/ϵ) ≫ ln d. While several works gave lower-bound trade-offs (impossibility results), we make their dependence on ln(1/ϵ) explicit here, showing that these lower bounds also hold in any sub-polynomial regime; to the best of our knowledge, this is the first class of algorithms that provides a positive trade-off between gradient descent and cutting-plane methods in any regime with ϵ ≤ 1/√d. The algorithms divide the d variables into p blocks and optimize over the blocks sequentially, constructing approximate separation vectors with a variant of Vaidya's method. In the regime ϵ ≤ d^(-Ω(d)), our algorithm with p = d achieves the information-theoretically optimal memory usage and improves on the oracle-complexity of gradient descent.
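
For intuition, the two endpoints of the family follow by substituting p = 1 and p = d into the bounds stated above; this is an illustrative sketch derived from the abstract's formulas, not an additional claim of the paper:

\[
p = 1:\quad \mathcal{O}\!\left(d^2 \ln\tfrac{1}{\epsilon}\right) \text{ bits of memory},\qquad \mathcal{O}\!\left(C d \ln\tfrac{1}{\epsilon}\right) \text{ oracle calls},
\]
\[
p = d:\quad \mathcal{O}\!\left(d \ln\tfrac{1}{\epsilon}\right) \text{ bits of memory},\qquad \mathcal{O}\!\left(\left(C \ln\tfrac{1}{\epsilon}\right)^{d}\right) \text{ oracle calls}.
\]

The p = 1 endpoint matches the memory and oracle-complexity profile of classical cutting-plane methods, while p = d attains the 𝒪(d ln(1/ϵ)) memory floor at the cost of exponentially many oracle calls; per the abstract, the latter still improves on gradient descent's 𝒪(1/ϵ^2) oracle calls once ϵ ≤ d^(-Ω(d)).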
