Projection-Free Online Convex Optimization via Efficient Newton Iterations

06/19/2023
by   Khashayar Gatmiry, et al.

This paper presents new projection-free algorithms for Online Convex Optimization (OCO) over a convex domain 𝒦⊂ℝ^d. Classical OCO algorithms (such as Online Gradient Descent) typically need to perform Euclidean projections onto the convex set to ensure feasibility of their iterates. Alternative algorithms, such as those based on the Frank-Wolfe method, swap potentially expensive Euclidean projections onto 𝒦 for linear optimization over 𝒦. However, such algorithms have sub-optimal regret in OCO compared to projection-based algorithms. In this paper, we look at a third type of algorithm, which outputs approximate Newton iterates using a self-concordant barrier for the set of interest. The use of a self-concordant barrier automatically ensures feasibility without the need for projections. However, computing the Newton iterates requires a matrix inverse, which can still be expensive. As our main contribution, we show how the stability of the Newton iterates can be leveraged to compute the inverse Hessian only on a vanishing fraction of the rounds, leading to a new efficient projection-free OCO algorithm with a state-of-the-art regret bound.
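The abstract's core idea, barrier-based Newton steps whose expensive Hessian inverse is refreshed only occasionally, can be illustrated with a minimal sketch. This is not the paper's algorithm: it uses an assumed box domain with its standard log-barrier, a fixed step size `eta`, and a hypothetical `refresh_every` schedule in place of the paper's stability-driven criterion.

```python
import numpy as np

def barrier_grad_hess(x, lo, hi):
    """Gradient and Hessian of the log-barrier for the box [lo, hi]^d:
    phi(x) = -sum(log(x - lo) + log(hi - x))."""
    g = -1.0 / (x - lo) + 1.0 / (hi - x)
    h = np.diag(1.0 / (x - lo) ** 2 + 1.0 / (hi - x) ** 2)
    return g, h

def lazy_newton_oco(loss_grads, lo=0.0, hi=1.0, eta=0.05, refresh_every=5):
    """Barrier-regularized Newton-style OCO sketch: iterates stay strictly
    inside the box without Euclidean projections, and the inverse Hessian
    is recomputed only every `refresh_every` rounds (an assumed schedule
    standing in for the paper's stability-based reuse)."""
    d = len(loss_grads[0])
    x = np.full(d, (lo + hi) / 2.0)  # strictly feasible starting point
    H_inv = None
    iterates = []
    for t, g_t in enumerate(loss_grads):
        bg, bh = barrier_grad_hess(x, lo, hi)
        if H_inv is None or t % refresh_every == 0:
            H_inv = np.linalg.inv(bh)        # the expensive step, done rarely
        step = H_inv @ (g_t + eta * bg)      # approximate Newton direction
        x = x - eta * step
        # Numerical safeguard only; with a properly damped Newton step the
        # self-concordant barrier keeps the iterate interior by itself.
        x = np.clip(x, lo + 1e-9, hi - 1e-9)
        iterates.append(x.copy())
    return iterates
```

The sketch shows the trade the paper exploits: because consecutive iterates move little, a stale inverse Hessian remains a good preconditioner, so the O(d^3) inversion is amortized across many rounds.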
