Optimal Extragradient-Based Bilinearly-Coupled Saddle-Point Optimization
We consider the smooth convex-concave bilinearly-coupled saddle-point problem min_𝐱 max_𝐲 F(𝐱) + H(𝐱,𝐲) - G(𝐲), where one has access to stochastic first-order oracles for F and G as well as for the bilinear coupling term H. Building on the standard stochastic extragradient analysis for variational inequalities, we present a stochastic accelerated gradient-extragradient (AG-EG) descent-ascent algorithm that combines extragradient steps with Nesterov acceleration in general stochastic settings. The algorithm leverages scheduled restarting to achieve a fine-grained nonasymptotic convergence rate that matches the known lower bounds of <cit.> and <cit.> in their respective settings, plus an additional statistical error term that, under bounded stochastic noise, is optimal up to a constant prefactor. To our knowledge, this is the first result to achieve such a relatively complete characterization of optimality in saddle-point optimization.
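To make the extragradient building block concrete, the following is a minimal sketch of a plain stochastic extragradient loop on a bilinearly-coupled quadratic toy problem. It is not the paper's full AG-EG method, which additionally interleaves Nesterov-style acceleration and scheduled restarting; the problem data (A1, A2, B), noise level sigma, and step size eta are all hypothetical choices for illustration only.

```python
import numpy as np

# Sketch: stochastic extragradient (EG) for
#   min_x max_y  F(x) + x^T B y - G(y)
# with F(x) = 0.5 * x^T A1 x and G(y) = 0.5 * y^T A2 y (both smooth, convex),
# so the unique saddle point of this toy instance is (0, 0).

rng = np.random.default_rng(0)
d = 5
A1 = np.eye(d)                                 # Hessian of F
A2 = np.eye(d)                                 # Hessian of G
B = rng.standard_normal((d, d)) / np.sqrt(d)   # bilinear coupling H(x, y) = x^T B y
sigma = 0.1                                    # bounded stochastic oracle noise (assumed)
eta = 0.1                                      # step size (assumed)

def stochastic_oracle(x, y):
    """Noisy first-order oracle: gradients of the saddle function at (x, y)."""
    gx = A1 @ x + B @ y + sigma * rng.standard_normal(d)    # grad_x
    gy = B.T @ x - A2 @ y + sigma * rng.standard_normal(d)  # grad_y (ascent direction)
    return gx, gy

x, y = np.ones(d), np.ones(d)
for t in range(2000):
    # Extrapolation step: probe the operator at a look-ahead point.
    gx, gy = stochastic_oracle(x, y)
    x_half, y_half = x - eta * gx, y + eta * gy
    # Update step: move from (x, y) using the look-ahead gradients.
    gx, gy = stochastic_oracle(x_half, y_half)
    x, y = x - eta * gx, y + eta * gy

print("distance to saddle point:", np.hypot(np.linalg.norm(x), np.linalg.norm(y)))
```

With sigma = 0 the iterates contract toward the saddle point; with noise they settle into a neighborhood whose size scales with sigma, which is the statistical error term the abstract refers to.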