A new unified framework for designing convex optimization methods with prescribed theoretical convergence estimates: A numerical analysis approach

02/15/2023
by Kansei Ushiyama et al.

We propose a new unified framework for describing and designing gradient-based convex optimization methods from a numerical analysis perspective. The key is the new concept of weak discrete gradients (weak DGs), which generalizes the discrete gradients standard in numerical analysis. Via weak DGs, we define abstract optimization methods and prove unified convergence rate estimates that hold independently of the choice of weak DG, up to some constants in the final estimate. With particular choices of weak DGs, we recover many popular existing methods, such as steepest descent and Nesterov's accelerated gradient method, as well as some recent variants from the numerical analysis community. By considering new weak DGs, we can easily explore new optimization methods with theoretical guarantees; we show some examples. We believe this work is the first attempt to fully integrate these research branches in optimization and numerical analysis, which have so far been developed independently.
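To make the idea concrete, here is a minimal Python sketch of a discrete-gradient-style descent scheme in the spirit described above. It is not the paper's weak-DG framework itself: the function names (dg_midpoint, dg_explicit, dg_descent) and the fixed-point solve are illustrative assumptions. It uses the classical DG identity f(y) - f(x) = <dg(x, y), y - x>, which the midpoint gradient satisfies exactly for quadratic f, and shows how the explicit choice dg(x, y) = grad f(x) reduces the same abstract update to ordinary steepest descent.

```python
import numpy as np

# Convex quadratic test problem: f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])

def f(x):
    return 0.5 * x @ A @ x - b @ x

def grad(x):
    return A @ x - b

def dg_midpoint(x, y):
    # Midpoint discrete gradient: for quadratic f it satisfies the exact
    # DG identity  f(y) - f(x) = <dg(x, y), y - x>.
    return grad(0.5 * (x + y))

def dg_explicit(x, y):
    # "Weak" explicit choice: using only grad f(x) turns the abstract
    # scheme below into plain steepest descent.
    return grad(x)

def dg_descent(dg, x0, h=0.2, n_steps=50, fp_iters=30):
    """Abstract scheme  x_{k+1} = x_k - h * dg(x_k, x_{k+1}),
    with the implicit equation solved by fixed-point iteration."""
    x = x0.copy()
    for _ in range(n_steps):
        y = x.copy()
        for _ in range(fp_iters):
            y = x - h * dg(x, y)
        x = y
    return x

x0 = np.array([2.0, 2.0])
x_star = np.linalg.solve(A, b)  # exact minimizer, for reference
for name, dg in [("midpoint DG", dg_midpoint),
                 ("explicit choice (steepest descent)", dg_explicit)]:
    x = dg_descent(dg, x0)
    print(f"{name}: f(x) - f(x*) = {f(x) - f(x_star):.3e}")
```

The point of the sketch is the design pattern: the update rule is fixed once, and swapping in different (weak) discrete gradients yields different concrete methods, each inheriting a decrease property from the DG identity. The paper's contribution, as the abstract states, is a relaxed DG notion and convergence rates proved uniformly over such choices.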
