Accelerating Multigrid Optimization via SESOP

12/17/2018
by Tao Hong, et al.

A merger of two optimization frameworks is introduced: SEquential Subspace OPtimization (SESOP) with MultiGrid (MG) optimization. At each iteration of the combined algorithm, search directions implied by the coarse-grid correction process of MG are added to the low-dimensional search spaces of SESOP, which include the (preconditioned) gradient and search directions involving the previous iterates (so-called history). The resulting accelerated technique is called SESOP-MG. The asymptotic convergence rate of the two-level version of SESOP-MG (dubbed SESOP-TG) is studied via Fourier mode analysis for linear problems (i.e., optimization of quadratic functionals). Numerical tests on linear and nonlinear problems demonstrate the effectiveness of the approach.
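The subspace step underlying SESOP can be sketched for a quadratic objective. The following is a minimal illustration, not the authors' implementation: the subspace here contains only the current gradient and the previous step (the "history"), omitting the coarse-grid directions that SESOP-MG would add. For a quadratic f(x) = ½xᵀAx − bᵀx, the optimal combination of the subspace directions has a closed form; all names are illustrative.

```python
import numpy as np

def sesop_quadratic(A, b, x0, iters=50):
    """Sketch of a SESOP-style iteration on f(x) = 0.5 x^T A x - b^T x.

    Each iteration minimizes f exactly over the affine subspace spanned
    by the current gradient and the previous step. SESOP-MG would enlarge
    this subspace with coarse-grid correction directions (omitted here).
    """
    x = x0.astype(float)
    prev_step = None
    for _ in range(iters):
        g = A @ x - b                       # gradient of the quadratic
        dirs = [g] if prev_step is None else [g, prev_step]
        D = np.column_stack(dirs)           # subspace basis, shape (n, k)
        # Minimize f(x + D a) over coefficients a: (D^T A D) a = -D^T g
        M = D.T @ A @ D
        a = np.linalg.lstsq(M, -D.T @ g, rcond=None)[0]
        step = D @ a
        x = x + step
        prev_step = step
    return x

# Usage on a small symmetric positive-definite system
rng = np.random.default_rng(0)
Q = rng.standard_normal((20, 20))
A = Q @ Q.T + 20 * np.eye(20)
b = rng.standard_normal(20)
x = sesop_quadratic(A, b, np.zeros(20))
residual = np.linalg.norm(A @ x - b)
```

With this two-direction subspace on a quadratic, the iteration reproduces conjugate-gradient-like behavior, which is one reason history directions are part of the SESOP search space.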
