Preconditioned accelerated gradient descent methods for locally Lipschitz smooth objectives with applications to the solution of nonlinear PDEs

06/11/2020
by Jea-Hyun Park, et al.

We analyze preconditioned Nesterov accelerated gradient descent (PAGD) methods for approximating the minimizer of locally Lipschitz smooth, strongly convex objective functionals. To facilitate our analysis, we introduce a second-order ordinary differential equation (ODE) and demonstrate that this ODE is the limiting case of PAGD as the step size tends to zero. Using a simple energy argument, we show exponential convergence of the ODE solution to its steady state. The PAGD method may then be viewed as an explicit-type time discretization of this ODE system, which requires a natural time-step restriction for energy stability. Under this restriction, an exponential rate of convergence of the PAGD sequence is demonstrated by mimicking, via energy methods, the convergence of the solution to the ODE. The PAGD method is then applied to the solution of certain nonlinear elliptic PDEs using pseudo-spectral methods, and several numerical experiments are conducted. The results confirm the global, geometric, h-independent convergence of the PAGD method, with an accelerated rate that improves on the preconditioned gradient descent (PGD) method.
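As a rough illustration of the kind of iteration described above, the following is a minimal sketch of a generic preconditioned Nesterov-type step in Python. The quadratic objective, the Jacobi (diagonal) preconditioner, the fixed step size, and the fixed momentum parameter are illustrative assumptions only; they are not the scheme, parameters, or PDE application analyzed in the paper.

```python
import numpy as np

def pagd(grad_f, precond_solve, x0, step, momentum, num_iters):
    """Sketch of a preconditioned accelerated (Nesterov-type) iteration.

    `precond_solve(r)` applies the inverse of a preconditioner to a
    residual r. The momentum parameter is held fixed here, which is a
    simplification relative to the scheme analyzed in the paper.
    """
    x = x0.copy()
    y = x0.copy()  # extrapolated ("look-ahead") point
    for _ in range(num_iters):
        # preconditioned gradient step at the extrapolated point
        x_new = y - step * precond_solve(grad_f(y))
        # Nesterov extrapolation toward the direction of progress
        y = x_new + momentum * (x_new - x)
        x = x_new
    return x

# Illustrative example: strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# preconditioned by the diagonal of A (Jacobi preconditioner).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad_f = lambda x: A @ x - b
precond_solve = lambda r: r / np.diag(A)
x_min = pagd(grad_f, precond_solve, np.zeros(2), step=0.4, momentum=0.5, num_iters=200)
```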
