Secant Penalized BFGS: A Noise Robust Quasi-Newton Method Via Penalizing The Secant Condition

10/03/2020
by   Brian Irwin, et al.

In this paper, we introduce a new variant of the BFGS method designed to perform well when gradient measurements are corrupted by noise. We show that by treating the secant condition with a penalty method approach, one can smoothly interpolate between updating the inverse Hessian approximation with the original BFGS update formula and leaving the inverse Hessian approximation unchanged. Furthermore, we find that the curvature condition is smoothly relaxed as the interpolation moves towards not updating the inverse Hessian approximation, disappearing entirely when the approximation is left unchanged. These developments yield an algorithm we call secant penalized BFGS (SP-BFGS), which relaxes the secant condition according to the amount of noise in the gradient measurements. Mathematically, SP-BFGS provides a means of incrementally updating the new inverse Hessian approximation with a controlled amount of bias towards the previous inverse Hessian approximation. Practically speaking, this replaces the overwriting nature of the BFGS update with an averaging nature that resists the destructive effects of noise. We provide a convergence analysis of SP-BFGS and present numerical results illustrating its performance in the presence of noisy gradients.
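The interpolation the abstract describes can be illustrated with a minimal sketch. The helper `sp_bfgs_like_update` below is a hypothetical, simplified stand-in, not the paper's exact SP-BFGS formula: it takes a convex combination, controlled by an assumed parameter `lam`, between keeping the previous inverse Hessian approximation (`lam = 0`) and applying the standard BFGS update (`lam = 1`), mirroring the described bias towards the previous approximation.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """Standard BFGS update of the inverse Hessian approximation H,
    given step s = x_{k+1} - x_k and gradient change y = g_{k+1} - g_k."""
    rho = 1.0 / (y @ s)  # assumes the curvature condition y^T s > 0 holds
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # The updated H satisfies the secant condition H_new @ y == s exactly.
    return V @ H @ V.T + rho * np.outer(s, s)

def sp_bfgs_like_update(H, s, y, lam):
    """Illustrative interpolation (NOT the paper's SP-BFGS formula):
    lam = 1 recovers the standard BFGS update, while lam = 0 leaves H
    unchanged, biasing the result towards the previous approximation."""
    return (1.0 - lam) * H + lam * bfgs_inverse_update(H, s, y)
```

With intermediate values of `lam`, the result averages the old approximation with the fully updated one, so a single noisy gradient difference `y` cannot completely overwrite the accumulated curvature information; the paper's actual update instead penalizes the secant condition directly with a noise-dependent penalty parameter.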
