Accelerated Sparse Recovery via Gradient Descent with Nonlinear Conjugate Gradient Momentum

08/25/2022
by Mengqi Hu, et al.

This paper applies an adaptive-momentum idea from the nonlinear conjugate gradient (NCG) method to accelerate optimization problems arising in sparse recovery. Specifically, we consider two types of minimization problems: minimizing a (single) differentiable function, and minimizing the sum of a non-smooth function and a differentiable function. In the first case, we adopt a fixed step size to avoid the traditional line search and establish a convergence analysis of the proposed algorithm for a quadratic problem. In the second case, this acceleration is combined with an operator-splitting technique to handle the non-smooth function. We use the convex ℓ_1 and the nonconvex ℓ_1-ℓ_2 functionals as two case studies to demonstrate the efficiency of the proposed approaches over traditional methods.
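To make the abstract concrete, here is a minimal sketch, not the authors' implementation, of fixed-step gradient descent with NCG-style momentum on the quadratic f(x) = 0.5‖Ax − b‖², together with an operator-splitting variant for the ℓ_1 case. The Fletcher-Reeves choice of the momentum coefficient, the soft-thresholding proximal step, and all function names and parameters are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def ncg_momentum_quadratic(A, b, step, iters=500):
    """Fixed-step gradient descent with NCG-style (Fletcher-Reeves) momentum
    on f(x) = 0.5*||Ax - b||^2. A sketch, not the paper's exact scheme."""
    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)            # gradient of 0.5*||Ax - b||^2
    d = -g                           # initial search direction
    for _ in range(iters):
        x = x + step * d             # fixed step size, no line search
        g_new = A.T @ (A @ x - b)
        beta = (g_new @ g_new) / (g @ g + 1e-16)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d        # momentum-corrected direction
        g = g_new
    return x

def prox_ncg_l1(A, b, lam, step, iters=500):
    """Operator-splitting variant for 0.5*||Ax - b||^2 + lam*||x||_1:
    a momentum step followed by the l1 proximal (soft-thresholding) map."""
    soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - t, 0.0)
    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)
    d = -g
    for _ in range(iters):
        x = soft(x + step * d, step * lam)   # proximal (soft-threshold) step
        g_new = A.T @ (A @ x - b)
        beta = (g_new @ g_new) / (g @ g + 1e-16)
        d = -g_new + beta * d
        g = g_new
    return x

# Example on a synthetic sparse-recovery instance (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, 5, replace=False)] = 1.0
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L with L the Lipschitz constant of the gradient
x_hat = prox_ncg_l1(A, b, lam=0.01, step=step)
```

The fixed step 1/L mirrors the abstract's point about avoiding a line search; the nonconvex ℓ_1-ℓ_2 case studied in the paper would replace the soft-thresholding map with the corresponding ℓ_1-ℓ_2 proximal operator.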
