Local convergence of alternating low-rank optimization methods with overrelaxation

11/29/2021
by Ivan V. Oseledets, et al.

The local convergence of alternating optimization methods with overrelaxation for low-rank matrix and tensor problems is established. The analysis is based on the linearization of the method, which takes the form of an SOR iteration for a positive semidefinite Hessian and can be studied in the corresponding quotient geometry of equivalent low-rank representations. In the matrix case, the optimal relaxation parameter for accelerating the local convergence can be determined from the convergence rate of the standard method. This result relies on a version of Young's SOR theorem for positive semidefinite 2 × 2 block systems.

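The abstract describes overrelaxed alternating optimization in which, for the matrix case, the optimal relaxation parameter follows from the convergence rate of the standard alternating method via a Young-type relation; in classical Young theory for 2 × 2 block systems this reads ω_opt = 2 / (1 + √(1 − ρ)), with ρ the rate of the plain (Gauss–Seidel-type) sweep. Below is a minimal, hypothetical Python sketch of that idea for the best rank-r matrix approximation problem min ‖A − U Vᵀ‖_F: plain ALS, an overrelaxed variant, and a crude estimate of ρ from step-size ratios. The function names (`als`, `young_omega`, `iters_to_tol`), the warm start, and the rate-estimation heuristic are illustrative assumptions, not the paper's algorithm or code.

```python
# A minimal sketch of the idea, not code from the paper: alternating least
# squares (ALS) for the best rank-r approximation A ~ U V^T, with each
# half-step overrelaxed in SOR fashion and the relaxation parameter chosen
# by a Young-type formula from an estimated rate of plain ALS.
import numpy as np


def als(A, r, omega=1.0, n_iter=150, U0=None, V0=None, seed=0):
    """Overrelaxed ALS for min ||A - U V^T||_F; omega = 1 is standard ALS.

    Returns the factors and the Frobenius norms of the changes of the
    product U V^T, which is invariant under the non-unique factorization."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    U = rng.standard_normal((m, r)) if U0 is None else U0.copy()
    V = rng.standard_normal((n, r)) if V0 is None else V0.copy()
    X_prev = U @ V.T
    steps = []
    for _ in range(n_iter):
        # U half-step: exact least-squares solution for fixed V, then relax.
        U_ls = np.linalg.lstsq(V, A.T, rcond=None)[0].T
        U = U + omega * (U_ls - U)
        # V half-step: exact least-squares solution for fixed U, then relax.
        V_ls = np.linalg.lstsq(U, A, rcond=None)[0].T
        V = V + omega * (V_ls - V)
        X = U @ V.T
        steps.append(np.linalg.norm(X - X_prev, "fro"))
        X_prev = X
    return U, V, np.array(steps)


def young_omega(rho):
    """Relaxation parameter suggested by classical Young theory when rho is
    the (local) linear convergence rate of the standard alternating method."""
    return 2.0 / (1.0 + np.sqrt(1.0 - rho))


def iters_to_tol(steps, tol=1e-8):
    """First iteration at which the step size has dropped by the factor tol."""
    hit = np.nonzero(steps / steps[0] < tol)[0]
    return int(hit[0]) + 1 if hit.size else len(steps)


if __name__ == "__main__":
    # Test matrix with a small spectral gap at the target rank, so that
    # standard ALS converges slowly (rate roughly (sigma_{r+1}/sigma_r)^2).
    rng = np.random.default_rng(1)
    m, n, r = 80, 60, 5
    Q1, _ = np.linalg.qr(rng.standard_normal((m, n)))
    Q2, _ = np.linalg.qr(rng.standard_normal((n, n)))
    A = (Q1 * 0.9 ** np.arange(n)) @ Q2.T

    # Warm start near the solution (the result is about *local* convergence):
    # a few standard ALS sweeps before switching on overrelaxation.
    U0, V0, _ = als(A, r, omega=1.0, n_iter=10)

    # Estimate the local rate of standard ALS from step-size ratios.
    _, _, steps_plain = als(A, r, omega=1.0, U0=U0, V0=V0)
    ratios = steps_plain[1:] / steps_plain[:-1]
    rho = float(np.median(ratios[10:80]))

    w = young_omega(rho)
    _, _, steps_sor = als(A, r, omega=w, U0=U0, V0=V0)

    print(f"estimated rate of standard ALS   rho   ~ {rho:.3f}")
    print(f"Young-type relaxation parameter  omega ~ {w:.3f}")
    print(f"iterations to reduce the step by 1e-8: "
          f"plain ALS {iters_to_tol(steps_plain)}, "
          f"overrelaxed ALS {iters_to_tol(steps_sor)}")
```

With the relaxation parameter chosen this way, the local contraction rate is expected to drop from roughly ρ for the plain sweep to roughly ω_opt − 1 for the overrelaxed one, which the printed iteration counts should reflect. Since the factorization U Vᵀ is not unique, progress is measured through the product rather than the factors, loosely mirroring the quotient-geometry viewpoint mentioned in the abstract.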