Randomized Fast Subspace Descent Methods

06/11/2020
by Long Chen, et al.

Randomized Fast Subspace Descent (RFASD) methods are developed and analyzed for smooth, unconstrained convex optimization problems. The efficiency of the method relies on a space decomposition that is stable in the A-norm and for which the condition number κ_A, measured in the A-norm, is small. At each iteration, a subspace is chosen at random, either uniformly or with probability proportional to the local Lipschitz constants; a preconditioned gradient descent step is then taken in the chosen subspace. RFASD converges sublinearly for convex functions and linearly for strongly convex functions. Compared with randomized block coordinate descent methods, RFASD converges faster provided that κ_A is small and the subspace decomposition is A-stable. This improvement is supported by considering a multilevel space decomposition for Nesterov's 'worst' problem.
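To make the iteration concrete, here is a minimal sketch of a randomized subspace descent step on a quadratic f(x) = 1/2 x^T A x - b^T x, using coordinate blocks as the subspaces and sampling blocks proportionally to their local Lipschitz constants. This is a generic illustrative sketch, not the authors' exact RFASD method: the block partition, the sampling rule, and the per-block step size 1/L_i are assumptions made for the example, and the stable multilevel decomposition central to the paper is not reproduced here.

```python
import numpy as np

def randomized_subspace_descent(A, b, blocks, n_iters=1000, seed=None):
    """Sketch of randomized block/subspace descent for f(x) = 0.5 x'Ax - b'x.

    NOTE: illustrative only; block choice, sampling rule, and step size
    are assumptions, not the RFASD method from the paper.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[0])
    # Local Lipschitz constant of each block: the largest eigenvalue of
    # the diagonal sub-block A[J, J] of the Hessian.
    L = np.array([np.linalg.eigvalsh(A[np.ix_(J, J)])[-1] for J in blocks])
    probs = L / L.sum()  # sample blocks with probability proportional to L_i
    for _ in range(n_iters):
        i = rng.choice(len(blocks), p=probs)
        J = blocks[i]
        g = A[J] @ x - b[J]   # partial gradient restricted to block J
        x[J] -= g / L[i]      # scaled (crudely preconditioned) descent step
    return x

# Usage: a small symmetric positive definite system, two coordinate blocks.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + 6 * np.eye(6)
b = rng.standard_normal(6)
blocks = [np.arange(0, 3), np.arange(3, 6)]
x = randomized_subspace_descent(A, b, blocks, n_iters=2000, seed=1)
print("residual norm:", np.linalg.norm(A @ x - b))
```

In the paper's setting, the coordinate blocks would be replaced by an A-stable (e.g., multilevel) space decomposition and the scalar step 1/L_i by a preconditioned gradient step in each subspace, which is what drives the improved dependence on κ_A.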
