Follow the Compressed Leader: Faster Online Learning of Eigenvectors and Faster MMWU

01/06/2017
by   Zeyuan Allen-Zhu, et al.

The online problem of computing the top eigenvector is fundamental to machine learning. In both adversarial and stochastic settings, previous methods (such as matrix multiplicative weight update, follow the regularized leader, follow the perturbed leader, and the block power method) either achieve optimal regret but run slowly, or run fast at the expense of losing a √(d) factor in total regret, where d is the matrix dimension. We propose a follow-the-compressed-leader (FTCL) framework which achieves optimal regret without sacrificing running time. Our idea is to "compress" the matrix strategy to dimension 3 in the adversarial setting, or dimension 1 in the stochastic setting. These respectively resolve two open questions regarding the design of optimal and efficient algorithms for the online eigenvector problem.
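For context, below is a minimal sketch of the classical matrix multiplicative weight update (MMWU) baseline that the paper speeds up, not the FTCL compression itself. The step size `eta` and the gain-matrix interface are illustrative assumptions; MMWU plays the density matrix W_t ∝ exp(η Σ_{s<t} A_s) and collects gain ⟨W_t, A_t⟩ each round.

```python
import numpy as np

def mmwu(gain_matrices, eta=0.1):
    """Matrix multiplicative weight update (sketch).

    Plays the density matrix W_t proportional to
    exp(eta * sum_{s < t} A_s) and returns the total gain
    sum_t <W_t, A_t>. Each A_t must be symmetric.
    """
    d = gain_matrices[0].shape[0]
    S = np.zeros((d, d))  # running sum of past gain matrices
    total_gain = 0.0
    for A in gain_matrices:
        # Symmetric matrix exponential via eigendecomposition;
        # this O(d^3) step per round is the bottleneck the paper addresses.
        w, V = np.linalg.eigh(S)
        expw = np.exp(eta * (w - w.max()))  # shift exponents for stability
        W = (V * expw) @ V.T                # = V diag(expw) V^T
        W /= np.trace(W)                    # normalize to a density matrix
        total_gain += np.trace(W @ A)
        S += A
    return total_gain
```

On a stream where every gain matrix rewards the same direction, the strategy quickly concentrates on that eigenvector, so the per-round gain approaches 1 and total regret stays O(log d / η), matching the optimal-regret-but-slow tradeoff described in the abstract.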
