Large Scale Kernel Learning using Block Coordinate Descent

02/17/2016
by Stephen Tu, et al.

We demonstrate that distributed block coordinate descent can quickly solve kernel regression and classification problems with millions of data points. Armed with this capability, we conduct a thorough comparison between the full kernel, the Nyström method, and random features on three large classification tasks from various domains. Our results suggest that the Nyström method generally achieves better statistical accuracy than random features, but can require significantly more iterations of optimization. Lastly, we derive new rates for block coordinate descent which support our experimental findings when specialized to kernel methods.
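To make the optimization setup concrete, below is a minimal single-machine sketch of block coordinate descent for kernel ridge regression, i.e., block Gauss-Seidel sweeps on the linear system (K + λI)α = y. This is an illustrative assumption about the setting, not the authors' distributed implementation: the RBF kernel, block size, and regularization λ are all placeholder choices. The key property the abstract alludes to is that each update touches only one block of rows of the kernel matrix, which is what makes a distributed variant feasible at the scale of millions of points.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel from pairwise squared distances.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def bcd_kernel_ridge(K, y, lam=1.0, block_size=100, n_epochs=50):
    """Block Gauss-Seidel sweeps on (K + lam*I) alpha = y.

    Each step solves a small block_size x block_size system for one
    block of alpha, holding the other coordinates fixed. Since
    K + lam*I is symmetric positive definite, these sweeps converge
    to the exact kernel ridge regression solution.
    """
    n = K.shape[0]
    alpha = np.zeros(n)
    blocks = [np.arange(i, min(i + block_size, n))
              for i in range(0, n, block_size)]
    for _ in range(n_epochs):
        for b in blocks:
            # r_b = y_b - sum over blocks c != b of K[b, c] @ alpha[c]
            r_b = y[b] - K[b] @ alpha + K[np.ix_(b, b)] @ alpha[b]
            alpha[b] = np.linalg.solve(
                K[np.ix_(b, b)] + lam * np.eye(len(b)), r_b)
    return alpha
```

Substituting a Nyström or random-features approximation of K into the same solver is what enables the paper's comparison of the three methods under a common optimization framework.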
