A Semismooth-Newton's-Method-Based Linearization and Approximation Approach for Kernel Support Vector Machines

07/21/2020
by Chen Jiang, et al.

Support Vector Machines (SVMs) are among the most popular and best-performing classification algorithms. Various approaches have been proposed to reduce the high computation and memory costs of training and prediction with kernel SVMs on large-scale datasets. A popular one is the linearization framework, which successfully builds a bridge between the L_1-loss kernel SVM and the L_1-loss linear SVM. For linear SVMs, a semismooth Newton's method was recently proposed and shown to be highly competitive with low computational cost. A natural question, then, is whether a fast semismooth Newton's algorithm can also be developed for kernel SVMs. Motivated by this question and the idea behind the linearization framework, in this paper we focus on the L_2-loss kernel SVM and propose a semismooth Newton's method based linearization and approximation approach for it. The main idea is to first set up an equivalent linear SVM, then apply the Nyström method to approximate the kernel matrix, which yields a reduced linear SVM. Finally, the fast semismooth Newton's method is employed to solve the reduced linear SVM. We also provide theoretical analyses of the kernel matrix approximation. The advantage of the proposed approach is that it maintains low computational cost while keeping a fast convergence rate. Results of extensive numerical experiments verify the efficiency of the proposed approach in terms of both prediction accuracy and speed.
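To make the pipeline described above concrete, the following is a minimal sketch of the linearization-and-approximation idea: build a Nyström approximation of the kernel matrix from a set of landmark points, map the data to the resulting low-dimensional feature space, and train an L_2-loss (squared hinge) linear SVM on the reduced problem. The landmark count, RBF kernel width, and the use of scikit-learn's LinearSVC are illustrative assumptions; the paper itself solves the reduced linear SVM with a semismooth Newton's method rather than the off-the-shelf solver used here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import LinearSVC

# Toy data as a stand-in for a large-scale dataset.
X, y = make_classification(n_samples=2000, n_features=50, random_state=0)

# --- Nystrom approximation of the kernel matrix: K ~ C W^{-1} C^T ---
m, gamma = 200, 0.02                                # landmark count and RBF width (illustrative)
rng = np.random.default_rng(0)
idx = rng.choice(X.shape[0], size=m, replace=False)
landmarks = X[idx]

C = rbf_kernel(X, landmarks, gamma=gamma)           # n x m cross-kernel block
W = rbf_kernel(landmarks, landmarks, gamma=gamma)   # m x m landmark block

# Explicit feature map Z = C W^{-1/2}, so that Z Z^T equals the Nystrom
# approximation of K; this is what turns the kernel SVM into a linear one.
eigvals, eigvecs = np.linalg.eigh(W)
eigvals = np.maximum(eigvals, 1e-12)                # guard against numerical negatives
Z = C @ eigvecs @ np.diag(eigvals ** -0.5) @ eigvecs.T

# --- Reduced linear SVM on the approximate feature map ---
# LinearSVC with squared hinge loss plays the role of the L_2-loss linear SVM;
# the paper solves this reduced problem with a fast semismooth Newton's method.
clf = LinearSVC(loss="squared_hinge", C=1.0, max_iter=5000)
clf.fit(Z, y)
print("training accuracy:", clf.score(Z, y))
```

Since the reduced problem has only m explicit features, its cost scales with the number of landmarks rather than with the full n x n kernel matrix, which is the source of the savings claimed in the abstract.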
