One-Class Kernel Spectral Regression for Outlier Detection

The paper introduces a new, efficient nonlinear one-class classifier based on the Rayleigh quotient criterion. Operating in a reproducing kernel Hilbert space, the method minimises the scatter of the target distribution along an optimal projection direction while simultaneously keeping the projections of target observations as distant as possible from the origin, which serves as an artificial outlier with respect to the data. We provide a graph-embedding view of the problem, which can then be solved efficiently using the spectral regression approach. Unlike previous similar methods, which often require costly eigen-computations of dense matrices, the proposed approach casts the problem into a regression framework that avoids eigen-decomposition; in particular, it is shown that the dominant cost of the proposed method is that of computing the kernel matrix. Additional appealing characteristics of the proposed one-class classifier are: (1) the ability to be trained incrementally, allowing application to streaming data while also reducing computational complexity in the non-streaming mode; (2) being unsupervised, while letting the user specify the expected fraction of outliers in the training set in advance; and (3) the use of the kernel trick, which admits a large class of functions by nonlinearly mapping the data into a high-dimensional feature space. Extensive experiments on several datasets verify the merits of the proposed approach in comparison with alternatives.
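To make the regression view of the abstract concrete, the following is a minimal sketch, not the paper's actual algorithm: it replaces the spectral-regression derivation with plain kernel ridge regression against a constant target, so that projections of target points are pushed away from the origin without any eigen-decomposition, and the decision threshold is set at the user-specified expected outlier fraction `nu`. The kernel choice, `nu`, `reg`, and `gamma` values are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian RBF kernel matrix between the rows of A and the rows of B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_one_class(X, nu=0.1, reg=1e-3, gamma=0.5):
    """Kernel-regression sketch of a one-class scorer (illustrative only).

    Solves (K + reg*I) alpha = 1 so that target projections K @ alpha lie
    near 1, i.e. away from the origin (the artificial outlier), then sets
    the decision threshold at the nu-quantile of the training scores so
    roughly a fraction nu of training points fall on the outlier side.
    """
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + reg * np.eye(len(X)), np.ones(len(X)))
    scores = K @ alpha
    thresh = np.quantile(scores, nu)
    return alpha, thresh

def score(X_train, alpha, X_new, gamma=0.5):
    # Projection of new points; low values indicate outliers
    return rbf_kernel(X_new, X_train, gamma) @ alpha

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))            # target class only (unsupervised)
alpha, thresh = fit_one_class(X, nu=0.1)
far = np.array([[8.0, 8.0]])             # point far from the target class
print(score(X, alpha, far)[0] < thresh)  # prints True: flagged as outlier
```

The only linear algebra is a single dense solve, mirroring the abstract's point that avoiding eigen-decomposition leaves kernel-matrix computation as the dominant cost; the paper's incremental-training property is not reproduced in this sketch.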
