Generalized Likelihood Ratio Test With One-Class Classifiers

10/22/2022
by Francesco Ardizzon, et al.

One-class classification (OCC) is the problem of deciding whether an observed sample belongs to a target class or not. We consider the problem of learning an OCC model when the dataset available at the learning stage contains only samples from the target class. We aim at obtaining a classifier that performs as the generalized likelihood ratio test (GLRT), which is a well-known and provably optimal (under specific assumptions) classifier when the statistics of the target class are available. To this end, we consider both the multilayer perceptron neural network (NN) and the support vector machine (SVM) models. They are trained as two-class classifiers using an artificial dataset for the alternative class, obtained by generating random samples uniformly over the domain of the target-class dataset. We prove that, under suitable assumptions, the models converge (with a large dataset) to the GLRT. Moreover, we show that the one-class least squares SVM (OCLSSVM) at convergence performs as the GLRT, with a suitable transformation function. Lastly, we compare the obtained solutions with the autoencoder (AE) classifier, which does not in general provide the GLRT.
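
To illustrate the training strategy described in the abstract, below is a minimal sketch, assuming scikit-learn's MLPClassifier as the multilayer perceptron and a synthetic 2-D target-class dataset; the dataset, the bounding-box choice of "domain", the network size, and the decision threshold are illustrative assumptions, not the authors' exact setup.

```python
# Sketch: approximate a GLRT-style one-class decision by training an
# ordinary two-class classifier against artificial alternative-class
# samples drawn uniformly over the domain of the target-class data.
# All concrete choices here (data, domain = bounding box, architecture,
# threshold) are assumptions for illustration only.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Target-class training data (assumed: a 2-D Gaussian cluster).
x_target = rng.normal(loc=0.0, scale=1.0, size=(2000, 2))

# Artificial alternative class: uniform samples over the target data's
# bounding box (one possible notion of its domain).
lo, hi = x_target.min(axis=0), x_target.max(axis=0)
x_alt = rng.uniform(lo, hi, size=(2000, 2))

# Train a multilayer perceptron as a standard two-class classifier:
# label 1 = target class, label 0 = artificial alternative class.
x = np.vstack([x_target, x_alt])
y = np.concatenate([np.ones(len(x_target)), np.zeros(len(x_alt))])
clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
clf.fit(x, y)

# At test time, the classifier's soft output serves as the decision
# statistic; per the paper, under suitable assumptions and with a large
# dataset it converges to the GLRT.
x_test = rng.normal(loc=0.0, scale=1.0, size=(5, 2))
scores = clf.predict_proba(x_test)[:, 1]
threshold = 0.5  # assumed operating point
decisions = scores >= threshold
print(scores, decisions)
```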
