Nonnegative Spectral Analysis with Adaptive Graph and L_2,0-Norm Regularization for Unsupervised Feature Selection

10/09/2020
by   Zhenzhen Sun, et al.

Feature selection is an important data preprocessing step in data mining and machine learning, which can reduce the feature size without degrading a model's performance. Since obtaining annotated data is laborious or even infeasible in many cases, unsupervised feature selection is more practical in reality. Although many methods have been proposed, they generally cannot determine the number of selected features automatically without a predefined threshold, so obtaining a satisfactory result often requires considerable time and effort to tune the number of selected features carefully. In this paper, we propose an unsupervised feature selection method that incorporates spectral analysis with an l_2,0-norm regularized term. After optimization, a group of optimal features is selected, and the number of selected features is determined automatically. Moreover, a nonnegative constraint is imposed on the class indicators to learn more accurate cluster labels, and a graph regularized term is added to learn the similarity matrix adaptively. An efficient and simple iterative algorithm is derived to solve the optimization problem. Experiments on six benchmark data sets validate the effectiveness of the proposed approach.
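
The following is a minimal illustrative sketch (not the paper's actual optimization algorithm) of the row-sparsity idea behind an l_2,0-norm regularizer: applying the standard proximal operator of the l_2,0 penalty to a feature-weight matrix zeroes out weak rows, so the surviving rows directly identify the selected features and their count is determined automatically rather than by a tuned threshold. The function name, threshold rule, and toy data below are assumptions for illustration only.

import numpy as np

def l20_row_hard_threshold(W, lam):
    """Proximal operator of lam * ||W||_{2,0}: zero out any row whose
    l2 norm is at most sqrt(2 * lam); keep the remaining rows unchanged."""
    row_norms = np.linalg.norm(W, axis=1)
    keep = row_norms > np.sqrt(2.0 * lam)
    W_sparse = np.zeros_like(W)
    W_sparse[keep] = W[keep]
    return W_sparse, np.flatnonzero(keep)

# Toy example: 6 features projected onto 3 pseudo-label dimensions.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(6, 3))
W[[1, 4]] += 2.0  # two "informative" features get large-norm rows
W_sparse, selected = l20_row_hard_threshold(W, lam=0.5)
print("selected feature indices:", selected)  # the count emerges from the penalty itself

In the proposed method this kind of row-sparse solution would arise from the joint spectral/graph objective rather than from a single thresholding step, but the sketch shows why no separate "number of features" parameter is needed.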
