Learning with Correntropy-induced Losses for Regression with Mixture of Symmetric Stable Noise

03/01/2018
by   Yunlong Feng, et al.

In recent years, correntropy and its applications in machine learning have drawn continuous attention owing to correntropy's merits in dealing with non-Gaussian noise and outliers. However, theoretical understanding of correntropy, especially in the statistical learning context, is still limited. In this study, within the statistical learning framework, we investigate correntropy-based regression in the presence of non-Gaussian noise or outliers. To this end, we first introduce mixture of symmetric stable noise, which includes Gaussian noise, Cauchy noise, and mixtures of Gaussian noise as special cases, to model non-Gaussian noise and outliers. We demonstrate that, under the mixture of symmetric stable noise assumption, correntropy-based regression can learn the conditional mean function or the conditional median function well without requiring a finite-variance assumption on the noise. In particular, we establish learning rates for correntropy-based regression estimators that are asymptotically of type O(n^{-1}). We believe that the present study advances our understanding of correntropy-based regression from a statistical learning viewpoint, and may also shed some light on robust statistical learning for regression.
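For intuition, a correntropy-induced loss is commonly written as ell_sigma(t) = sigma^2 * (1 - exp(-t^2 / sigma^2)): it behaves like the least-squares loss for small residuals t but saturates for large ones, which is what down-weights outliers. The Python sketch below (not code from the paper; the mixture weights, noise scales, and toy linear model are illustrative assumptions) fits a regression line under a Gaussian/Cauchy mixture, i.e., a mixture of symmetric stable noise with stability indices alpha = 2 and alpha = 1, by gradient descent on this loss.

```python
import numpy as np
from scipy.stats import levy_stable

def c_loss_grad(residual, sigma):
    # Derivative of the correntropy-induced loss
    # ell_sigma(t) = sigma^2 * (1 - exp(-t^2 / sigma^2)):
    # ell'_sigma(t) = 2 * t * exp(-t^2 / sigma^2).
    return 2.0 * residual * np.exp(-residual**2 / sigma**2)

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(-1.0, 1.0, size=n)

# Mixture of symmetric stable noise: 90% Gaussian (alpha = 2),
# 10% Cauchy (alpha = 1). Weights and scales are illustrative.
is_gaussian = rng.uniform(size=n) < 0.9
noise = np.where(
    is_gaussian,
    rng.normal(scale=0.1, size=n),
    levy_stable.rvs(alpha=1.0, beta=0.0, scale=0.1, size=n, random_state=rng),
)
y = 2.0 * x + noise  # toy target: conditional mean f*(x) = 2x

# Fit the slope by gradient descent on the empirical correntropy risk.
w, sigma, lr = 0.0, 1.0, 0.1
for _ in range(500):
    residual = y - w * x
    # d/dw mean(ell_sigma(y - w*x)) = mean(ell'_sigma(residual) * (-x))
    w -= lr * np.mean(c_loss_grad(residual, sigma) * (-x))

print(f"estimated slope: {w:.3f}  (true slope: 2.0)")
```

Because the exponential factor vanishes for large residuals, the occasional heavy-tailed Cauchy draws contribute almost nothing to the gradient, so the estimate stays close to the true slope even though the noise has no finite variance; this is the robustness behavior whose learning rates the paper quantifies.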
