Universal Robust Regression via Maximum Mean Discrepancy

06/01/2020
by Pierre Alquier, et al.

Many datasets are collected automatically and are thus easily contaminated by outliers. To overcome this issue, there has recently been renewed interest in robust estimation methods. However, most of these methods are designed for a specific purpose, such as estimation of the mean or linear regression. We propose estimators based on Maximum Mean Discrepancy (MMD) minimization as a universal framework for robust regression. We provide non-asymptotic error bounds and show that our estimators are robust to Huber-type contamination. We discuss the optimization of the objective functions via (stochastic) gradient descent in classical examples such as linear, logistic, and Poisson regression, and illustrate these results with a set of simulations.
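
To make the idea concrete, below is a minimal, illustrative sketch (not the paper's exact estimator or code): robust linear regression obtained by minimizing an estimate of MMD^2 between the observed pairs (x_i, y_i) and pairs (x_i, y~_i) simulated from the candidate model, using gradient-based optimization with reparameterized noise. The Gaussian kernel on concatenated (x, y) vectors, the bandwidth, the noise scale, the Adam optimizer, and all variable names are illustrative assumptions.

    import torch

    torch.manual_seed(0)

    def gaussian_kernel(a, b, sigma=1.0):
        # k(z, z') = exp(-||z - z'||^2 / (2 sigma^2)); squared distances are computed
        # directly so gradients stay finite when z = z'
        d2 = ((a.unsqueeze(1) - b.unsqueeze(0)) ** 2).sum(-1)
        return torch.exp(-d2 / (2.0 * sigma ** 2))

    def mmd2(sample_p, sample_q, sigma=1.0):
        # Biased (V-statistic) estimate of MMD^2 between two samples
        return (gaussian_kernel(sample_p, sample_p, sigma).mean()
                + gaussian_kernel(sample_q, sample_q, sigma).mean()
                - 2.0 * gaussian_kernel(sample_p, sample_q, sigma).mean())

    # Synthetic data with Huber-type contamination: ~10% of responses are gross outliers
    n, d = 200, 3
    theta_true = torch.tensor([1.0, -2.0, 0.5])
    X = torch.randn(n, d)
    y = X @ theta_true + 0.3 * torch.randn(n)
    y[torch.rand(n) < 0.10] = 50.0

    obs = torch.cat([X, y.unsqueeze(1)], dim=1)   # observed (x, y) pairs
    theta = torch.zeros(d, requires_grad=True)
    opt = torch.optim.Adam([theta], lr=0.05)

    for step in range(500):
        opt.zero_grad()
        # Reparameterized model sample: same design points, simulated responses
        y_sim = X @ theta + 0.3 * torch.randn(n)
        sim = torch.cat([X, y_sim.unsqueeze(1)], dim=1)
        loss = mmd2(obs, sim, sigma=1.0)
        loss.backward()
        opt.step()

    print("estimated theta:", theta.detach())      # MMD-based estimate of theta

Because the kernel is bounded, each grossly contaminated observation can shift the objective by only a bounded amount, which is the informal reason MMD-based estimators of this kind resist Huber-type contamination.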
