How Robust is the Median-of-Means? Concentration Bounds in Presence of Outliers
In contrast to the empirical mean, the Median-of-Means (MoM) is an estimator of the mean θ of a square-integrable random variable Z around which accurate nonasymptotic confidence bounds can be built, even when Z does not exhibit a sub-Gaussian tail behavior. Because of the high confidence it achieves on heavy-tailed data, MoM has recently found applications in statistical learning, in order to design training procedures that are not sensitive to atypical or corrupted observations. For the first time, we provide concentration bounds for the MoM estimator in the presence of outliers that depend explicitly on the fraction of contaminated data in the sample. These results are also extended to "Medians-of-U-statistics" (i.e., averages over tuples of observations) and are shown to furnish generalization guarantees for pairwise learning techniques (e.g., ranking, metric learning) based on contaminated training data. Beyond the theoretical analysis carried out, numerical results are presented that provide strong empirical evidence of the robustness properties claimed by the learning rate bounds established.
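To make the estimator concrete, here is a minimal sketch of the standard Median-of-Means construction (partition the sample into disjoint blocks, average within each block, take the median of the block means). The block count and equal-size partitioning below are illustrative assumptions, not the paper's exact setup or tuning.

```python
import random
import statistics

def median_of_means(sample, n_blocks):
    """Median-of-Means sketch: split `sample` into `n_blocks` disjoint
    blocks of equal size (trailing remainder dropped for simplicity),
    average within each block, and return the median of the block means.
    Illustrative only; block-size choices are assumptions."""
    block_size = len(sample) // n_blocks
    block_means = [
        statistics.fmean(sample[i * block_size:(i + 1) * block_size])
        for i in range(n_blocks)
    ]
    return statistics.median(block_means)

# A sample with a small fraction of gross outliers: the outliers can
# corrupt at most a few block means, so the median of block means
# remains near the bulk of the data, unlike the empirical mean.
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(991)] + [1e6] * 9
print(statistics.fmean(data))      # dragged far from 0 by the outliers
print(median_of_means(data, 100))  # stays close to the true mean 0
```

With 100 blocks of size 10, the 9 outliers can contaminate at most 9 block means, so the median is taken over a majority of clean blocks; this is the mechanism behind the contamination-dependent bounds discussed in the abstract.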