On the robustness of minimum-norm interpolators
This article develops a general theory for minimum-norm interpolating estimators in linear models in the presence of additive, potentially adversarial, errors. In particular, no conditions on the errors are imposed. A quantitative bound for the prediction error is given, relating it to the Rademacher complexity of the covariates, the norm of the minimum-norm interpolator of the errors, and the shape of the subdifferential around the true parameter. The general theory is illustrated with several examples: the sparse linear model with interpolation by minimum ℓ_1-norm or group Lasso penalty, the low-rank trace regression model with nuclear-norm minimization, and minimum Euclidean-norm interpolation in the linear model. In the case of sparsity- or low-rank-inducing norms, minimum-norm interpolation yields a prediction error of the order of the average noise level, provided that the overparameterization exceeds the number of samples by at least a logarithmic factor. Lower bounds showing the near-optimality of the results complement the analysis.
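The minimum Euclidean-norm interpolation mentioned above can be sketched numerically: in an overparameterized linear model (more parameters than samples), the pseudoinverse solution fits the noisy observations exactly while having the smallest ℓ_2 norm among all interpolators. The dimensions, sparsity level, and noise scale below are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 500  # overparameterized regime: p >> n

# Sparse ground truth, as in the paper's sparse linear model example
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 1.0
eps = 0.1 * rng.standard_normal(n)  # additive errors (no assumptions needed)
y = X @ beta_true + eps

# Minimum Euclidean-norm interpolator: beta_hat = X^+ y
beta_hat = np.linalg.pinv(X) @ y

# The estimator interpolates the noisy labels exactly
assert np.allclose(X @ beta_hat, y)

# Any other interpolator beta_hat + v with v in the null space of X
# has a strictly larger Euclidean norm (Pythagoras: X^+ y is orthogonal
# to the null space of X)
v = rng.standard_normal(p)
v_null = v - np.linalg.pinv(X) @ (X @ v)  # project v onto null(X)
other = beta_hat + v_null
assert np.allclose(X @ other, y)
assert np.linalg.norm(other) > np.linalg.norm(beta_hat)
```

Despite fitting the errors exactly, such interpolators can still predict well when the overparameterization is large enough, which is the phenomenon the article quantifies.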