Sharp Statistical Guarantees for Adversarially Robust Gaussian Classification

06/29/2020
by   Chen Dan, et al.

Adversarial robustness has become a fundamental requirement in modern machine learning applications. Yet, there has been surprisingly little statistical understanding of it so far. In this paper, we provide the first optimal minimax guarantees on the excess risk for adversarially robust classification, under the Gaussian mixture model proposed by <cit.>. The results are stated in terms of the Adversarial Signal-to-Noise Ratio (AdvSNR), which generalizes a similar notion for standard linear classification to the adversarial setting. For Gaussian mixtures with AdvSNR value r, we establish a minimax excess risk of order Θ(e^{-(1/8+o(1)) r^2} · d/n) and design a computationally efficient estimator that achieves this optimal rate. Our results are built upon a minimal set of assumptions while covering a wide spectrum of adversarial perturbations, including ℓ_p balls for any p ≥ 1.
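In the symmetric Gaussian mixture model referenced above, the robust 0-1 risk of a linear classifier admits a closed form in the isotropic ℓ_2 special case: an ℓ_2-bounded adversary shrinks the margin by ε·‖w‖_2, so the risk is the Gaussian tail of the reduced margin. The sketch below illustrates that textbook formula only; the function name `robust_error_l2` and the parameters σ, ε are our notation for this illustration, not the paper's estimator.

```python
import math

def gaussian_cdf(z):
    """Standard normal CDF, Phi(z), via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def robust_error_l2(w, mu, sigma, eps):
    """Adversarial 0-1 risk of the classifier sign(<w, x>) under the
    symmetric mixture y ~ Unif{-1, +1}, x ~ N(y * mu, sigma^2 I),
    against l2 perturbations of radius eps.

    An l2 adversary reduces the margin <w, y*x> by at most eps * ||w||_2,
    giving err = Phi(-(<w, mu> - eps * ||w||_2) / (sigma * ||w||_2)).
    Illustrative formula for this isotropic special case only.
    """
    norm_w = math.sqrt(sum(wi * wi for wi in w))
    margin = sum(wi * mi for wi, mi in zip(w, mu))
    return gaussian_cdf(-(margin - eps * norm_w) / (sigma * norm_w))

# Example: mean mu = (1, 1); w = mu is Bayes-optimal when eps = 0.
mu = [1.0, 1.0]
w = [1.0, 1.0]
std_err = robust_error_l2(w, mu, sigma=1.0, eps=0.0)  # standard risk
rob_err = robust_error_l2(w, mu, sigma=1.0, eps=0.5)  # strictly larger
```

Setting ε = 0 recovers the classical error Φ(−‖μ‖/σ) for w ∝ μ, and the robust risk increases monotonically in ε, consistent with the signal-to-noise intuition behind AdvSNR.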
