The Fundamental Limits of Interval Arithmetic for Neural Networks

12/09/2021
by Matthew Mirman, et al.

Interval analysis (or interval bound propagation, IBP) is a popular technique for verifying and training provably robust deep neural networks, a fundamental challenge in the area of reliable machine learning. However, despite substantial effort, progress on this key challenge has stagnated, calling into question whether interval arithmetic is a viable path forward. In this paper we present two fundamental results on the limitations of interval arithmetic for analyzing neural networks. Our main impossibility theorem states that for any neural network classifying just three points, there is a valid specification over these points that interval analysis cannot prove. Further, in the restricted case of one-hidden-layer neural networks we show a stronger impossibility result: given any radius α < 1, there is a set of O(α^-1) points with robust radius α, separated by distance 2, such that no one-hidden-layer network can be proven via interval analysis to classify them robustly.
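For readers unfamiliar with the technique, the following is a minimal sketch of standard interval propagation through a one-hidden-layer ReLU network; it is illustrative only and not taken from the paper, and all names (ibp_affine, ibp_relu, certify_robust) are hypothetical:

import numpy as np

def ibp_affine(lower, upper, W, b):
    # Standard interval rule for an affine layer W @ x + b: split W into
    # its positive and negative parts so each output bound pairs with the
    # matching input bound.
    W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return (W_pos @ lower + W_neg @ upper + b,
            W_pos @ upper + W_neg @ lower + b)

def ibp_relu(lower, upper):
    # ReLU is monotone, so it maps interval endpoints to endpoints.
    return np.maximum(lower, 0.0), np.maximum(upper, 0.0)

def certify_robust(x, radius, W1, b1, W2, b2, label):
    # Returns True iff IBP proves that the one-hidden-layer ReLU network
    # (W1, b1, W2, b2) assigns `label` to every point in the L-infinity
    # ball of the given radius around x.
    lower, upper = x - radius, x + radius
    lower, upper = ibp_relu(*ibp_affine(lower, upper, W1, b1))
    lower, upper = ibp_affine(lower, upper, W2, b2)
    # Certified iff the worst-case score of the true label still beats
    # the best-case score of every other label.
    return lower[label] > np.delete(upper, label).max()

In these terms, the paper's second result says that for any α < 1 there is a set of O(α^-1) points, each robust at radius α, on which no choice of one-hidden-layer weights makes such a certification succeed everywhere.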
