On the stability of deep convolutional neural networks under irregular or random deformations

04/24/2021
by   Fabio Nicola, et al.

The problem of robustness under location deformations for deep convolutional neural networks (DCNNs) is of great theoretical and practical interest. This issue has been studied in pioneering works, especially for scattering-type architectures, for deformation vector fields τ(x) with some regularity, at least C^1. Here we address this issue for any field τ ∈ L^∞(ℝ^d;ℝ^d), without any additional regularity assumption, hence including the case of wild, irregular deformations such as noise on the pixel locations of an image. We prove that for signals in multiresolution approximation spaces U_s at scale s, whenever the network is Lipschitz continuous (regardless of its architecture), stability in L^2 holds in the regime ‖τ‖_{L^∞}/s ≪ 1, essentially as a consequence of the uncertainty principle. When ‖τ‖_{L^∞}/s ≫ 1, instability can occur even for well-structured DCNNs such as wavelet scattering networks, and we provide a sharp upper bound for the asymptotic growth rate. The stability results are then extended to signals in the Besov space B^{d/2}_{2,1} tailored to the given multiresolution approximation. We also consider the case of more general time-frequency deformations. Finally, we provide stochastic versions of the aforementioned results: we study the issue of stability in mean when τ(x) is modeled as a random field (not bounded, in general) with identically distributed variables |τ(x)|, x ∈ ℝ^d.
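The two regimes can be illustrated numerically. Below is a minimal numpy sketch (not taken from the paper): the helpers bandlimit and deform, the scale values s, and the displacement amplitude eps are illustrative assumptions. A signal is projected onto a crude stand-in for a scale-s approximation space, then deformed by an i.i.d. per-pixel displacement field τ with ‖τ‖_{L^∞} ≈ eps; the relative L^2 change of the input is what a Lipschitz network's output deviation is controlled by in the stable regime ‖τ‖_{L^∞}/s ≪ 1.

```python
import numpy as np

def bandlimit(image, s):
    """Keep only Fourier modes with |frequency| <= 1/s -- a crude stand-in
    for projecting onto a multiresolution approximation space at scale s."""
    F = np.fft.fft2(image)
    fy = np.fft.fftfreq(image.shape[0])[:, None]
    fx = np.fft.fftfreq(image.shape[1])[None, :]
    F[np.sqrt(fx**2 + fy**2) > 1.0 / s] = 0.0
    return np.real(np.fft.ifft2(F))

def deform(image, tau):
    """Evaluate f(x - tau(x)) by nearest-neighbour resampling: each output
    pixel x reads the input at the displaced location x - tau(x)."""
    h, w = image.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    src_y = np.clip(np.rint(ys - tau[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.rint(xs - tau[..., 1]).astype(int), 0, w - 1)
    return image[src_y, src_x]

rng = np.random.default_rng(0)
noise = rng.standard_normal((256, 256))
eps = 2.0                                   # plays the role of ||tau||_L^inf
for s in (4.0, 64.0):                       # oscillatory signal vs. smooth signal
    f = bandlimit(noise, s)
    # Completely irregular displacement field: i.i.d. per-pixel noise,
    # with no smoothness in x -- the "wild" deformations discussed above.
    tau = rng.uniform(-eps, eps, size=(256, 256, 2))
    f_tau = deform(f, tau)
    rel = np.linalg.norm(f - f_tau) / np.linalg.norm(f)
    print(f"eps/s = {eps/s:5.3f}   relative L2 change = {rel:.3f}")
```

For a network Φ with Lipschitz constant L (any architecture), ‖Φ(f) − Φ(f∘(id−τ))‖ ≤ L‖f − f∘(id−τ)‖, so the printed relative change bounds the output deviation; it is small when eps/s ≪ 1 and of order one when eps/s is large, in line with the dichotomy stated above.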
