Neural Networks, Hypersurfaces, and Radon Transforms

07/04/2019
by   Soheil Kolouri, et al.

Connections between integration along hypersurfaces, Radon transforms, and neural networks are exploited to highlight an integral geometric interpretation of neural networks. By analyzing the properties of neural networks as operators on probability distributions of observed data, we show that the distribution of outputs for any node in a neural network can be interpreted as a nonlinear projection along hypersurfaces defined by level surfaces over the input data space. We utilize these descriptions to provide a new interpretation of phenomena such as nonlinearity, pooling, activation functions, and adversarial examples in neural network-based learning problems.
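The core idea can be illustrated with a minimal sketch (our own illustration, not code from the paper): for a single linear neuron, the level sets of the pre-activation are hyperplanes, and the distribution of pre-activations is the classical Radon-transform slice of the data distribution in the direction of the weight vector; a nonlinearity such as ReLU then reparametrizes or pools these level sets.

```python
import numpy as np

# Illustrative sketch, assuming a toy isotropic Gaussian input and one
# linear neuron w.x + b. The outputs' distribution is the pushforward of
# the data distribution along the level hyperplanes {x : w.x + b = t},
# i.e. a 1-D Radon-transform slice in direction w.
rng = np.random.default_rng(0)
X = rng.normal(size=(10000, 2))          # toy 2-D input distribution

w = np.array([1.0, 1.0]) / np.sqrt(2.0)  # unit normal of the hyperplanes
b = 0.0

pre = X @ w + b                          # linear projection (Radon slice)
out = np.maximum(pre, 0.0)               # ReLU pools all levels t <= 0 onto 0

# The projection of an isotropic standard Gaussian is again standard
# Gaussian, so pre.std() is close to 1, and the ReLU concentrates about
# half the probability mass at exactly 0.
print(pre.std())
print((out == 0.0).mean())
```

Histogramming `pre` recovers the sliced (projected) density for this direction; sweeping `w` over the unit circle sweeps out the full Radon transform of the data distribution, which is the integral-geometric picture the abstract refers to.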
