Rational neural networks

04/04/2020
by Nicolas Boullé, et al.

We consider neural networks with rational activation functions. The choice of the nonlinear activation function in deep learning architectures is crucial and heavily impacts the performance of a neural network. We establish optimal bounds in terms of network complexity and prove that rational neural networks approximate smooth functions more efficiently than ReLU networks. The flexibility and smoothness of rational activation functions make them an attractive alternative to ReLU, as we demonstrate with numerical experiments.
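The activation studied in the paper is a trainable rational function of type (3,2), i.e. a degree-3 polynomial divided by a degree-2 polynomial, whose coefficients are learned jointly with the network weights. Below is a minimal PyTorch sketch of such a unit. The coefficient initialization and the absolute-value guard that keeps the denominator away from zero are illustrative assumptions for this sketch, not the authors' scheme; the paper initializes the coefficients so that the rational function approximates ReLU.

```python
import torch
import torch.nn as nn


class RationalActivation(nn.Module):
    """Trainable rational activation r(x) = P(x) / Q(x) with
    deg P = 3 and deg Q = 2 (type (3,2), as in the paper).
    Initialization values here are illustrative placeholders,
    not the paper's ReLU-approximating initialization."""

    def __init__(self):
        super().__init__()
        # Numerator coefficients a0 + a1*x + a2*x^2 + a3*x^3.
        self.a = nn.Parameter(torch.tensor([0.0, 1.0, 0.0, 0.1]))
        # Denominator coefficients for the x and x^2 terms.
        self.b = nn.Parameter(torch.tensor([0.0, 0.1]))

    def forward(self, x):
        p = (self.a[0] + self.a[1] * x
             + self.a[2] * x**2 + self.a[3] * x**3)
        # Bound the denominator away from zero so the activation
        # has no poles on the real line (an assumption of this
        # sketch, used for numerical safety).
        q = 1.0 + torch.abs(self.b[0] * x + self.b[1] * x**2)
        return p / q


# Usage: drop the unit in wherever ReLU would appear.
net = nn.Sequential(
    nn.Linear(2, 32),
    RationalActivation(),
    nn.Linear(32, 1),
)
```

Because each activation adds only a handful of trainable coefficients per layer, the extra parameter cost is negligible compared with the weight matrices, while the learned nonlinearity can adapt its shape during training.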
