Neural networks with superexpressive activations and integer weights

05/20/2021
by   Aleksandr Beknazaryan, et al.

An example of an activation function σ is given such that networks with activations {σ, ⌊·⌋}, integer weights, and a fixed architecture depending on d can approximate continuous functions on [0,1]^d. The range of integer weights required for ε-approximation of Hölder continuous functions is derived, which leads to a convergence rate of order n^{-2β/(2β+d)} log_2 n for neural network regression estimation of an unknown β-Hölder continuous function from n given samples.
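As a toy illustration only (not the paper's construction), the sketch below shows how a single layer with an integer weight and the floor activation ⌊·⌋ yields a piecewise-constant approximation of a function on [0,1]; the `resolution` parameter and the final rescaling are assumptions for the example, and the rescaling is applied outside the network for readability.

```python
import math

def floor_net(x, resolution=8):
    """Toy floor-activation unit: integer weight `resolution` scales the
    input, the floor activation quantizes it, and a rescaling (kept outside
    the integer-weight network for simplicity) maps it back to [0, 1].
    Approximates the identity on [0, 1] to within 1/resolution."""
    hidden = math.floor(resolution * x)  # integer weight + floor activation
    return hidden / resolution           # piecewise-constant output

# The approximation error is at most 1/resolution for any x in [0, 1):
xs = [i / 100 for i in range(100)]
max_err = max(abs(floor_net(x) - x) for x in xs)
```

Increasing `resolution` (i.e., allowing a larger range of integer weights) tightens the approximation, mirroring the trade-off in the abstract between weight range and ε-approximation accuracy.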
