Neural networks with superexpressive activations and integer weights
An example of an activation function σ is given such that networks with activations {σ, ⌊·⌋}, integer weights, and a fixed architecture depending on the dimension d can approximate continuous functions on [0,1]^d. The range of integer weights required for ε-approximation of Hölder continuous functions is derived, which leads to a convergence rate of order n^{-2β/(2β+d)}·log₂ n for neural network regression estimation of an unknown β-Hölder continuous function from n given samples.
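As a hedged illustration (not part of the paper), the stated rate n^{-2β/(2β+d)}·log₂ n can be evaluated numerically to see how the ambient dimension d slows convergence for a fixed sample size n and smoothness β; the function name below is hypothetical:

```python
import math

def convergence_rate(n: int, beta: float, d: int) -> float:
    """Rate of order n^(-2*beta/(2*beta + d)) * log2(n) for beta-Hölder regression."""
    return n ** (-2 * beta / (2 * beta + d)) * math.log2(n)

# For fixed n and beta, a larger dimension d gives a slower rate (a larger bound),
# reflecting the usual curse of dimensionality in the exponent 2*beta/(2*beta + d).
for d in (1, 5, 20):
    print(f"d = {d:2d}: rate ≈ {convergence_rate(10**6, beta=1.0, d=d):.4f}")
```

For β = 1 and n = 10⁶, the rate degrades by roughly two orders of magnitude between d = 1 and d = 20, which is why the exponent's dependence on d matters.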