SAU: Smooth activation function using convolution with approximate identities

09/27/2021
by Koushik Biswas, et al.

Well-known activation functions like ReLU or Leaky ReLU are non-differentiable at the origin. Over the years, many smooth approximations of ReLU have been proposed using various smoothing techniques. We propose new smooth approximations of a non-differentiable activation function by convolving it with approximate identities. In particular, we present smooth approximations of Leaky ReLU and show that they outperform several well-known activation functions across a variety of datasets and models. We call this function the Smooth Activation Unit (SAU). Replacing ReLU by SAU, we get a 5.12% improvement with the ShuffleNet V2 (2.0x) model on the CIFAR100 dataset.
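To make the construction concrete, the sketch below shows one way such a smoothed Leaky ReLU can be computed when the approximate identity is taken to be a Gaussian kernel: convolving Leaky ReLU with a zero-mean Gaussian of width `sigma` has a closed form in terms of the standard normal CDF and PDF. The class name `SAUSketch` and the parameters `alpha` (negative slope) and `sigma` (kernel width) are illustrative assumptions, not the paper's exact parameterisation or implementation.

```python
import math
import torch
import torch.nn as nn


class SAUSketch(nn.Module):
    """Sketch: Leaky ReLU smoothed by convolution with a Gaussian kernel.

    For Leaky ReLU f(y) = alpha*y + (1 - alpha)*max(y, 0), convolving with a
    Gaussian N(0, sigma^2) gives the closed form
        alpha*x + (1 - alpha) * (x * Phi(x/sigma) + sigma * phi(x/sigma)),
    where Phi and phi are the standard normal CDF and PDF.
    """

    def __init__(self, alpha: float = 0.01, sigma: float = 1.0):
        super().__init__()
        self.alpha = alpha   # negative-slope of the underlying Leaky ReLU (assumed name)
        self.sigma = sigma   # width of the Gaussian approximate identity (assumed name)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = x / self.sigma
        # Standard normal CDF and PDF evaluated at x / sigma.
        cdf = 0.5 * (1.0 + torch.erf(z / math.sqrt(2.0)))
        pdf = torch.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
        # Smoothed Leaky ReLU: linear part plus Gaussian-smoothed ReLU part.
        return self.alpha * x + (1.0 - self.alpha) * (x * cdf + self.sigma * pdf)


if __name__ == "__main__":
    act = SAUSketch(alpha=0.01, sigma=1.0)
    x = torch.linspace(-3.0, 3.0, 7)
    print(act(x))  # smooth everywhere; close to Leaky ReLU away from the origin
```

Away from the origin the output tracks Leaky ReLU closely, while near zero the Gaussian smoothing removes the kink, which is the property the abstract highlights.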
