Distinction Maximization Loss: Fast, Scalable, Turnkey, and Native Neural Networks Out-of-Distribution Detection simply by Replacing the SoftMax Loss

08/15/2019
by David Macêdo, et al.

Recently, many methods have been proposed to reduce neural network uncertainty. However, the techniques used in these solutions usually present severe drawbacks. In this paper, we argue that the low out-of-distribution detection performance of neural networks is mainly due to the anisotropy of the SoftMax loss. Therefore, we build an isotropic loss that reduces neural network uncertainty in a fast, scalable, turnkey, and native way. Our experiments show that our proposal typically outperforms ODIN by a large margin and usually produces results competitive with the state-of-the-art Mahalanobis method while avoiding its limitations.
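To make the "isotropic loss" idea concrete, below is a minimal PyTorch sketch of a distance-based classification head: logits are negative Euclidean distances to learnable class prototypes, so the decision depends only on distance in feature space rather than on the affine (dot-product) projection used by the usual SoftMax layer. The class name `IsotropicLoss`, the prototype initialization, and the absence of any scaling factor are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class IsotropicLoss(nn.Module):
    """Sketch of a distance-based (isotropic) replacement for the SoftMax loss.

    Logits are the negative Euclidean distances between feature vectors and
    learnable class prototypes, followed by standard cross-entropy.
    """

    def __init__(self, num_features: int, num_classes: int):
        super().__init__()
        # One learnable prototype per class in feature space.
        self.prototypes = nn.Parameter(torch.empty(num_classes, num_features))
        nn.init.normal_(self.prototypes, std=1.0)

    def forward(self, features: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # Pairwise Euclidean distances: shape (batch_size, num_classes).
        distances = torch.cdist(features, self.prototypes)
        # Negative distances act as logits; cross-entropy on top as usual.
        return F.cross_entropy(-distances, targets)
```

At inference time, an out-of-distribution score could then be derived directly from the same distances (for example, the minimum distance to any prototype), which is what makes this kind of approach "native" and turnkey: no extra training stages, input perturbations, or separate detectors are required.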
