Problem-dependent attention and effort in neural networks with an application to image resolution

01/05/2022
by Chris Rohlfs, et al.

This paper introduces a new neural network-based estimation approach inspired by the biological phenomenon whereby humans and animals vary the attention and effort they dedicate to a problem depending upon its difficulty. The proposed approach leverages alternative models' internal levels of confidence in their own predictions. If the least costly model is confident in its classification, that classification is used; if not, the model with the next lowest implementation cost is run, and so on. This use of successively more complex models, together with the models' internal propensity scores for evaluating their likelihood of being correct, makes it possible to substantially reduce resource use while maintaining high standards for classification accuracy. The approach is applied to the digit recognition problem from Google's Street View House Numbers dataset, using Multilayer Perceptron (MLP) neural networks trained on high- and low-resolution versions of the digit images. The algorithm examines the low-resolution images first, moving to higher-resolution images only if the classification from the initial low-resolution pass does not have a high degree of confidence. For the MLPs considered here, this sequential approach reduces resource usage by more than 50% without any sacrifice in classification accuracy.
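A minimal sketch of the confidence-gated cascade described in the abstract, not the authors' implementation: models are run from cheapest to most expensive, and the cascade stops as soon as the current model's top-class probability clears a threshold. The function and parameter names, the threshold value, and the stand-in "models" are illustrative assumptions; the paper's actual MLPs and decision rule may differ.

```python
import numpy as np

def cascade_classify(x_versions, models, threshold=0.9):
    """Run models from cheapest to most expensive, stopping early when the
    current model's top-class probability exceeds `threshold`.

    x_versions : list of inputs, one per model (e.g. low-res then high-res image)
    models     : list of callables returning class-probability vectors,
                 ordered by increasing implementation cost
    threshold  : minimum top-class probability required to stop early
    """
    probs = None
    for x, model in zip(x_versions, models):
        probs = model(x)                   # class-probability vector (softmax output)
        confidence = float(np.max(probs))  # internal propensity score of the top class
        if confidence >= threshold:
            break                          # confident enough: skip the costlier models
    return int(np.argmax(probs)), float(np.max(probs))

# Toy usage with random linear scorers standing in for the low- and
# high-resolution MLPs; purely illustrative.
if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def make_model(n_features, n_classes=10):
        W = rng.normal(size=(n_features, n_classes))
        def model(x):
            z = x @ W
            e = np.exp(z - z.max())
            return e / e.sum()             # softmax probabilities
        return model

    low_res = rng.normal(size=8 * 8)       # cheap, low-resolution input
    high_res = rng.normal(size=32 * 32)    # costly, high-resolution input
    models = [make_model(8 * 8), make_model(32 * 32)]

    label, conf = cascade_classify([low_res, high_res], models, threshold=0.9)
    print(f"predicted digit: {label}, confidence: {conf:.3f}")
```

In this setup the expected cost saving comes from how often the cheap model clears the threshold; the reported result, a reduction of more than 50% in resource usage with no accuracy loss, corresponds to the low-resolution model being confident on a large share of the images.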
