Anticipate, Ensemble and Prune: Improving Convolutional Neural Networks via Aggregated Early Exits

01/28/2023
by Simone Sarti, et al.

Today, artificial neural networks are the state of the art for solving a variety of complex tasks, especially in image classification. Such architectures consist of a sequence of stacked layers that extract useful information and pass it to a classifier to make accurate predictions. However, intermediate information within such models is often left unused. In other cases, such as in edge computing contexts, these architectures are divided into multiple partitions that are made functional by including early exits, i.e. intermediate classifiers, with the goal of reducing the computational and temporal load without excessively compromising classification accuracy. In this paper, we present Anticipate, Ensemble and Prune (AEP), a new training technique based on weighted ensembles of early exits, which aims at exploiting the information in the structure of networks to maximise their performance. Through a comprehensive set of experiments, we show that this approach can yield average accuracy improvements of up to 15%, that the internal pruning operation also allows reducing the number of parameters by up to 41% and the inference latency by 16%, and that AEP can learn weights that allow early exits to achieve better accuracy values than those obtained from single-output reference models.
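To make the ensembling idea concrete, here is a minimal sketch (not the authors' implementation) of how a weighted ensemble of early exits can combine predictions: each intermediate classifier produces class probabilities, and the final prediction is a convex combination of them using learned per-exit weights. All names, shapes, and weight values below are hypothetical.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def ensemble_predict(exit_logits, weights):
    """Combine per-exit class probabilities with per-exit weights.

    exit_logits: one logit vector per early exit (hypothetical values).
    weights: one non-negative weight per exit; in AEP these would be
             learned during training. Normalised here to sum to 1.
    """
    total = sum(weights)
    norm_w = [w / total for w in weights]
    probs = [softmax(l) for l in exit_logits]
    n_classes = len(exit_logits[0])
    # Weighted average of the exits' probability vectors.
    return [
        sum(w * p[c] for w, p in zip(norm_w, probs))
        for c in range(n_classes)
    ]

# Example: three early exits over 4 classes, with deeper exits
# (usually more accurate) given larger weights.
logits = [[2.0, 0.5, 0.1, 0.0],
          [1.5, 2.5, 0.2, 0.1],
          [0.3, 3.0, 0.4, 0.2]]
pred = ensemble_predict(logits, weights=[0.2, 0.3, 0.5])
```

In this toy example the first (shallowest) exit favours class 0, but the two deeper, more heavily weighted exits favour class 1, so the ensemble prediction is class 1; pruning in AEP would then correspond to dropping exits (and the layers feeding only them) whose learned weights are negligible.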
