IDK Cascades: Fast Deep Learning by Learning not to Overthink
Advances in deep learning have led to substantial increases in prediction accuracy but have been accompanied by increases in the cost of rendering predictions. We conjecture that for a majority of real-world inputs, the recent advances in deep learning have created models that effectively "over-think" on simple inputs. In this paper we revisit the question of how to effectively build model cascades to reduce prediction costs. While classic cascade techniques primarily leverage class asymmetry to reduce cost, we extend this approach to arbitrary multi-class prediction tasks. We introduce the "I Don't Know" (IDK) prediction cascades framework, a general framework for composing a set of pre-trained models to accelerate inference without a loss in prediction accuracy. We propose two search-based methods for constructing cascades as well as a new cost-aware objective within this framework. We evaluate these techniques on a range of both benchmark and real-world datasets and demonstrate that prediction cascades can reduce computation by 37%, yielding speedups in image classification tasks over state-of-the-art models without a loss in accuracy. Furthermore, on a driving motion prediction task evaluated on a large-scale autonomous driving dataset, prediction cascades achieved 95% accuracy when combined with human experts, while requiring human intervention on less than 30% of queries.
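To make the cascade idea concrete, the following is a minimal sketch of IDK-style cascade inference: cheap models answer when confident and otherwise say "I don't know," deferring to a more expensive stage (or a human expert). The function names, the max-softmax confidence rule, and the per-stage thresholds are illustrative assumptions, not the paper's exact formulation or API.

```python
# Illustrative sketch of an IDK-style prediction cascade.
# Assumptions (not from the paper): max softmax probability as the
# confidence signal and fixed per-stage thresholds.
from typing import Callable, Sequence, Tuple

import numpy as np

Model = Callable[[np.ndarray], np.ndarray]  # input -> class-probability vector


def cascade_predict(
    x: np.ndarray,
    models: Sequence[Model],          # ordered cheapest to most expensive
    thresholds: Sequence[float],      # one threshold per early-exit stage
) -> Tuple[int, int]:
    """Return (predicted_class, index_of_answering_stage)."""
    # Early stages answer only if confident enough; otherwise they "IDK"
    # and defer to the next, more expensive stage.
    for i, (model, tau) in enumerate(zip(models[:-1], thresholds)):
        probs = model(x)
        if probs.max() >= tau:
            return int(probs.argmax()), i
    # The final stage (largest model, or a human expert) always answers.
    probs = models[-1](x)
    return int(probs.argmax()), len(models) - 1


if __name__ == "__main__":
    # Toy two-stage cascade: a confident cheap model and an expensive fallback.
    cheap = lambda x: np.array([0.9, 0.1])
    expensive = lambda x: np.array([0.4, 0.6])
    pred, stage = cascade_predict(np.zeros(3), [cheap, expensive], thresholds=[0.8])
    print(pred, stage)  # answered by stage 0, so the expensive model is skipped
```

Under this kind of scheme, average inference cost drops because the expensive stage is only invoked on the fraction of inputs that the cheap stage abstains on; the paper's search-based methods and cost-aware objective govern how the stages and their abstention behavior are chosen.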