Cramnet: Layer-wise Deep Neural Network Compression with Knowledge Transfer from a Teacher Network
Neural networks accomplish amazing things, but they suffer from computational and memory bottlenecks that restrict their usage. Nowhere is this more visible than in the mobile space, where specialized hardware is being created just to satisfy the demand for neural networks. Previous studies have shown that neural networks have vastly more connections than they actually need to do their work. This thesis develops a method that can compress networks to less than 10% of their original size with minimal loss of accuracy, and without creating sparse networks that require special code to run.
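The "knowledge transfer from a teacher network" in the title most likely refers to training the compressed (student) network against the softened outputs of the original (teacher) network, in the style of standard knowledge distillation. The sketch below shows that soft-target loss under stated assumptions: the function names, the temperature value, and the use of plain NumPy are illustrative choices, not details taken from the thesis itself.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature softens the
    # distribution, exposing the teacher's "dark knowledge" about
    # relative class similarities.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL divergence between the softened teacher and student outputs,
    # scaled by T^2 so its gradient magnitude stays comparable to the
    # hard-label cross-entropy it is usually combined with.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (temperature ** 2) * kl.mean()

# Matching logits give zero loss; divergent logits give a positive loss.
teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[0.0, 0.0, 0.0]])
print(distillation_loss(teacher, teacher))   # 0.0
print(distillation_loss(student, teacher) > 0)  # True
```

A compressed student trained this way can recover accuracy that would be lost if it were trained on hard labels alone, which is what makes aggressive size reduction viable.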