Deep Learning: Generalization Requires Deep Compositional Feature Space Design

06/06/2017
by Mrinal Haloi, et al.

Generalization error reflects the discriminability and representation power of a deep model. In this work, we claim that feature space design using deep compositional functions plays a significant role in generalization, along with explicit and implicit regularization. We establish our claims with several image classification experiments. We show that the information loss due to convolution and max pooling can be mitigated by compositional design, improving generalization performance. We also show that learning rate decay acts as an implicit regularizer in deep model training.
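The abstract names two concrete mechanisms: stacking compositional feature transformations before each pooling step, and decaying the learning rate during training. The sketch below is a minimal PyTorch illustration of both ideas under assumptions of our own; the CompositionalBlock module, the channel widths, and the StepLR schedule are hypothetical choices for illustration, not the architecture or schedule used in the paper.

```python
import torch
import torch.nn as nn

class CompositionalBlock(nn.Module):
    """Stack several 3x3 conv layers before a single max-pooling step,
    so the pooled representation is a deep composition of features
    rather than a single convolution's output."""
    def __init__(self, in_channels, out_channels, depth=3):
        super().__init__()
        layers = []
        channels = in_channels
        for _ in range(depth):
            layers += [
                nn.Conv2d(channels, out_channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(out_channels),
                nn.ReLU(inplace=True),
            ]
            channels = out_channels
        self.features = nn.Sequential(*layers)
        self.pool = nn.MaxPool2d(kernel_size=2, stride=2)

    def forward(self, x):
        return self.pool(self.features(x))

# A small illustrative classifier built from compositional blocks.
model = nn.Sequential(
    CompositionalBlock(3, 64),
    CompositionalBlock(64, 128),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(128, 10),
)

# Learning rate decay as an implicit regularizer: a step schedule that
# lowers the learning rate as training progresses (values are illustrative).
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=5e-4)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

# In a training loop, call optimizer.step() per batch and
# scheduler.step() once per epoch to apply the decay.
```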
