Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification

02/06/2015
by Kaiming He, et al.

Rectified activation units (rectifiers) are essential for state-of-the-art neural networks. In this work, we study rectifier neural networks for image classification from two aspects. First, we propose a Parametric Rectified Linear Unit (PReLU) that generalizes the traditional rectified unit. PReLU improves model fitting with nearly zero extra computational cost and little overfitting risk. Second, we derive a robust initialization method that particularly considers the rectifier nonlinearities. This method enables us to train extremely deep rectified models directly from scratch and to investigate deeper or wider network architectures. Based on our PReLU networks (PReLU-nets), we achieve 4.94% top-5 test error on the ImageNet 2012 classification dataset. This is a 26% relative improvement over the ILSVRC 2014 winner (GoogLeNet, 6.66%). To our knowledge, our result is the first to surpass the reported human-level performance (5.1%) on this visual recognition challenge.
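For reference, a minimal sketch of the PReLU activation described in the abstract: f(y) = y for y > 0 and f(y) = a·y otherwise, where a is a learnable negative-slope coefficient (channel-wise in the paper). The NumPy helper name `prelu` is illustrative, not from the paper:

```python
import numpy as np

def prelu(y, a):
    """Parametric ReLU: f(y) = y if y > 0, else a * y.

    `a` is a learnable slope for the negative part (one per channel
    in the paper). With a = 0 this reduces to ReLU; with a small
    fixed a (e.g. 0.01) it reduces to Leaky ReLU.
    """
    return np.where(y > 0, y, a * y)
```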

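Likewise, a sketch of the initialization the abstract refers to (now widely known as "He initialization"): weights are drawn from a zero-mean Gaussian with standard deviation sqrt(2 / n_l), where n_l is the number of input connections to a unit in layer l, which the paper derives to keep activation variance stable through rectifier layers. The function name and signature here are assumptions for illustration:

```python
import numpy as np

def he_init(fan_in, fan_out, rng=None):
    """Draw a weight matrix from a zero-mean Gaussian with
    std = sqrt(2 / fan_in), as derived in He et al. (2015) for
    ReLU layers. (For PReLU with slope a, the paper's variant
    uses sqrt(2 / ((1 + a**2) * fan_in)).)
    """
    rng = rng or np.random.default_rng()
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(loc=0.0, scale=std, size=(fan_in, fan_out))
```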