A Two-Step Rule for Backpropagation

03/17/2023
by Ahmed Boughammoura, et al.

We present a simplified computational rule for the back-propagation formulas of artificial neural networks. In this work, we provide a generic two-step rule for the back-propagation algorithm in matrix notation. Moreover, this rule incorporates both the forward and backward phases of the computations involved in the learning process. Specifically, this recursive computing rule efficiently propagates the changes to all synaptic weights in the network, layer by layer. In particular, we use this rule to compute both the up and down partial derivatives of the cost function with respect to all the connections feeding into the output layer.
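The paper's specific two-step rule is not reproduced in this abstract. For orientation only, the sketch below shows standard matrix-notation backpropagation in NumPy, where the forward phase caches the layer activations and the backward phase propagates the cost gradients layer by layer back toward the input. All names here (e.g. forward_backward, sigmoid) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_backward(x, y, weights, biases):
    """One forward pass followed by one backward pass in matrix form.

    weights[l], biases[l] parameterize layer l; x is a column input and y
    the target.  Returns gradients of the quadratic cost with respect to
    every weight matrix and bias vector, computed layer by layer.
    """
    # Forward phase: cache the activations of every layer.
    a, activations = x, [x]
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
        activations.append(a)

    # Backward phase: start at the output layer and recurse downward.
    grads_W = [np.zeros_like(W) for W in weights]
    grads_b = [np.zeros_like(b) for b in biases]
    delta = (activations[-1] - y) * activations[-1] * (1 - activations[-1])
    grads_W[-1] = delta @ activations[-2].T
    grads_b[-1] = delta
    for l in range(2, len(weights) + 1):
        sp = activations[-l] * (1 - activations[-l])   # sigmoid'(z) in terms of a
        delta = (weights[-l + 1].T @ delta) * sp       # propagate the error one layer down
        grads_W[-l] = delta @ activations[-l - 1].T
        grads_b[-l] = delta
    return grads_W, grads_b

# Tiny usage example on random data (a 2-3-1 network).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
biases = [rng.standard_normal((3, 1)), rng.standard_normal((1, 1))]
gW, gb = forward_backward(rng.standard_normal((2, 1)), np.array([[1.0]]), weights, biases)
print([g.shape for g in gW])  # [(3, 2), (1, 3)]
```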
