Backpropagation through nonlinear units for all-optical training of neural networks

06/30/2020
by Thomas D. Barrett, et al.

Backpropagation through nonlinear neurons is an outstanding challenge to the field of optical neural networks and the major conceptual barrier to all-optical training schemes. Each neuron is required to exhibit a directionally dependent response to propagating optical signals, with the backwards response conditioned on the forward signal, which is highly non-trivial to implement optically. We propose a practical and surprisingly simple solution that uses saturable absorption to provide the network nonlinearity. We find that the backward propagating gradients required to train the network can be approximated in a pump-probe scheme that requires only passive optical elements. Simulations show that, with readily obtainable optical depths, our approach can achieve equivalent performance to state-of-the-art computational networks on image classification benchmarks, even in deep networks with multiple sequential gradient approximations. This scheme is compatible with leading optical neural network proposals and therefore provides a feasible path towards end-to-end optical training.
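To make the pump-probe idea concrete, the following is a minimal numerical sketch. It assumes an illustrative intensity-dependent saturable-absorber transmission T(I) = exp(-OD / (1 + I/I_sat)) and a field-level activation f(x) = x·sqrt(T(x²)); the functional form, the parameters OD and I_SAT, and the helper names are assumptions for illustration, not the exact physical model derived in the paper. The sketch compares the true derivative f'(x) with the transmission experienced by a weak counter-propagating probe through the absorber after it has been saturated by the forward (pump) signal, which is the quantity the all-optical scheme uses in place of the backward gradient.

```python
import numpy as np

# Illustrative saturable-absorber model (assumed form, not the paper's exact one):
# intensity transmission T(I) = exp(-OD / (1 + I / I_sat)).
OD = 2.0      # optical depth (assumed value)
I_SAT = 1.0   # saturation intensity (assumed value)

def transmission(intensity):
    """Intensity transmission of the saturable absorber at a given pump intensity."""
    return np.exp(-OD / (1.0 + intensity / I_SAT))

def forward_activation(x):
    """Field-level nonlinearity: forward (pump) field x is attenuated by sqrt(T(|x|^2))."""
    return x * np.sqrt(transmission(x**2))

def probe_gradient(x):
    """Pump-probe approximation of the backward gradient: a weak probe sees the
    absorber already saturated by the pump, so its field transmission
    sqrt(T(|x|^2)) is used in place of the true derivative f'(x)."""
    return np.sqrt(transmission(x**2))

def true_gradient(x, eps=1e-6):
    """Numerical derivative of the forward activation, for comparison."""
    return (forward_activation(x + eps) - forward_activation(x - eps)) / (2 * eps)

for xi in np.linspace(0.0, 3.0, 7):
    print(f"x={xi:4.2f}  f'(x)={true_gradient(xi):6.3f}  probe approx={probe_gradient(xi):6.3f}")
```

In this toy model the probe transmission tracks the true slope closely while the pump only weakly saturates the absorber, and the mismatch grows as saturation sets in; the abstract's claim is that, in simulation, this residual error does not prevent training even when several such approximations are chained through a deep network.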
