Knowledge Distillation Applied to Optical Channel Equalization: Solving the Parallelization Problem of Recurrent Connection

12/08/2022
by Sasipim Srivallapanondh, et al.

To circumvent the non-parallelizability of recurrent neural network-based equalizers, we propose knowledge distillation to recast the RNN into a parallelizable feedforward structure. The resulting feedforward equalizer reduces latency by 38% while degrading the Q-factor by only 0.5 dB.
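The abstract gives no implementation details, so the following is only a minimal sketch of the distillation idea in PyTorch. Everything here is an assumption for illustration: the teacher (`TeacherRNN`), student (`StudentFFNN`), layer sizes, tap count, and the `distill_step` loss weighting are hypothetical stand-ins, not the paper's actual models or hyperparameters.

```python
import torch
import torch.nn as nn

# Hypothetical teacher: a bidirectional GRU equalizer over a window of
# received symbols (I/Q pairs). Sizes are illustrative only.
class TeacherRNN(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(input_size=2, hidden_size=hidden,
                          batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 2)  # I/Q estimate of one symbol

    def forward(self, x):                      # x: (batch, n_taps, 2)
        h, _ = self.rnn(x)
        return self.head(h[:, h.size(1) // 2])  # equalize the center symbol

# Hypothetical student: a feedforward (MLP) equalizer. With no recurrent
# state, every input window can be processed in parallel.
class StudentFFNN(nn.Module):
    def __init__(self, n_taps=41, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_taps * 2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, x):                      # x: (batch, n_taps, 2)
        return self.net(x)

def distill_step(teacher, student, opt, x, y, alpha=0.5):
    """One distillation step: the student fits a blend of the teacher's
    soft outputs and the true transmitted symbols (alpha is illustrative)."""
    teacher.eval()
    with torch.no_grad():
        t_out = teacher(x)                     # soft targets from the RNN
    s_out = student(x)
    loss = (alpha * nn.functional.mse_loss(s_out, t_out)
            + (1 - alpha) * nn.functional.mse_loss(s_out, y))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

After distillation, only the feedforward student would be deployed, so received windows can be equalized in parallel rather than sequentially, which is the mechanism behind the latency reduction the abstract reports.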
