Neutron: An Implementation of the Transformer Translation Model and its Variants

03/18/2019
by   Hongfei Xu, et al.

The Transformer translation model is easier to parallelize and delivers better performance compared with recurrent seq2seq models, which makes it popular in both industry and the research community. In this work, we implement Neutron, which includes the Transformer model and several variants from recent research. It is easy to modify, provides competitive performance with interesting features, and maintains readability.
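The parallelism advantage mentioned above comes from self-attention: every position attends to all others in a single matrix product, rather than step by step as in a recurrent model. The following is a minimal illustrative sketch of scaled dot-product attention (the core Transformer operation) in NumPy; it is not Neutron's actual code, and all names here are our own.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (seq_len, d_k) arrays. All positions are processed in one
    # matrix multiply, which is what makes the Transformer easy to
    # parallelize compared with step-by-step recurrent seq2seq models.
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                   # (seq_len, seq_len)
    # Numerically stable row-wise softmax over attention scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                                # (seq_len, d_k)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))       # a toy "sentence" of 5 positions
out = scaled_dot_product_attention(x, x, x)
print(out.shape)                  # (5, 8)
```

In the full model this operation is applied with multiple heads and learned projections; the sketch keeps only the attention computation itself.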
