Character-Level Translation with Self-attention

04/30/2020
by Yingqiang Gao, et al.

We explore the suitability of self-attention models for character-level neural machine translation. We test the standard transformer model, as well as a novel variant in which the encoder block combines information from nearby characters using convolutions. We perform extensive experiments on WMT and UN datasets, testing both bilingual and multilingual translation to English with up to three input languages (French, Spanish, and Chinese). Our transformer variant consistently outperforms the standard transformer at the character level and converges faster while learning more robust character-level alignments.
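The abstract describes an encoder block that mixes information from nearby characters with convolutions before (or alongside) self-attention. Below is a minimal sketch of how such a convolution-augmented encoder block could look in PyTorch; the layer names, kernel size, and the exact placement of the convolution are assumptions made for illustration and are not taken from the paper.

```python
# Hypothetical sketch: a transformer encoder block whose input is first mixed
# locally across neighbouring characters by a depthwise 1D convolution, then
# globally by multi-head self-attention. Hyperparameters are placeholders.
import torch
import torch.nn as nn


class ConvSelfAttentionBlock(nn.Module):
    def __init__(self, d_model=256, n_heads=4, kernel_size=3, dropout=0.1):
        super().__init__()
        # Depthwise 1D convolution over the character axis (local mixing).
        self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                              padding=kernel_size // 2, groups=d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads,
                                          dropout=dropout, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.ReLU(),
                                nn.Linear(4 * d_model, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, key_padding_mask=None):
        # x: (batch, seq_len, d_model) character embeddings
        # 1) combine information from nearby characters via convolution
        c = self.conv(x.transpose(1, 2)).transpose(1, 2)
        x = self.norm1(x + self.dropout(c))
        # 2) global context via multi-head self-attention
        a, _ = self.attn(x, x, x, key_padding_mask=key_padding_mask)
        x = self.norm2(x + self.dropout(a))
        # 3) position-wise feed-forward sublayer
        x = self.norm3(x + self.dropout(self.ff(x)))
        return x


if __name__ == "__main__":
    block = ConvSelfAttentionBlock()
    chars = torch.randn(2, 50, 256)   # 2 sequences of 50 character embeddings
    print(block(chars).shape)         # torch.Size([2, 50, 256])
```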
