Off-the-Shelf Unsupervised NMT

11/06/2018
by Chris Hokamp, et al.

We frame unsupervised machine translation (MT) in the context of multi-task learning (MTL), combining insights from both lines of research. We leverage off-the-shelf neural MT architectures to train unsupervised MT models with no parallel data, and show that such models can achieve reasonably good performance, competitive with models purpose-built for unsupervised MT. Finally, we propose improvements that allow us to apply our models to English-Turkish, a truly low-resource language pair.
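To make the multi-task framing concrete, here is a minimal, hypothetical sketch of how unsupervised MT is commonly cast as alternating training objectives over monolingual data only. The two tasks shown (denoising autoencoding and back-translation) are the standard unsupervised-MT objectives in the literature; the scheduling function, names, and toy data are illustrative assumptions, not the paper's actual implementation.

```python
import random

def denoise(sentence, drop_prob=0.3, rng=random):
    """Corrupt a sentence by randomly dropping words.

    The denoising-autoencoder task trains the model to reconstruct
    the original sentence from this corrupted input.
    """
    words = sentence.split()
    kept = [w for w in words if rng.random() > drop_prob]
    # Keep at least one word so the example never goes empty.
    return " ".join(kept) if kept else words[0]

def training_schedule(mono_src, mono_tgt, steps):
    """Alternate the two unsupervised-MT objectives (toy sketch).

    Returns a list of (task_name, model_input, reference) tuples;
    a real system would compute a loss and update a shared model here.
    """
    batches = []
    for step in range(steps):
        if step % 2 == 0:
            # Task 1: denoising autoencoding on monolingual source text.
            s = mono_src[step % len(mono_src)]
            batches.append(("dae", denoise(s), s))
        else:
            # Task 2: back-translation — the current model would translate
            # the target sentence, producing a synthetic source; the pair
            # (synthetic source, original target) becomes training data.
            t = mono_tgt[step % len(mono_tgt)]
            batches.append(("backtranslate", "<synthetic source>", t))
    return batches

schedule = training_schedule(["the cat sat down"], ["kedi oturdu"], 4)
```

In this framing, each objective is simply one task in a multi-task schedule sharing a single model, which is what lets an off-the-shelf NMT architecture be reused without purpose-built components.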
