Document Context Neural Machine Translation with Memory Networks
We present a document-level neural machine translation model that takes both the source and target document contexts into account using memory networks. We model translation as a structured prediction problem with interdependencies among the observed and hidden variables, i.e., the source sentences and their unobserved target translations in the document. The resulting structured prediction problem is tackled with a neural translation model equipped with two memory components, one each for the source and the target, to capture document-level interdependencies. We train the model end-to-end and propose an iterative decoding algorithm based on block-coordinate descent. Experimental results and analysis on translating French, German, and Estonian documents into English show that our model is effective in exploiting both source and target document contexts to generate improved translations.
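The block-coordinate descent decoding described above can be illustrated with a minimal sketch: each sentence's translation is re-decoded while the current translations of the other sentences (the target-side memory) are held fixed, sweeping over the document until no translation changes. The function names, the `translate` interface, and the toy translator below are hypothetical illustrations, not the paper's actual implementation.

```python
# Hypothetical sketch of iterative decoding via block-coordinate descent.
# Each sentence is re-translated with the other sentences' source text
# (source memory) and current translations (target memory) held fixed.

def decode_document(src_sents, translate, n_iters=3):
    """translate(src, src_ctx, tgt_ctx) -> translation of one sentence.

    src_ctx: the other source sentences in the document,
    tgt_ctx: the current translations of the other sentences.
    """
    # Initialize with context-agnostic translations of each sentence.
    tgt = [translate(s, [], []) for s in src_sents]
    for _ in range(n_iters):
        changed = False
        for i, s in enumerate(src_sents):
            src_ctx = src_sents[:i] + src_sents[i + 1:]
            tgt_ctx = tgt[:i] + tgt[i + 1:]
            new = translate(s, src_ctx, tgt_ctx)
            if new != tgt[i]:
                tgt[i] = new
                changed = True
        if not changed:  # converged: a full sweep changed no block
            break
    return tgt

# Toy stand-in for the neural model with memory components:
# uppercases the source and records the target-context size.
def toy_translate(src, src_ctx, tgt_ctx):
    return f"{src.upper()}|{len(tgt_ctx)}"

doc = ["bonjour", "monde"]
print(decode_document(doc, toy_translate))
```

In the real model, `translate` would be the neural decoder conditioned on the two memory components; the sketch only shows the coordinate-wise update schedule and its convergence check.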