Open-sourced large language models (LLMs) have demonstrated remarkable e...
Most existing text generation models follow the sequence-to-sequence par...
Multilingual pre-trained language models have demonstrated impressive (z...
We present DualNER, a simple and effective framework to make full use of...
Contrastive learning has become a new paradigm for unsupervised sentence...
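The entry above refers to contrastive learning as a paradigm for unsupervised sentence representations. A minimal sketch of the InfoNCE-style objective typically used in that setting is below; the function name and NumPy implementation are illustrative assumptions, not taken from the paper, which assumes each sentence is encoded twice (e.g. under different dropout masks) so that matching rows form positive pairs.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.05):
    """InfoNCE contrastive loss over a batch of paired sentence embeddings.

    z1, z2: (batch, dim) arrays holding two encodings ("views") of the
    same batch of sentences; row i of z1 and row i of z2 are positives,
    and all other rows in the batch serve as in-batch negatives.
    """
    # Cosine similarity = dot product of L2-normalized embeddings.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = (z1 @ z2.T) / temperature          # (batch, batch) logits

    # Numerically stable log-softmax over each row.
    logits = sim - sim.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    # Cross-entropy with the positive pair (i, i) on the diagonal.
    return -np.mean(np.diag(log_probs))
```

Pulling matched pairs together while pushing apart the rest of the batch is what gives the learned sentence embeddings their uniformity and alignment properties.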
Despite low latency, non-autoregressive machine translation (NAT) suffer...
Modern neural machine translation (NMT) models have achieved competitive...
Pretrained language models (PLMs) trained on large-scale unlabeled corpu...
Multimodal machine translation (MMT), which mainly focuses on enhancing ...
Multi-modal neural machine translation (NMT) aims to translate source se...
Previous studies on domain adaptation for neural machine translation...
Sentence ordering aims to restore the original paragraph from a set of sen...