Large language models (LLMs) demonstrate remarkable ability to comprehen...
k-Nearest neighbor machine translation (kNN-MT) has attracted increasing...
Interactive Natural Language Processing (iNLP) has emerged as a novel pa...
Fine-grained information on translation errors is helpful for the transl...
Neural machine translation (NMT) is often criticized for failures that h...
In this paper, we present our submission to the sentence-level MQM bench...
In this report, we present our submission to the WMT 2022 Metrics Shared...
Product description generation is a challenging and under-explored task....
Recent literature focuses on utilizing the entity information in the sen...
We study dangling-aware entity alignment in knowledge graphs (KGs), whic...
Attribute-based Controlled Text Generation (CTG) refers to generating se...
The attention mechanism has become the dominant module in natural language p...
In this paper, we present our submission to Shared Metrics Task: RoBLEUR...
Translation quality evaluation plays a crucial role in machine translati...
Beam search is the most widely used decoding method for neural machine t...
Generative commonsense reasoning requires machines to generate sentences...
Interactive and non-interactive models are the two de-facto standard fram...
A good translation should not only translate the original content semant...
A well-known limitation of the pretrain-finetune paradigm lies in its inflex...
In open-domain conversational systems, it is important but challenging t...
Transformer is an attention-based neural network, which consists of two...
In this paper, we propose a novel data augmentation method, referred to ...
Ancient Chinese is the essence of Chinese culture. There are several nat...
Reading long documents to answer open-domain questions remains challengi...
The generation of humor is an under-explored and challenging problem. Pr...
News headline generation aims to produce a short sentence to attract rea...
In recent years, the automatic generation of classical Chinese poetry ha...
In this paper, we present a new sequence-to-sequence pre-training model...
In this work, we demonstrate a Chinese classical poetry generation syste...
As an important format of multimedia, music has filled almost everyone's...
Typical methods for unsupervised text style transfer often rely on two k...
Text infilling is defined as a task for filling in the missing part of a...
It has been previously observed that training Variational Recurrent Auto...
Ancient Chinese carries the wisdom and spiritual culture of the Chinese nati...
Recent studies in sequence-to-sequence learning demonstrate that RNN enc...
In many natural language generation tasks, incorporating additional know...