Large language models are trained in two stages: (1) unsupervised pretra...
Autoregressive transformers are spectacular models for short sequences b...
We propose a new two-stage pre-training framework for video-to-text gene...
Generative language models define distributions over sequences of tokens...
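(As a general sketch of what "defining a distribution over sequences of tokens" usually means, and not necessarily the exact formulation in the paper this snippet previews, such models typically factorize the sequence probability autoregressively:

$$p(x_1, \dots, x_T) = \prod_{t=1}^{T} p\bigl(x_t \mid x_1, \dots, x_{t-1}\bigr)$$

where each conditional is produced by the model given the preceding tokens.)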
Lack of factual correctness is an issue that still plagues state-of-the-...
We present a method for generating comparative summaries that highlights...
We introduce Nutri-bullets, a multi-document summarization task for heal...
Selecting input features of top relevance has become a popular method fo...
Natural language systems often rely on a single, potentially ambiguous i...
Response suggestion is an important task for building human-computer con...