Large language models (LLMs) have demonstrated impressive few-shot learn...
Chain of thought prompting successfully improves the reasoning capabilit...
Text-editing models have recently become a prominent alternative to seq2...
We present EdiT5 - a novel semi-autoregressive text-editing approach des...
This paper presents a simple recipe to train state-of-the-art multilingu...
We propose Masker, an unsupervised text-editing method for style transfe...
We present Felix — a flexible text-editing approach for generation, desi...
We propose LaserTagger - a sequence tagging approach that casts text gen...
Unsupervised pre-training of large neural models has recently revolution...
The softmax function on top of a final linear layer is the de facto meth...
We study cross-lingual sequence tagging with little or no labeled data i...
Generative Adversarial Networks (GANs) are a promising approach to langu...
Many popular form factors of digital assistants---such as Amazon Echo, Ap...
In this paper, we study recent neural generative models for text generat...
In this paper, we propose a method for training neural networks when we ...
Training deep neural networks requires massive amounts of training data,...
Despite the impressive improvements achieved by unsupervised deep neural...
This paper presents a novel approach for multi-lingual sentiment classif...
In this paper we explore the effect of architectural choices on learning...
In this paper, we propose convolutional neural networks for learning an ...
We introduce a globally normalized transition-based neural network model...
This paper presents a novel approach to recurrent neural network (RNN) r...