Natural Language Feedback (NLF) is an increasingly popular avenue to ali...
Various design settings for in-context learning (ICL), such as the choic...
Recent studies on transformer-based language models show that they can a...
Sustaining coherent and engaging narratives requires dialogue or storyte...
Language models (LMs) have recently shown remarkable performance on reas...
Recent methods demonstrate that data augmentation using counterfactual k...
In this paper, we present kogito, an open-source tool for generating com...
Understanding rich narratives, such as dialogues and stories, often requ...
Pretraining a language model (LM) on text has been shown to help various...
Even the largest neural networks make errors, and once-correct predictio...
Multilingual pre-trained language models perform remarkably well on cros...
Conditional set generation learns a mapping from an input sequence of to...
Automated fact-checking is a needed technology to curtail the spread of ...
Answering complex questions about textual narratives requires reasoning ...
While large pre-trained models have enabled impressive results on a vari...
One of the challenges faced by conversational agents is their inability ...
Natural language inference requires reasoning about contradictions, nega...
The problem of answering questions using knowledge from pre-trained lang...
We introduce GEM, a living benchmark for natural language Generation (NL...
Despite considerable advancements with deep neural language models (LMs)...
Providing natural language processing systems with commonsense knowledge...
Multimodal disinformation, from `deepfakes' to simple edits that deceive...
Recent years have brought about a renewed interest in commonsense repres...
Abductive and counterfactual reasoning, core abilities of everyday human...
Procedural texts often describe processes (e.g., photosynthesis and cook...
Understanding narratives requires dynamically reasoning about the implic...
Automatic KB completion for commonsense knowledge graphs (e.g., ATOMIC a...
Our goal is to better comprehend procedural text, e.g., a paragraph abou...
We introduce WIQA, the first large-scale dataset of "What if..." questio...
Counterfactual reasoning requires predicting how alternative events, con...
We introduce Cooperative Generator-Discriminator Networks (Co-opNet), a ...
Our goal is procedural text comprehension, namely tracking how the prope...
We present the first comprehensive study on automatic knowledge base con...
Large-scale learning of transformer language models has yielded improvem...
Comprehending procedural text, e.g., a paragraph describing photosynthes...
Understanding a narrative requires reading between the lines and reasoni...
Recurrent Neural Networks (RNNs) are powerful autoregressive sequence mo...
In this paper, we investigate the use of discourse-aware rewards with re...
We present deep communicating agents in an encoder-decoder architecture ...
Understanding procedural language requires anticipating the causal effec...