Much of the previous work towards digital agents for graphical user inte...
Internet links enable users to deepen their understanding of a topic by...
Formulating selective information needs results in queries that implicit...
We propose Conditional Adapter (CoDA), a parameter-efficient transfer le...
Large-scale multi-modal pre-training models such as CLIP and PaLI exhibi...
We introduce the task of recommendation set generation for entity-orient...
Despite their success, large pre-trained multilingual models have not co...
We study multi-answer retrieval, an under-explored problem that requires...
In many applications of machine learning, certain categories of examples...
We review the EfficientQA competition from NeurIPS 2020. The competition...
The traditional image captioning task uses generic reference captions to...
Multilingual question answering tasks typically assume answers exist in ...
We address the problem of extractive question answering using document-l...
We present a method to represent input texts by contextualizing them joi...
Language model pre-training has been shown to capture a surprising amoun...
Reading comprehension models have been successfully applied to extractiv...
Recent developments in NLP have been accompanied by large, expensive mod...
We present the zero-shot entity linking task, where mentions must be lin...
Recent work on open domain question answering (QA) assumes strong superv...
In this paper we study yes/no questions that are naturally occurring ---...
Hierarchical neural architectures are often used to capture long-distanc...
This technical note describes a new baseline for the Natural Questions. ...
We study approaches to improve fine-grained short answer Question Answer...
We introduce a new language representation model called BERT, which stan...
We introduce the syntactic scaffold, an approach to incorporating syntac...
Recent BIO-tagging-based neural semantic role labeling models are very h...
LSTMs were introduced to combat vanishing gradients in simple RNNs by au...
We introduce a fully differentiable approximation to higher-order infere...
We introduce a new type of deep contextualized word representation that...
We introduce the first end-to-end coreference resolution model and show ...
We introduce recurrent additive networks (RANs), a new gated RNN which i...
The reading comprehension task, which asks questions about a given eviden...
We introduce the first global recursive neural parsing model with optima...