Much of the previous work towards digital agents for graphical user interfaces...
Large-scale multi-modal pre-training models such as CLIP and PaLI exhibit...
Despite its importance to experimental design, statistical power (the probability...
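As a concrete illustration of the quantity at issue, here is a minimal simulation-based sketch of a power estimate: power is the probability that a test rejects the null hypothesis when a real effect of a given size is actually present. The effect size, group size, and significance level below are arbitrary example values, not numbers taken from the paper.

    import numpy as np
    from scipy.stats import ttest_ind

    def estimated_power(effect_size=0.5, n_per_group=50, alpha=0.05,
                        n_sims=5_000, seed=0):
        # Simulate many experiments in which a true effect of the given size
        # exists, and report the fraction in which a two-sample t-test rejects
        # the null hypothesis at level alpha, i.e. the estimated power.
        rng = np.random.default_rng(seed)
        rejections = 0
        for _ in range(n_sims):
            control = rng.normal(0.0, 1.0, n_per_group)
            treatment = rng.normal(effect_size, 1.0, n_per_group)
            _, p_value = ttest_ind(treatment, control)
            rejections += p_value < alpha
        return rejections / n_sims

With these illustrative settings (a standardized effect of 0.5 and 50 items per group), the estimate comes out near 0.70, so such a design would fail to detect a real medium-sized effect roughly 30% of the time.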
We introduce k-nearest-neighbor machine translation (kNN-MT), which predicts...
We introduce kNN-LMs, which extend a pre-trained neural language model (LM)...
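The two entries above rest on the same underlying mechanism: cache (context representation, next token) pairs from training data in a datastore, retrieve the k nearest contexts at test time, and interpolate the resulting nearest-neighbor distribution with the base model's own distribution. Below is a minimal NumPy sketch of that interpolation; the datastore contents, dimensions, and hyperparameters are placeholders, and the papers use a FAISS index over real model hidden states rather than this brute-force search.

    import numpy as np

    # Placeholder datastore: in kNN-LM / kNN-MT the keys are hidden states of a
    # trained model for each training context, and the values are the tokens
    # that actually followed those contexts. Sizes and contents here are made up.
    VOCAB_SIZE = 32_000
    keys = np.random.randn(10_000, 512).astype(np.float32)
    values = np.random.randint(0, VOCAB_SIZE, size=10_000)

    def knn_distribution(query, k=8, temperature=1.0):
        # Retrieve the k nearest cached contexts (brute-force L2 here) and turn
        # their stored next tokens into a distribution weighted by a softmax of
        # negative distances.
        dists = np.sum((keys - query) ** 2, axis=1)
        nearest = np.argsort(dists)[:k]
        weights = np.exp(-dists[nearest] / temperature)
        weights /= weights.sum()
        p_knn = np.zeros(VOCAB_SIZE)
        for idx, w in zip(nearest, weights):
            p_knn[values[idx]] += w
        return p_knn

    def interpolated_next_token_probs(p_model, query, lam=0.25):
        # Final prediction: lam * p_kNN + (1 - lam) * p_model, as in kNN-LM/kNN-MT.
        return lam * knn_distribution(query) + (1.0 - lam) * p_model

Values such as k = 8 and lam = 0.25 above are illustrative placeholders, not the papers' tuned settings.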
It can be challenging to train multi-task neural networks that outperform...
Large pre-trained neural networks such as BERT have had great recent success...
Language model (LM) pre-training has resulted in impressive performance...
We know very little about how neural language models (LM) use prior linguistic...