Instruction-tuning has become an integral part of training pipelines for...
Massively multilingual pretrained transformers (MMTs) have tremendously...
Transferring information retrieval (IR) models from a high-resource lang...
This paper introduces our proposed system for the MIA Shared Task on Cro...
State-of-the-art neural (re)rankers are notoriously data-hungry, which - ...
In this work we present a systematic empirical study focused on the suit...
Pretrained multilingual text encoders based on neural Transformer archit...
The success of large pretrained language models (LMs) such as BERT and R...
Current methods of cross-lingual parser transfer focus on predicting the...
Cross-lingual word embeddings (CLEs) enable multilingual modeling of mea...
We propose a fully unsupervised framework for ad-hoc cross-lingual infor...