Pre-trained multilingual language models underpin a large portion of mod...
Although Transformers perform well in NLP tasks, recent studies sugg...
Feature attribution aims to explain the reasoning behind a black-box mod...
We investigate the extent to which verb alternation classes, as describe...
To what extent do pre-trained language models grasp semantic knowledge r...
We formulate and test a technique to use Emergent Communication (EC) wit...
We investigate the semantic knowledge of language models (LMs), focusing...
We introduce a multilabel probing task to assess the morphosyntactic rep...
Segmentation remains an important preprocessing step both in languages w...
This work presents methods for learning cross-lingual sentence represent...
Agent-based models and signalling games are useful tools with which to s...
Although large-scale pretrained language models, such as BERT and RoBERT...
We propose a general framework to study language emergence through signa...
All natural languages exhibit a distinction between content words (like ...
How are the meanings of linguistic expressions related to their use in c...
We study the role of linguistic context in predicting quantifiers (`few'...