The goal of this study is to investigate whether a Transformer-based neu...
The development of machines that «talk like us», also known as Natural L...
People constantly use language to learn about the world. Computational l...
Word order, an essential property of natural languages, is injected in T...
Both humans and neural language models are able to perform subject-verb ...
A central quest of probing is to uncover how pre-trained models encode a...
Although transformer-based Neural Language Models demonstrate impressive...
Prior research has explored the ability of computational models to predi...
Distributional semantics has deeply changed in the last decades. First, ...
Most compositional distributional semantic models represent sentence mea...
Despite the number of NLP studies dedicated to thematic fit estimation, ...
In this paper, we introduce a new distributional method for modeling pre...
In Distributional Semantic Models (DSMs), Vector Cosine is widely used t...
Several studies on sentence processing suggest that the mental lexicon k...
In this paper, we claim that vector cosine, which is generally considere...
In this paper, we describe ROOT13, a supervised system for the classific...
ROOT9 is a supervised system for the classification of hypernyms, co-hyp...
In this paper, we claim that Vector Cosine, which is generally considere...