Current pre-trained language models have enabled remarkable improvements...
A critical component of competence in language is being able to identify...
A characteristic feature of human semantic memory is its ability to not ...
To what extent can experience from language contribute to our conceptual...
While sentence anomalies have been applied periodically for testing in N...
As pre-trained language models (LMs) continue to dominate NLP, it is inc...
Pre-trained LMs have shown impressive performance on downstream NLP task...
Pre-trained transformer language models have shown remarkable performanc...
Building on research arguing for the possibility of conceptual and categ...
Deep transformer models have pushed performance on NLP tasks to new limi...
Models trained to estimate word probabilities in context have become ubi...
Long document coreference resolution remains a challenging task due to t...
Fine-tuning a pretrained transformer for a downstream task has become a ...
We propose PeTra, a memory-augmented neural network designed to track en...
Although models using contextual word embeddings have achieved state-of-...
Pre-training by language modeling has become a popular and successful ap...
An important component of achieving language understanding is mastering ...
This paper presents a summary of the first Workshop on Building Linguist...