While natural languages differ widely in both canonical word order and w...
We advance an information-theoretic model of human language processing i...
Because meaning can often be inferred from lexical semantics alone, word...
The combinatorial power of language has historically been argued to be e...
We introduce a theoretical framework for understanding and predicting th...
We investigate how Multilingual BERT (mBERT) encodes grammar by examinin...
Languages vary in their placement of multiple adjectives before, after, ...
Humans can learn structural properties about a word from minimal experie...
Deep learning sequence models have led to a marked increase in performan...
Recurrent Neural Networks (RNNs) trained on a language modeling task hav...
We deploy the methods of controlled psycholinguistic experimentation to ...
State-of-the-art LSTM language models trained on large corpora learn seq...
RNN language models have achieved state-of-the-art results on various ta...
Recurrent neural networks (RNNs) are the state of the art in sequence mo...
RNN language models have achieved state-of-the-art perplexity results an...
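Several of the abstracts above evaluate language models by perplexity. As a reminder of the metric, here is a minimal sketch; the function name and the example probabilities are illustrative, not taken from any of the cited papers:

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp of the average negative log-probability per token."""
    n = len(token_logprobs)
    avg_nll = -sum(token_logprobs) / n
    return math.exp(avg_nll)

# A model assigning probability 0.25 to each of 4 tokens
# has perplexity 4 (uniform over 4 outcomes).
logps = [math.log(0.25)] * 4
print(perplexity(logps))  # ≈ 4.0
```

Lower perplexity means the model assigns higher probability to the held-out text, which is the sense in which the abstracts claim "state-of-the-art perplexity results."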
A frequent object of study in linguistic typology is the order of elemen...
It is now a common practice to compare models of human language processi...
We address recent criticisms (Liu et al., 2015; Ferrer-i-Cancho and Góme...