In the last half-decade, the field of natural language processing (NLP) ...
In this paper, we evaluate the translation of negation both automaticall...
Standard models for syntactic dependency parsing take words to be the el...
Since the popularization of the Transformer as a general-purpose feature...
Recent work has shown that deeper character-based neural machine transla...
We study the effect of rich supertag features in greedy transition-based...
We generalize principal component analysis for embedding words into a ve...
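As a point of reference for what is being generalized, the sketch below applies ordinary PCA to a toy word-context co-occurrence matrix to obtain low-dimensional word vectors. The matrix, the word list, and the target dimension k are illustrative assumptions, not data or settings from the paper.

    # Minimal sketch: ordinary PCA on a word-context co-occurrence matrix,
    # i.e. the standard baseline that a generalized PCA embedding extends.
    # The toy counts and the target dimension k are assumptions.
    import numpy as np

    words = ["coffee", "tea", "parser", "treebank"]
    # Rows: words; columns: context features (toy counts, not real data).
    C = np.array([
        [8.0, 6.0, 0.0, 1.0],
        [7.0, 9.0, 1.0, 0.0],
        [0.0, 1.0, 9.0, 7.0],
        [1.0, 0.0, 6.0, 8.0],
    ])

    k = 2                                  # target embedding dimension
    C_centered = C - C.mean(axis=0)        # centre columns before PCA
    U, S, Vt = np.linalg.svd(C_centered, full_matrices=False)
    embeddings = C_centered @ Vt[:k].T     # project onto the top-k principal axes

    for w, v in zip(words, embeddings):
        print(f"{w:10s} {np.round(v, 2)}")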
We present Køpsala, the Copenhagen-Uppsala system for the Enhanced Unive...
Recent work on the interpretability of deep neural language models has c...
Universal Dependencies is an open community effort to create cross-lingu...
Neural machine translation (NMT) has achieved new state-of-the-art perfo...
Transition-based and graph-based dependency parsers have previously been...
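For readers who want a concrete picture of the transition-based side of this comparison, the sketch below runs the well-known arc-standard transition system on a toy sentence. The sentence, the oracle transition sequence, and the function name are illustrative assumptions, not taken from the systems compared in the paper.

    # Minimal sketch of the arc-standard transition system, shown only to
    # illustrate what "transition-based parsing" means; the toy sentence and
    # the hand-written oracle sequence are assumptions.
    def parse(words, transitions):
        stack, buffer, arcs = ["ROOT"], list(words), []
        for t in transitions:
            if t == "SHIFT":          # move the next input word onto the stack
                stack.append(buffer.pop(0))
            elif t == "LEFT-ARC":     # top of stack becomes head of the word below it
                dep = stack.pop(-2)
                arcs.append((stack[-1], dep))
            elif t == "RIGHT-ARC":    # word below becomes head of the top of the stack
                dep = stack.pop()
                arcs.append((stack[-1], dep))
        return arcs

    words = ["Mary", "likes", "tea"]
    seq = ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC", "RIGHT-ARC"]
    print(parse(words, seq))
    # [('likes', 'Mary'), ('likes', 'tea'), ('ROOT', 'likes')]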
In this paper, we try to understand neural machine translation (NMT) via...
This article is a linguistic investigation of a neural parser. We look a...
The need for tree structure modelling on top of sequence modelling is an...
Recent work has shown that the encoder-decoder attention mechanisms in n...
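For reference, the encoder-decoder attention under discussion is an instance of scaled dot-product attention; the sketch below spells out that computation on toy tensors. The shapes and random inputs are assumptions for illustration only.

    # Minimal sketch of single-head scaled dot-product attention, the
    # mechanism whose encoder-decoder instances are analysed above.
    # Q, K, V are random toy tensors, not model weights.
    import numpy as np

    def attention(Q, K, V):
        d_k = K.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)    # similarity of each query to each key
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)   # softmax over source positions
        return weights @ V                 # weighted sum of source (encoder) values

    rng = np.random.default_rng(0)
    Q = rng.normal(size=(2, 4))   # 2 decoder positions
    K = rng.normal(size=(3, 4))   # 3 encoder positions
    V = rng.normal(size=(3, 4))
    print(attention(Q, K, V).shape)   # (2, 4)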
We present the Uppsala system for the CoNLL 2018 Shared Task on universa...
We provide a comprehensive analysis of the interactions between pre-trai...
Word segmentation is a low-level NLP task that is non-trivial for a cons...
In this paper, we apply different NMT models to the problem of historica...
How to make the most of multiple heterogeneous treebanks when training a...
Sentences with gapping, such as Paul likes coffee and Mary tea, lack an ...
We present a character-based model for joint segmentation and POS taggin...
We study the use of greedy feature selection methods for morphosyntactic...