Pre-trained language models (PLMs) have achieved great success in NLP an...
In the last five years, there has been a significant focus in Natural La...
This paper gives a general description of the ideas behind the Parallel ...
Discourse Representation Theory (DRT) is a formal account for representi...
We combine character-level and contextual language model representations...
Meaning banking -- creating a semantically annotated corpus for the purpose...
The paper presents the IWCS 2019 shared task on semantic parsing where t...
Programming in Prolog is hard for programmers that are used to procedura...
Idiomatic expressions like `out of the woods' and `up the ante' present ...
The AMR (Abstract Meaning Representation) formalism for representing mea...
Monotonicity reasoning is one of the important reasoning skills for any ...
Large crowdsourced datasets are widely used for training and evaluating ...
Neural methods have had several recent successes in semantic parsing, th...
We investigate the effects of multi-task learning using the recently int...
Semantic parsing offers many opportunities to improve natural language u...
The paper proposes the task of universal semantic tagging---tagging word...
We evaluate the character-level translation method for neural semantic p...
We evaluate a semantic parser based on a character-based sequence-to-seq...
The Parallel Meaning Bank is a corpus of translations annotated with sha...