AMR Parsing as Sequence-to-Graph Transduction

05/21/2019
by Sheng Zhang, et al.

We propose an attention-based model that treats AMR parsing as sequence-to-graph transduction. Unlike most AMR parsers that rely on pre-trained aligners, external semantic resources, or data augmentation, our proposed parser is aligner-free, and it can be effectively trained with limited amounts of labeled AMR data. Our experimental results outperform all previously reported SMATCH scores on both AMR 2.0 (76.3 F1) and AMR 1.0 (70.2 F1).
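To make the sequence-to-graph idea concrete, here is a toy sketch (an illustration only, not the authors' architecture): a decoder builds the graph one node at a time, and an attention-style pointer over the previously generated nodes decides which existing node the new node attaches to. All embeddings and names below are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_pointer(query, keys):
    """Score each existing node as a candidate head via dot-product
    attention, returning a probability distribution (softmax)."""
    scores = keys @ query
    probs = np.exp(scores - scores.max())  # subtract max for stability
    return probs / probs.sum()

# Hypothetical embeddings for a partially built graph (3 nodes, dim 4)
# and a query vector for the node about to be generated.
node_embeddings = rng.normal(size=(3, 4))
new_node_query = rng.normal(size=4)

head_probs = attention_pointer(new_node_query, node_embeddings)
head = int(np.argmax(head_probs))  # attach the new node under this head
print(f"attach under node {head}, distribution {head_probs.round(3)}")
```

Because the head is chosen by attention over the decoder's own history rather than by a word-to-node alignment table, a parser built this way needs no pre-trained aligner, which is the property the abstract highlights.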

