Unsupervised Dependency Parsing: Let's Use Supervised Parsers

04/18/2015
by Phong Le, et al.

We present a self-training approach to unsupervised dependency parsing that reuses existing supervised and unsupervised parsing algorithms. Our approach, called 'iterated reranking' (IR), starts with dependency trees generated by an unsupervised parser, and iteratively improves these trees using the richer probability models used in supervised parsing that are in turn trained on these trees. Our system achieves accuracy 1.8% higher than the parser of Spitkovsky et al. (2013) on the WSJ corpus.
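The abstract outlines the iterated reranking loop only at a high level. The sketch below illustrates one plausible reading of that loop; the callables `init_parser`, `train_model`, and `k_best` are hypothetical placeholders (not from the paper) standing in for an unsupervised parser, a supervised-style model trainer, and a k-best candidate generator.

```python
from typing import Any, Callable, List

Tree = Any  # placeholder for a dependency-tree structure


def iterated_reranking(
    sentences: List[str],
    init_parser: Callable[[List[str]], List[Tree]],
    train_model: Callable[[List[Tree]], Callable[[str, Tree], float]],
    k_best: Callable[[Callable[[str, Tree], float], str, int], List[Tree]],
    k: int = 10,
    n_iters: int = 5,
) -> List[Tree]:
    """Minimal sketch of an iterated-reranking loop, under the assumptions above."""
    # Step 0: seed with trees produced by an unsupervised parser.
    trees = init_parser(sentences)

    for _ in range(n_iters):
        # Train a richer (supervised-style) probability model on the current trees.
        score = train_model(trees)
        # Rerank: for each sentence, keep the highest-scoring candidate tree.
        trees = [
            max(k_best(score, sent, k), key=lambda tree: score(sent, tree))
            for sent in sentences
        ]
    return trees
```

The loop alternates between training a model on the current trees and letting that model pick better trees, which is the self-training idea the abstract describes; details such as the candidate generator and stopping criterion are assumptions here.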
