Minimal Supervision for Morphological Inflection

04/17/2021
by Omer Goldman, et al.

Neural models for the various flavours of morphological inflection tasks have proven to be extremely accurate given ample labeled data, but such data may be slow and costly to obtain. In this work we aim to overcome this annotation bottleneck by bootstrapping labeled data from a seed of as few as five labeled paradigms, accompanied by a large amount of unlabeled text. Our approach exploits different kinds of regularities in morphological systems in a two-phased setup, where word tagging based on analogies is followed by word pairing based on distances. We experiment with the Paradigm Cell Filling Problem over eight typologically different languages, and find that, in languages with relatively simple morphology, orthographic regularities on their own allow inflection models to achieve respectable accuracy. Combining orthographic and semantic regularities alleviates difficulties with particularly complex morpho-phonological systems. Our results suggest that hand-crafting many tagged examples may be an unnecessary effort; however, more work is needed to address rarely used forms.
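The abstract only sketches the two-phased setup at a high level. Purely as an illustration of the general idea, and not the paper's actual algorithm, the toy Python below shows how a tiny seed of labeled paradigms plus raw text could drive analogy-based tagging followed by distance-based pairing, using orthographic regularities alone. The function names, seed data, and suffix-rule heuristic are all hypothetical.

```python
# Illustrative toy sketch of a two-phase bootstrap:
#   Phase 1: hypothesise an inflected form for unlabeled words via analogy
#            with a handful of seed paradigms (here, crude suffix rules).
#   Phase 2: pair each word with the attested vocabulary item closest to
#            its hypothesised form, by string similarity.
# This is NOT the procedure from the paper; it only mirrors its outline.

from difflib import SequenceMatcher

# Hypothetical seed: a few labeled (lemma, inflected form) pairs.
SEED = [("walk", "walked"), ("play", "played"), ("jump", "jumped")]

# Hypothetical unlabeled vocabulary harvested from raw text.
VOCAB = ["talk", "talked", "paint", "painted", "cook", "cooked"]


def analogy_rules(seed):
    """Extract crude suffix-rewrite rules (e.g. '' -> 'ed') from seed pairs."""
    rules = set()
    for lemma, form in seed:
        i = 0
        while i < min(len(lemma), len(form)) and lemma[i] == form[i]:
            i += 1
        rules.add((lemma[i:], form[i:]))  # (lemma suffix, form suffix)
    return rules


def tag_candidates(vocab, rules):
    """Phase 1: hypothesise an inflected form for each unlabeled word."""
    hypotheses = {}
    for word in vocab:
        for old, new in rules:
            if word.endswith(old):
                hypotheses[word] = word[: len(word) - len(old)] + new
    return hypotheses


def pair_by_distance(vocab, hypotheses):
    """Phase 2: pair each word with the attested item closest to its hypothesis."""
    pairs = {}
    for lemma, hypo in hypotheses.items():
        best = max(
            (w for w in vocab if w != lemma),
            key=lambda w: SequenceMatcher(None, hypo, w).ratio(),
        )
        pairs[lemma] = best
    return pairs


if __name__ == "__main__":
    rules = analogy_rules(SEED)
    hypotheses = tag_candidates(VOCAB, rules)
    print(pair_by_distance(VOCAB, hypotheses))  # prints hypothesised word pairs
```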
