Towards Realistic Practices In Low-Resource Natural Language Processing: The Development Set

09/04/2019
by   Katharina Kann, et al.

Development sets are impractical to obtain for real low-resource languages, since using all available data for training is often more effective. However, development sets are widely used in research papers that purport to deal with low-resource natural language processing (NLP). Here, we aim to answer the following questions: Does using a development set for early stopping in the low-resource setting influence results as compared to a more realistic alternative, where the number of training epochs is tuned on development languages? And does it lead to overestimation or underestimation of performance? We repeat multiple experiments from recent work on neural models for low-resource NLP and compare results for models obtained by training with and without development sets. On average over languages, absolute accuracy differs by up to 1.4, and differences for individual languages can be as big as 18.0. We therefore advocate for realistic experimental setups in the publication of low-resource NLP research results.
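The two training regimes contrasted in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the function names and the toy per-epoch accuracy curves are assumptions made for clarity. Regime (a) picks the stopping epoch using a target-language development set; regime (b) fixes the epoch budget in advance by tuning it on separate development languages, so no target-language development data is needed.

```python
# Hypothetical sketch of the two regimes; all names and curves are illustrative.

def best_epoch_with_dev(dev_acc_per_epoch):
    """Regime (a): stop at the epoch that maximizes target-language dev accuracy."""
    return max(range(len(dev_acc_per_epoch)), key=lambda e: dev_acc_per_epoch[e])

def fixed_epoch_from_dev_languages(dev_language_curves):
    """Regime (b): average the best epoch over development languages,
    then reuse that fixed epoch count for the target language."""
    best = [max(range(len(c)), key=lambda e: c[e]) for c in dev_language_curves]
    return round(sum(best) / len(best))

# Toy per-epoch dev accuracies (assumed, not taken from the paper).
target_curve = [0.40, 0.55, 0.62, 0.61, 0.58]
dev_langs = [[0.30, 0.50, 0.54, 0.52],
             [0.45, 0.60, 0.66, 0.65, 0.63]]

print(best_epoch_with_dev(target_curve))          # epoch chosen with a dev set
print(fixed_epoch_from_dev_languages(dev_langs))  # epoch chosen without one
```

The paper's question is how much accuracy changes when regime (b) replaces regime (a); the sketch only makes the difference between the two selection rules concrete.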
