Chinese Grammatical Correction Using BERT-based Pre-trained Model

11/04/2020
by Hongfei Wang, et al.

In recent years, pre-trained models have been extensively studied, and several downstream tasks have benefited from them. In this study, we verify the effectiveness of two methods that incorporate a BERT-based pre-trained model developed by Cui et al. (2020) into an encoder-decoder model on Chinese grammatical error correction tasks. We also analyze the error types and conclude that sentence-level errors remain to be addressed.
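One common way to incorporate a BERT-based pre-trained model into an encoder-decoder architecture is to use the BERT network as the encoder of a sequence-to-sequence correction model. The sketch below illustrates this wiring with HuggingFace Transformers' generic `EncoderDecoderModel`; it is an assumption-laden toy, not the authors' code. Tiny randomly initialized configs keep it runnable offline, whereas a real setup would load pre-trained weights (e.g., Cui et al.'s Chinese BERT via `EncoderDecoderModel.from_encoder_decoder_pretrained`).

```python
# Conceptual sketch only: BERT-style encoder wired to a BERT-style decoder
# with cross-attention, as in encoder-decoder grammatical error correction.
# All sizes are toy values; weights are random, not pre-trained.
import torch
from transformers import BertConfig, EncoderDecoderConfig, EncoderDecoderModel

def build_toy_bert_seq2seq(vocab_size=128):
    enc_cfg = BertConfig(vocab_size=vocab_size, hidden_size=32,
                         num_hidden_layers=2, num_attention_heads=2,
                         intermediate_size=64)
    dec_cfg = BertConfig(vocab_size=vocab_size, hidden_size=32,
                         num_hidden_layers=2, num_attention_heads=2,
                         intermediate_size=64,
                         is_decoder=True, add_cross_attention=True)
    cfg = EncoderDecoderConfig.from_encoder_decoder_configs(enc_cfg, dec_cfg)
    return EncoderDecoderModel(config=cfg)

model = build_toy_bert_seq2seq()
src = torch.randint(0, 128, (1, 8))   # ids of an erroneous source sentence
tgt = torch.randint(0, 128, (1, 8))   # ids of the (shifted) corrected target
out = model(input_ids=src, decoder_input_ids=tgt)
print(out.logits.shape)  # one logit vector over the vocabulary per target token
```

In practice the encoder (and optionally the decoder) is initialized from the pre-trained checkpoint and the whole model is fine-tuned on parallel erroneous/corrected sentence pairs.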
