A More Efficient Chinese Named Entity Recognition Based on BERT and Syntactic Analysis

01/11/2021
by Xiao Fu, et al.

We propose a new named entity recognition (NER) method that makes effective use of the results of part-of-speech (POS) tagging, Chinese word segmentation (CWS), and parsing, while avoiding NER errors caused by POS tagging errors. This paper first uses the Stanford natural language processing (NLP) toolkit to annotate large-scale untagged data, reducing the dependence on tagged data; then a new NLP model, the g-BERT model, is designed to compress the Bidirectional Encoder Representations from Transformers (BERT) model and reduce its computational cost; finally, the model is evaluated on a Chinese NER dataset. The experimental results show that the computational cost of the g-BERT model is reduced by about 60% compared with the BERT model.
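The abstract does not describe g-BERT's architecture or how the roughly 60% reduction is achieved. As a purely illustrative sketch of why compressing a BERT-style encoder shrinks its size and cost, here is a back-of-the-envelope parameter count; the "compressed" configuration below is hypothetical and is not claimed to match g-BERT:

```python
# Back-of-the-envelope Transformer encoder parameter count. The g-BERT
# architecture is not described in the abstract; the "small" settings below
# are hypothetical, chosen only to show how shrinking depth and width
# reduces model size.

def encoder_params(layers, hidden, vocab, max_pos=512, ffn_mult=4):
    """Approximate parameter count for a BERT-style encoder."""
    # Embeddings: token + position + segment tables, plus one layer norm.
    emb = vocab * hidden + max_pos * hidden + 2 * hidden + 2 * hidden
    # Self-attention: Q, K, V, and output projections (weights + biases).
    attn = 4 * (hidden * hidden + hidden)
    # Feed-forward: hidden -> ffn_mult*hidden -> hidden (weights + biases).
    ffn = (hidden * ffn_mult * hidden + ffn_mult * hidden
           + ffn_mult * hidden * hidden + hidden)
    # Two layer norms per encoder block (gain + bias each).
    norms = 2 * 2 * hidden
    return emb + layers * (attn + ffn + norms)

# bert-base-chinese-like settings: 12 layers, hidden size 768, ~21k vocab.
base = encoder_params(layers=12, hidden=768, vocab=21128)
# Hypothetical compressed variant: half the depth, half the width.
small = encoder_params(layers=6, hidden=384, vocab=21128)

print(f"base:  {base / 1e6:.1f}M parameters")
print(f"small: {small / 1e6:.1f}M parameters ({small / base:.0%} of base)")
```

Since attention and feed-forward parameters scale quadratically with the hidden size, halving both depth and width cuts the encoder well below half of the original size, which is the general lever any BERT-compression scheme pulls.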
