Syntax Representation in Word Embeddings and Neural Networks – A Survey

10/02/2020
by   Tomasz Limisiewicz, et al.

Neural networks trained on natural language processing tasks capture syntax even though it is not provided as a supervision signal. This indicates that syntactic analysis is essential to the understanding of language in artificial intelligence systems. This overview paper covers approaches to evaluating the amount of syntactic information included in the representations of words for different neural network architectures. We mainly summarize research on English monolingual data for language modeling tasks and on multilingual data for neural machine translation systems and multilingual language models. We describe which pre-trained models and representations of language are best suited for transfer to syntactic tasks.
