Language Embeddings for Typology and Cross-lingual Transfer Learning

06/03/2021
by Dian Yu, et al.

Cross-lingual language tasks typically require a substantial amount of annotated data or parallel translation data. We explore whether language representations that capture relationships among languages can be learned and subsequently leveraged in cross-lingual tasks without the use of parallel data. We generate dense embeddings for 29 languages using a denoising autoencoder, and evaluate the embeddings using the World Atlas of Language Structures (WALS) and two extrinsic tasks in a zero-shot setting: cross-lingual dependency parsing and cross-lingual natural language inference.
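The paper's exact architecture is not described in this abstract, but the core idea of a denoising autoencoder can be sketched: corrupt each language's feature vector (e.g., binary typological features), encode it into a low-dimensional dense embedding, and train the decoder to reconstruct the clean input. The feature dimensionality, embedding size, noise rate, and training details below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

# Toy denoising autoencoder for language embeddings (illustrative sketch).
# Assumptions: 29 languages, 100 binary typological features, 8-dim embeddings,
# 20% input masking noise, plain gradient descent on mean squared error.
rng = np.random.default_rng(0)

n_langs, n_feats, dim = 29, 100, 8
X = rng.integers(0, 2, (n_langs, n_feats)).astype(float)  # toy feature matrix

W1 = rng.normal(0, 0.1, (n_feats, dim))  # encoder weights
W2 = rng.normal(0, 0.1, (dim, n_feats))  # decoder weights
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(500):
    noisy = X * (rng.random(X.shape) > 0.2)   # denoising: mask ~20% of inputs
    H = np.tanh(noisy @ W1)                   # dense bottleneck embedding
    Y = sigmoid(H @ W2)                       # reconstruction of the clean input
    err = Y - X
    grad_out = err * Y * (1 - Y)              # backprop through sigmoid
    dW2 = H.T @ grad_out / n_langs
    dH = grad_out @ W2.T
    dW1 = noisy.T @ (dH * (1 - H**2)) / n_langs  # backprop through tanh
    W2 -= lr * dW2
    W1 -= lr * dW1

# After training, encode the clean inputs to get one dense vector per language.
embeddings = np.tanh(X @ W1)
print(embeddings.shape)  # (29, 8)
```

The resulting rows of `embeddings` could then be compared (e.g., by cosine similarity) to probe whether related languages land near each other, or fed to a downstream cross-lingual model as language features.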
