Sentiment Analysis with Contextual Embeddings and Self-Attention

03/12/2020
by Katarzyna Biesialska, et al.

In natural language, the intended meaning of a word or phrase is often implicit and depends on its context. In this work, we propose a simple yet effective method for sentiment analysis using contextual embeddings and a self-attention mechanism. Experimental results for three languages, including morphologically rich Polish and German, show that our model matches or even outperforms state-of-the-art models. In all cases, the results demonstrate the superiority of models that leverage contextual embeddings. Finally, this work is intended as a step towards a universal, multilingual sentiment classifier.
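As a rough illustration of the general idea, the sketch below pools pre-computed contextual token embeddings with a self-attention layer and feeds the pooled vector to a linear classifier. It is written in PyTorch; the class name, dimensions, and attention-based pooling scheme are illustrative assumptions rather than the authors' exact architecture.

```python
import torch
import torch.nn as nn


class SelfAttentionSentimentClassifier(nn.Module):
    """Minimal sketch (not the paper's exact model): applies self-attention
    over pre-computed contextual token embeddings, pools the sequence into a
    single vector, and classifies the pooled representation."""

    def __init__(self, embed_dim=768, num_heads=8, num_classes=3):
        super().__init__()
        # Self-attention over the sequence of contextual token embeddings.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # Attention-based pooling: a learned projection scores each token.
        self.pool_query = nn.Linear(embed_dim, 1)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, embeddings, padding_mask=None):
        # embeddings: (batch, seq_len, embed_dim), output of a pretrained encoder
        # padding_mask: (batch, seq_len) bool tensor, True at padding positions
        attended, _ = self.attn(embeddings, embeddings, embeddings,
                                key_padding_mask=padding_mask)
        # Token-level scores -> softmax weights -> weighted sum over the sequence.
        scores = self.pool_query(attended).squeeze(-1)      # (batch, seq_len)
        if padding_mask is not None:
            scores = scores.masked_fill(padding_mask, float("-inf"))
        weights = scores.softmax(dim=-1).unsqueeze(-1)      # (batch, seq_len, 1)
        pooled = (weights * attended).sum(dim=1)            # (batch, embed_dim)
        return self.classifier(pooled)                      # (batch, num_classes)


# Usage with dummy embeddings standing in for contextual representations.
model = SelfAttentionSentimentClassifier()
dummy = torch.randn(2, 16, 768)          # 2 sentences, 16 tokens each
logits = model(dummy)
print(logits.shape)                      # torch.Size([2, 3])
```

In practice, the dummy tensor would be replaced by the output of a pretrained multilingual encoder (e.g. a BERT-style model), which supplies the contextual embeddings for each language.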
