Character-Word LSTM Language Models

04/10/2017
by Lyan Verwimp, et al.

We present a Character-Word Long Short-Term Memory Language Model which both reduces the perplexity with respect to a baseline word-level language model and reduces the number of parameters of the model. Character information can reveal structural (dis)similarities between words and can even be used when a word is out-of-vocabulary, thus improving the modeling of infrequent and unknown words. By concatenating word and character embeddings, we achieve up to 2.77% relative improvement in perplexity on English compared to a baseline model with a similar number of parameters, and up to 4.57% compared to models with a larger number of parameters.
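To make the concatenation idea concrete, the sketch below shows one possible way to feed a word embedding concatenated with a few character embeddings into an LSTM language model. It is a minimal illustration, not the authors' implementation: the framework (PyTorch), the class and parameter names (CharWordLSTM, chars_per_word, and so on), the fixed number of characters per word, and all layer sizes are assumptions made for the example.

```python
import torch
import torch.nn as nn

class CharWordLSTM(nn.Module):
    """Minimal character-word LSTM LM sketch: at each time step the input
    is the word embedding concatenated with the embeddings of a fixed
    number of the word's characters (sizes are illustrative only)."""

    def __init__(self, vocab_size, char_vocab_size, word_dim=200,
                 char_dim=25, chars_per_word=4, hidden_dim=400):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.char_emb = nn.Embedding(char_vocab_size, char_dim)
        input_dim = word_dim + chars_per_word * char_dim
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, word_ids, char_ids):
        # word_ids: (batch, seq); char_ids: (batch, seq, chars_per_word)
        w = self.word_emb(word_ids)                       # (B, T, word_dim)
        c = self.char_emb(char_ids).flatten(start_dim=2)  # (B, T, chars_per_word * char_dim)
        x = torch.cat([w, c], dim=-1)                     # concatenated word+char input
        out, _ = self.lstm(x)
        return self.proj(out)                             # next-word logits
```

Because the character embeddings are computed from the word's spelling rather than looked up by word identity, an out-of-vocabulary word can still contribute a non-trivial input vector, which is the intuition behind the improved handling of infrequent and unknown words described above.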
