Modelling Bahdanau Attention using Election methods aided by Q-Learning
Neural Machine Translation has lately received a great deal of "attention" with the advent of increasingly sophisticated and markedly improved models. The attention mechanism has proved a boon in this direction, assigning weights to the input words so that the decoder can readily identify the words most relevant to the current context. However, as attention models have grown more complex, their computational cost has risen and inference has slowed. In this paper, we model the attention network using techniques resonating with social choice theory. Moreover, since the attention mechanism can be framed as a Markov Decision Process, it should, in theory, be representable by reinforcement learning techniques. We therefore propose an election method (k-Borda), fine-tuned using Q-learning, as a replacement for the attention network. Its inference time is lower than that of a standard Bahdanau translator, and the translation quality is comparable. This both experimentally supports the claims above and yields faster inference.
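To make the idea concrete, the sketch below illustrates one way a k-Borda election could stand in for attention weights. It is a minimal illustration under our own assumptions, not the paper's exact formulation: we assume the most recent decoder states act as voters that each rank the encoder states (the candidates) by dot-product similarity, and the k-Borda winners, the k encoder positions with the highest total Borda score, supply the context vector. The helper names `k_borda_scores` and `k_borda_attention` are hypothetical.

```python
import numpy as np

def k_borda_scores(rankings, num_candidates):
    """Total Borda score per candidate, given each voter's ranking (best first)."""
    scores = np.zeros(num_candidates)
    for ranking in rankings:
        # A candidate ranked r-th (0-indexed) earns (m - 1 - r) points.
        for r, c in enumerate(ranking):
            scores[c] += num_candidates - 1 - r
    return scores

def k_borda_attention(voter_states, encoder_states, k=3):
    """Pick the k encoder positions with the highest total Borda score.

    voter_states:   (V, d) states acting as voters (e.g., recent decoder states)
    encoder_states: (T, d) candidate encoder states, one per source token
    """
    T = encoder_states.shape[0]
    sims = voter_states @ encoder_states.T        # (V, T) similarity per voter
    rankings = np.argsort(-sims, axis=1)          # each voter ranks the candidates
    scores = k_borda_scores(rankings, T)
    winners = np.argsort(-scores)[:k]             # the k-Borda committee
    weights = scores[winners] / scores[winners].sum()  # pseudo-attention weights
    return winners, weights

# Usage: build a context vector from the winning encoder states only.
enc = np.random.randn(12, 16)   # 12 source tokens, hidden size 16
dec = np.random.randn(4, 16)    # last 4 decoder states act as voters
idx, w = k_borda_attention(dec, enc, k=3)
context = (w[:, None] * enc[idx]).sum(axis=0)
```

Because the committee has only k members, the context vector is computed from k encoder states rather than all T, which is one plausible source of the inference-time savings the abstract reports; in the paper, the Q-learning component would then fine-tune this selection.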