Layer Flexible Adaptive Computational Time for Recurrent Neural Networks

12/06/2018
by Lida Zhang, et al.

Deep recurrent neural networks offer significant benefits in prediction tasks, but choosing the number of layers remains a universal problem, especially since tasks of different difficulty demand different amounts of computation. We propose a layer-flexible recurrent neural network with adaptive computational time, and extend it to a sequence-to-sequence model with teacher-forcing-based input policies. At each step, the model applies an attention mechanism over all computation rounds to construct a transmission state for each layer individually in the next step. We evaluate the model on the problem of tick price prediction. Experimental results show improved performance and demonstrate the model's ability to dynamically adjust its number of layers.
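
The abstract gives no implementation details, so the following is only a rough PyTorch sketch of the general idea it describes: an RNN cell that ponders a variable number of rounds per input step (ACT-style halting, in the spirit of Graves, 2016) and then attends over all of that step's computation rounds to build the state passed forward. The class name ACTAttentionCell, the GRU cell choice, the halting rule, and the scorer layers are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ACTAttentionCell(nn.Module):
    """Hypothetical sketch: ponder up to `max_rounds` times per input
    step, halt early ACT-style, then attend over all intermediate
    states to form the transmission state for the next step."""

    def __init__(self, input_size, hidden_size, max_rounds=5, eps=0.01):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.halt = nn.Linear(hidden_size, 1)   # halting-probability unit
        self.attn = nn.Linear(hidden_size, 1)   # attention scorer over rounds
        self.max_rounds = max_rounds
        self.eps = eps

    def forward(self, x, h):
        states = []
        halt_sum = torch.zeros(x.size(0), 1, device=x.device)
        for _ in range(self.max_rounds):
            h = self.cell(x, h)                  # one computation round
            states.append(h)
            halt_sum = halt_sum + torch.sigmoid(self.halt(h))
            if bool((halt_sum >= 1.0 - self.eps).all()):
                break                            # whole batch has halted
        # Attention over all computation rounds -> transmission state.
        rounds = torch.stack(states, dim=1)      # (batch, rounds, hidden)
        weights = F.softmax(self.attn(rounds), dim=1)
        return (weights * rounds).sum(dim=1)     # (batch, hidden)


# Usage: one input step; the number of ponder rounds varies with the input.
cell = ACTAttentionCell(input_size=8, hidden_size=16)
h = torch.zeros(4, 16)
h = cell(torch.randn(4, 8), h)
```

In this reading, the attention-weighted combination replaces the simple final-round handoff of a standard RNN, which is one plausible way to realize the per-layer transmission states the abstract mentions.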
