Recurrent Graph Tensor Networks

09/18/2020
by Yao Lei Xu, et al.
Recurrent Neural Networks (RNNs) are among the most successful machine learning models for sequence modelling. In this paper, we show that the modelling of hidden states in RNNs can be approximated through a multi-linear graph filter, which describes the directional flow of temporal information. The derived multi-linear graph filter is then generalized in tensor network form to improve its modelling power, resulting in a novel Recurrent Graph Tensor Network (RGTN). To validate the expressive power of the derived network, several variants of the RGTN model were proposed and applied to the task of time-series forecasting, demonstrating superior properties in terms of convergence, performance, and complexity. Specifically, by leveraging the multi-modal nature of tensor networks, the RGTN models were able to out-perform a simple RNN while using 45 times fewer parameters. Therefore, by combining the expressive power of tensor networks with a suitable graph filter, we show that the proposed RGTN can out-perform a classical RNN at a drastically lower parameter complexity, especially in the multi-modal setting.
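To illustrate the core idea, below is a minimal numpy sketch of a multi-linear graph filter on a directed temporal line graph, where each time step feeds into its successor, so that filtering aggregates past samples in the direction of temporal flow, much like the propagation of RNN hidden states. The geometric `decay` weighting and the `depth` truncation are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def temporal_graph_filter(x, depth=3, decay=0.5):
    """Sketch of a multi-linear graph filter on a directed temporal graph.

    x:     (T, d) time series
    depth: number of past steps aggregated (assumed truncation)
    decay: assumed geometric weighting of past information
    """
    T = x.shape[0]
    # Directed adjacency of the temporal line graph: edge (t-1) -> t,
    # i.e. ones on the subdiagonal, encoding the flow of time.
    A = np.eye(T, k=-1)
    # Graph filter P = sum_k decay^k * A^k: the k-th power of A shifts
    # the sequence by k steps, so P mixes each sample with its past.
    P = sum((decay ** k) * np.linalg.matrix_power(A, k)
            for k in range(depth + 1))
    # Filtered sequence, analogous to unrolled linearized hidden states.
    return P @ x

# Example: filter a random 3-channel series of length 8.
x = np.random.randn(8, 3)
h = temporal_graph_filter(x)
print(h.shape)  # (8, 3)
```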
