Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections

06/12/2020
by Csaba Tóth, et al.

Sequential data such as time series, video, or text can be challenging to analyse as the ordered structure gives rise to complex dependencies. At the heart of this is non-commutativity, in the sense that reordering the elements of a sequence can completely change its meaning. We use a classical mathematical object – the tensor algebra – to capture such dependencies. To address the innate computational complexity of high degree tensors, we use compositions of low-rank tensor projections. This yields modular and scalable building blocks for neural networks that give state-of-the-art performance on standard benchmarks such as multivariate time series classification and generative models for video.
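To make the idea of low-rank tensor projections concrete, below is a minimal sketch (not the paper's implementation) of one commonly used form of such a functional: for rank-1 direction vectors u_1, …, u_m, it sums the product of projections <u_k, x_{i_k}> over all strictly increasing index tuples i_1 < … < i_m of a sequence (x_1, …, x_T). The function name and shapes are illustrative assumptions; a dynamic program over prefix sums avoids enumerating tuples.

```python
import numpy as np

def low_rank_feature(seq, us):
    """Degree-m low-rank functional of a sequence (illustrative sketch).

    seq: (T, d) array, a length-T sequence of d-dimensional observations.
    us:  list of m direction vectors of shape (d,), one per tensor level.
    Returns sum over i1 < ... < im of prod_k <us[k], seq[ik]>.
    """
    # Project the sequence onto each rank-1 direction: proj[k, t] = <us[k], x_t>.
    proj = np.stack([seq @ u for u in us])
    # acc[t] = contribution of all increasing tuples ending at index t.
    acc = proj[0]
    for k in range(1, len(us)):
        # Shifted prefix sum enforces strictly increasing indices (i < t).
        prefix = np.concatenate([[0.0], np.cumsum(acc)[:-1]])
        acc = proj[k] * prefix
    return acc.sum()
```

Because each level only stores a d-dimensional direction vector and a running prefix sum, the cost is O(T m d) instead of the O(d^m) needed for a full degree-m tensor, which is the efficiency gain the abstract refers to.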
