Randomized Algorithms for Computation of Tucker decomposition and Higher Order SVD (HOSVD)

01/20/2020
by Salman Ahmadi Asl, et al.

Big data analysis has become a crucial part of new emerging technologies such as the Internet of Things (IoT), cyber-physical analysis, deep learning, anomaly detection, etc. Among many other techniques, dimensionality reduction plays a key role in such analyses and facilitates feature selection and feature extraction. Randomized algorithms are efficient tools for handling big data tensors. They accelerate the decomposition of large-scale data tensors by reducing the computational complexity of deterministic algorithms and by reducing the communication among different levels of the memory hierarchy, which is a major bottleneck in modern computing environments and architectures. In this paper, we review recent advances in randomization for the computation of the Tucker decomposition and the Higher Order SVD (HOSVD). We discuss random projection and sampling approaches, as well as single-pass and multi-pass randomized algorithms, and how they can be utilized in the computation of the Tucker decomposition and the HOSVD. Simulations on real data, including the weight tensors of fully connected layers of the pretrained VGG-16 and VGG-19 deep neural networks and the CIFAR-10 and CIFAR-100 datasets, are provided to compare the performance of some of the presented algorithms.
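To illustrate the random projection idea mentioned in the abstract, the following is a minimal sketch of a randomized truncated HOSVD in NumPy: each mode unfolding is compressed with a Gaussian random projection, a small SVD on the sketch recovers the leading left singular vectors for that mode, and the core tensor is obtained by contracting the data tensor with the factor matrices. The function name `randomized_hosvd`, the oversampling parameter, and the use of a Gaussian sketch are illustrative assumptions and are not taken from the paper itself, which surveys several such algorithms.

```python
import numpy as np

def randomized_hosvd(X, ranks, oversample=10, seed=0):
    """Hypothetical sketch of a random projection-based truncated HOSVD.

    For each mode n, the mode-n unfolding of X is sketched with a Gaussian
    random matrix, and a small SVD on the sketch yields an estimate of the
    leading ranks[n] left singular vectors (the mode-n factor matrix).
    """
    rng = np.random.default_rng(seed)
    factors = []
    for n in range(X.ndim):
        # Mode-n unfolding: mode n indexes the rows, all other modes the columns.
        Xn = np.moveaxis(X, n, 0).reshape(X.shape[n], -1)
        # Gaussian random projection of the unfolding (with a little oversampling).
        Omega = rng.standard_normal((Xn.shape[1], ranks[n] + oversample))
        Q, _ = np.linalg.qr(Xn @ Omega)  # orthonormal basis for the sketched range
        # Small SVD on the projected matrix recovers the leading left singular vectors.
        Ub, _, _ = np.linalg.svd(Q.T @ Xn, full_matrices=False)
        factors.append(Q @ Ub[:, :ranks[n]])
    # Core tensor: contract X with each factor's transpose along its mode.
    core = X
    for n, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, n, 0), axes=1), 0, n)
    return core, factors

# Example: compress a random 3-way tensor to multilinear rank (5, 5, 5).
X = np.random.randn(40, 50, 60)
core, factors = randomized_hosvd(X, ranks=(5, 5, 5))
print(core.shape, [U.shape for U in factors])  # (5, 5, 5) and the three factor shapes
```

This single-pass-per-mode variant only reads each unfolding twice (once for the sketch, once for the small SVD); the paper also covers sampling-based and genuinely single-pass schemes that avoid revisiting the full tensor.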
