Two-sided Matrix Regression

03/08/2023
by Nayel Bettache, et al.

The two-sided matrix regression model Y = A^* X B^* + E aims to predict Y by accounting both for the linear links among the column features of X, via the unknown matrix B^*, and for those among the row features of X, via the matrix A^*. We propose low-rank predictors in this high-dimensional matrix regression model via rank-penalized and nuclear-norm-penalized least squares. Neither criterion is jointly convex; nevertheless, we construct explicit predictors based on the SVD and establish optimal prediction bounds. We give sufficient conditions for a consistent rank selector and also propose a fully data-driven rank-adaptive procedure. Simulation results confirm the good prediction performance and the rank consistency under data-driven, explicit choices of the tuning parameters and the scaling parameter of the noise.
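To make the setup concrete, here is a minimal NumPy sketch (not the authors' code) that simulates the model Y = A^* X B^* + E, forms a rank-r predictor by truncating the SVD of Y in the spirit of the explicit SVD-based predictors described above, and selects the rank with a rank-penalized least-squares criterion. All dimensions, the true ranks, and the penalty scaling lam are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions, true rank, and noise level (assumptions).
n, p, q, m = 50, 30, 30, 50
r_true = 3
sigma = 0.5

# Draw a low-rank A^* and B^*, a design matrix X, and noise E.
A_star = rng.standard_normal((n, r_true)) @ rng.standard_normal((r_true, p))
B_star = rng.standard_normal((q, r_true)) @ rng.standard_normal((r_true, m))
X = rng.standard_normal((p, q))
E = sigma * rng.standard_normal((n, m))

# Two-sided matrix regression model: Y = A^* X B^* + E.
Y = A_star @ X @ B_star + E

def rank_r_predictor(Y, r):
    """Explicit rank-r predictor: keep the top r singular triplets of Y."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

# Rank-penalized least squares: choose the rank minimizing
# ||Y - Yhat_r||_F^2 + lam * r. The scaling of lam below is a
# heuristic assumption, not the paper's data-driven choice.
lam = 2 * sigma**2 * (n + m)
crit = [np.linalg.norm(Y - rank_r_predictor(Y, r))**2 + lam * r
        for r in range(1, min(n, m) + 1)]
r_hat = int(np.argmin(crit)) + 1
Y_hat = rank_r_predictor(Y, r_hat)

signal = A_star @ X @ B_star
print(f"selected rank: {r_hat} (true rank {r_true})")
print(f"relative prediction error: "
      f"{np.linalg.norm(Y_hat - signal) / np.linalg.norm(signal):.3f}")
```

With the settings above, the penalized criterion typically recovers the true rank and yields a small relative prediction error, consistent with the rank-consistency and prediction results reported in the abstract.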
