On Principal Component Regression in a High-Dimensional Error-in-Variables Setting

10/27/2020
by Anish Agarwal, et al.

We analyze the classical method of Principal Component Regression (PCR) in the high-dimensional error-in-variables setting. Here, the observed covariates are not only noisy and partially missing, but their number can also exceed the sample size. Under suitable conditions, we establish that PCR identifies the unique model parameter with minimum ℓ_2-norm, and derive non-asymptotic ℓ_2-rates of convergence that show its consistency. We further provide non-asymptotic out-of-sample prediction guarantees that again establish consistency, even when the unseen data are corrupted. Notably, our results do not require the out-of-sample covariates to follow the same distribution as the in-sample covariates, but only to obey a simple linear algebraic constraint. We conclude with simulations that illustrate our theoretical results.
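To make the setting concrete, below is a minimal Python sketch of PCR with noisy, partially missing covariates: missing entries are zero-filled and rescaled by the observed fraction, the observed matrix is hard-thresholded via a truncated SVD, and the response is regressed on the de-noised covariates, taking the minimum ℓ_2-norm least-squares solution. The function name, the zero-fill-and-rescale convention, the choice of rank k, and the synthetic data are illustrative assumptions, not the paper's exact procedure.

import numpy as np

def pcr_error_in_variables(Z, y, k):
    """Illustrative sketch of PCR with noisy / missing covariates.

    Z : (n, p) observed covariate matrix; missing entries are np.nan.
    y : (n,)  response vector.
    k : number of principal components to retain (assumed known here).
    Returns a minimum l2-norm estimate of the model parameter.
    """
    # Zero-fill missing entries and rescale by the observed fraction,
    # a common convention in the error-in-variables literature
    # (an assumption here; details may differ from the paper).
    mask = ~np.isnan(Z)
    p_hat = max(mask.mean(), 1.0 / Z.size)
    Z_filled = np.where(mask, Z, 0.0) / p_hat

    # Hard singular-value thresholding: keep the top-k components.
    U, s, Vt = np.linalg.svd(Z_filled, full_matrices=False)
    Z_denoised = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # Least squares on the de-noised covariates; the pseudoinverse yields
    # the unique minimum l2-norm solution when p exceeds n.
    return np.linalg.pinv(Z_denoised) @ y

# Hypothetical usage on synthetic low-rank data with measurement noise
# and missing entries.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, r = 100, 500, 5
    X = rng.standard_normal((n, r)) @ rng.standard_normal((r, p))
    beta = np.zeros(p)
    beta[:10] = 1.0
    y = X @ beta + 0.1 * rng.standard_normal(n)
    Z = X + 0.5 * rng.standard_normal((n, p))   # additive measurement noise
    Z[rng.random((n, p)) < 0.2] = np.nan        # 20% entries missing
    print(pcr_error_in_variables(Z, y, k=r)[:12])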
