Penalized angular regression for personalized predictions
Personalization is becoming an important feature in many predictive applications. We introduce a penalized regression method that builds personalization directly into the penalty. Personalized angle (PAN) regression constructs regression coefficients that are specific to the covariate vector for which a prediction is to be made, thus personalizing the regression model itself. This is achieved by penalizing the angles in a hyperspherical parametrization of the regression coefficients. For an orthogonal design matrix, the PAN estimate is shown to be the solution to a low-dimensional eigenvector equation. In simulations with the tuning parameter selected by a parametric bootstrap procedure, PAN regression can outperform ordinary least squares and ridge regression in terms of prediction error. We further prove that combining the PAN penalty with an L_2 penalty yields a method whose mean squared prediction error is asymptotically uniformly smaller than that of ridge regression. Finally, we demonstrate the method in a medical application.
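The exact PAN penalty and its hyperspherical parametrization are defined in the full paper, not in this abstract. The sketch below is only a minimal illustration of the underlying idea: fit a least-squares model while penalizing the angle between the coefficient vector and the covariate vector x0 at which the prediction is wanted, optionally combined with an L_2 (ridge) term. The function name pan_like_fit, the penalty form (squared angle), and the tuning parameters lam and ridge are assumptions made for this example, not the authors' estimator.

```python
# Illustrative angle-penalized least squares (not the authors' exact PAN estimator).
import numpy as np
from scipy.optimize import minimize


def pan_like_fit(X, y, x0, lam=1.0, ridge=0.0):
    """Least squares with a penalty on the angle between beta and x0.

    lam   -- weight on the squared angle penalty (hypothetical tuning parameter)
    ridge -- optional L_2 penalty weight, mimicking the PAN + L_2 combination
    """

    def objective(beta):
        rss = np.sum((y - X @ beta) ** 2)
        # Angle between beta and the prediction-point covariate vector x0.
        denom = np.linalg.norm(beta) * np.linalg.norm(x0)
        cos_angle = np.clip(beta @ x0 / denom, -1.0, 1.0) if denom > 0 else 1.0
        angle = np.arccos(cos_angle)
        return rss + lam * angle**2 + ridge * (beta @ beta)

    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]  # warm start at the OLS fit
    result = minimize(objective, beta_ols, method="Nelder-Mead")
    return result.x


# Usage: coefficients personalized to a new covariate vector x_new.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
beta_true = np.array([2.0, -1.0, 0.5, 0.0, 0.0])
y = X @ beta_true + rng.normal(size=100)
x_new = rng.normal(size=5)

beta_pan = pan_like_fit(X, y, x_new, lam=5.0, ridge=0.1)
print("personalized prediction at x_new:", x_new @ beta_pan)
```

Because the coefficient vector depends on x_new, refitting at each prediction point yields a different, personalized regression model, which is the behavior the abstract describes.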