Lost in translation: On the impact of data coding on penalized regression with interactions

06/10/2018
by   Johannes W. R. Martini, et al.

Penalized regression approaches are standard tools in quantitative genetics. It is known that the fit of an ordinary least squares (OLS) regression is independent of certain transformations of the coding of the predictor variables, and that the standard mixed model ridge regression best linear unbiased prediction (RRBLUP) is affected neither by translations of the variable coding nor by global scaling. However, it has been reported that an extended version of this mixed model, which incorporates interactions by including products of markers as additional predictor variables, is affected by translations of the marker coding. In this work, we identify the cause of this loss of invariance in the general context of penalized regression on polynomials in the predictor variables. We show that in most cases, translating the coding of the predictor variables has an impact on the effect estimates, with an exception when only the coefficients of the monomials of highest total degree are penalized. The invariance of RRBLUP can thus be considered a special case of this setting: a polynomial of total degree 1 in which the fixed effect (total degree 0) is not penalized but all coefficients of monomials of total degree 1 are. The extended RRBLUP, which includes interactions, is not invariant to translations because it penalizes not only the interactions (total degree 2) but also the additive effects (total degree 1). Our observations are not restricted to ridge regression but hold for penalized regression in general, for instance also for the ℓ_1 penalty of the LASSO.
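The invariance distinction can be sketched numerically. The following is a minimal illustration with simulated data, not taken from the paper; the marker matrix X, the translation t, and the helper gen_ridge_fit (a generalized ridge solver that penalizes only a chosen subset of coefficients) are hypothetical names introduced here for the example.

```python
# Minimal sketch (simulated data, hypothetical helpers): compare fitted values
# for an original and a translated marker coding under two penalization schemes.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n, m = 50, 4                                          # individuals, markers
X = rng.integers(0, 3, size=(n, m)).astype(float)     # markers coded 0/1/2
y = rng.normal(size=n)                                # simulated phenotype
t = rng.normal(size=m)                                # arbitrary translation of the coding
Xt = X + t                                            # translated marker coding

def design(M):
    """Columns for intercept (degree 0), additive (degree 1) and pairwise interaction (degree 2) terms."""
    inter = np.column_stack([M[:, i] * M[:, j]
                             for i, j in combinations(range(M.shape[1]), 2)])
    return np.column_stack([np.ones(len(M)), M, inter])

def gen_ridge_fit(Z, y, penalized, lam=1.0):
    """Generalized ridge: minimize ||y - Zb||^2 + lam * sum of b_j^2 over penalized columns."""
    D = np.diag(penalized.astype(float))
    return np.linalg.solve(Z.T @ Z + lam * D, Z.T @ y)

Z, Zt = design(X), design(Xt)
p = Z.shape[1]

# Case 1: penalize additive AND interaction coefficients (extended-RRBLUP analogue).
pen_all = np.r_[0.0, np.ones(p - 1)]                  # intercept unpenalized
fit1  = Z  @ gen_ridge_fit(Z,  y, pen_all)
fit1t = Zt @ gen_ridge_fit(Zt, y, pen_all)
print("degree 1 and 2 penalized:", np.max(np.abs(fit1 - fit1t)))   # typically clearly nonzero

# Case 2: penalize only the monomials of highest total degree (interactions).
pen_top = np.r_[np.zeros(1 + m), np.ones(p - 1 - m)]
fit2  = Z  @ gen_ridge_fit(Z,  y, pen_top)
fit2t = Zt @ gen_ridge_fit(Zt, y, pen_top)
print("degree 2 only penalized:", np.max(np.abs(fit2 - fit2t)))    # ~ 0, i.e. invariant
```

Under these assumptions, the first case yields different fitted values for the two codings, while the second reproduces them up to numerical precision, mirroring the invariance result described in the abstract.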

