Information criteria for sparse methods in causal inference

03/29/2022
by Yoshiyuki Ninomiya, et al.

For propensity score analysis and sparse estimation, we develop an information criterion for determining the regularization parameters needed in variable selection. First, for Gaussian distribution-based causal inference models, we extend Stein's unbiased risk estimation theory, which yields a generalized Cp criterion with almost no weak points in conventional sparse estimation, and derive an inverse-probability-weighted sparse estimation version of the criterion without resorting to asymptotics. Next, for general causal inference models that are not necessarily Gaussian distribution-based, we extend the asymptotic theory on the LASSO for propensity score analysis, with the intention of implementing doubly robust sparse estimation. From this asymptotic theory, an AIC-type information criterion for inverse-probability-weighted sparse estimation is obtained, and a criterion that is itself doubly robust is then derived for doubly robust sparse estimation. Numerical experiments compare the proposed criterion with an existing criterion derived from a formal argument and verify that the proposed criterion is superior in almost all cases, that the difference is not negligible in many cases, and that the variable selection results differ significantly. Real data analysis confirms that the differences in variable selection and estimation between these criteria are indeed large. Finally, generalizations to general sparse estimation using the group LASSO, elastic net, and non-convex regularization are given to show that the proposed criterion is highly extensible.
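To make the setting concrete, the sketch below shows one way a regularization parameter for an inverse-probability-weighted LASSO could be chosen by minimizing a generic AIC-type score. It is only an illustration under simplifying assumptions: the logistic propensity model, the weighted Gaussian log-likelihood, and the use of the number of nonzero coefficients as a degrees-of-freedom proxy are placeholders, not the criterion derived in the paper.

```python
# Illustrative sketch only: IPW-weighted LASSO with the regularization
# parameter chosen by a generic AIC-type score.  The propensity model,
# the weighted Gaussian log-likelihood, and the nonzero-count degrees-of-
# freedom proxy are simplifying assumptions, not the paper's criterion.
import numpy as np
from sklearn.linear_model import Lasso, LogisticRegression

rng = np.random.default_rng(0)
n, p = 500, 20
X = rng.normal(size=(n, p))
ps_true = 1.0 / (1.0 + np.exp(-X[:, 0]))      # true propensity depends on X[:, 0]
t = rng.binomial(1, ps_true)                  # treatment indicator
y = 2.0 * t + X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n)

# Step 1: estimate propensity scores and form inverse-probability weights.
ps_hat = LogisticRegression().fit(X, t).predict_proba(X)[:, 1]
w = np.where(t == 1, 1.0 / ps_hat, 1.0 / (1.0 - ps_hat))

# Step 2: for each candidate lambda, fit a weighted LASSO on (treatment,
# covariates) and score it with weighted RSS plus a complexity penalty.
Xt = np.column_stack([t, X])

def aic_type_score(lam):
    model = Lasso(alpha=lam, max_iter=10000).fit(Xt, y, sample_weight=w)
    resid = y - model.predict(Xt)
    wrss = np.sum(w * resid ** 2)
    df = np.count_nonzero(model.coef_)        # crude df proxy (assumption)
    sigma2 = wrss / np.sum(w)
    return np.sum(w) * np.log(sigma2) + 2 * df

lambdas = np.logspace(-3, 0, 30)
scores = [aic_type_score(lam) for lam in lambdas]
best_lam = lambdas[int(np.argmin(scores))]
print("selected regularization parameter:", best_lam)
```

The same score-minimization loop would carry over to the extensions mentioned in the abstract (group LASSO, elastic net, non-convex regularization) once the weighted estimator and the criterion are replaced accordingly.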
