Model selection-based estimation for generalized additive models using mixtures of g-priors: Towards systematization

01/25/2023
by Gyeonghun Kang, et al.

We consider estimation of generalized additive models using basis expansions with Bayesian model selection. Although Bayesian model selection is an intuitively appealing tool for regression splines, owing to its flexible knot placement and model-averaged function estimates, its use has traditionally been limited to Gaussian additive regression because posterior search of the model space requires a tractable form of the marginal model likelihood. We extend the method to the exponential family of distributions using a Laplace approximation to the likelihood. Although the Laplace approximation yields a closed-form expression of the marginal likelihood for any Gaussian-type prior distribution, there is no broad consensus on the best prior to use for nonparametric regression via model selection. We observe that the classical unit information prior for variable selection may not be suitable for nonparametric regression with basis expansions; instead, our study reveals that mixtures of g-priors are more appropriate. We consider a large family of mixtures of g-priors and examine in detail how the various mixture priors perform in estimating generalized additive models. Furthermore, we compare several priors on knots for model selection-based spline approaches to determine the most practically effective scheme. The model selection-based estimators are also compared with other Bayesian approaches to function estimation. Extensive simulation studies demonstrate the validity of the model selection-based approaches. We provide an R package implementing the proposed method.
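To make the Laplace step concrete: for a non-Gaussian likelihood, the marginal likelihood of a model is approximated by evaluating the log posterior at its mode and correcting with the log-determinant of the negative Hessian there. The sketch below illustrates this for logistic regression under a Zellner-style g-prior with a fixed g. It is a minimal illustration only, not the paper's method (which uses mixtures of g-priors and an R package); all names and the toy data are our own assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def log_lik(beta, X, y):
    # Bernoulli (logistic) log-likelihood
    eta = X @ beta
    return np.sum(y * eta - np.log1p(np.exp(eta)))

def log_prior(beta, g, XtX):
    # Zellner-style g-prior: beta ~ N(0, g * (X'X)^{-1}) (illustrative choice)
    p = beta.size
    prec = XtX / g
    _, logdet = np.linalg.slogdet(prec)
    return 0.5 * (logdet - p * np.log(2 * np.pi)) - 0.5 * beta @ prec @ beta

def laplace_log_marginal(X, y, g):
    # MAP estimate of beta under the g-prior
    XtX = X.T @ X
    obj = lambda b: -(log_lik(b, X, y) + log_prior(b, g, XtX))
    p = X.shape[1]
    beta_hat = minimize(obj, np.zeros(p), method="BFGS").x
    # Negative Hessian of the log posterior at the mode:
    # Fisher-type curvature of the logistic likelihood plus the prior precision
    mu = 1.0 / (1.0 + np.exp(-X @ beta_hat))
    W = mu * (1.0 - mu)
    H = X.T @ (W[:, None] * X) + XtX / g
    _, logdetH = np.linalg.slogdet(H)
    # Laplace approximation to the log marginal likelihood
    return -obj(beta_hat) + 0.5 * p * np.log(2 * np.pi) - 0.5 * logdetH

# Toy data (hypothetical): 200 observations, 3 covariates
X = rng.standard_normal((200, 3))
beta_true = np.array([1.0, -0.5, 0.0])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-(X @ beta_true)))).astype(float)

print(laplace_log_marginal(X, y, g=200.0))
```

Because the approximation is closed-form once the mode is found, it can be evaluated for each candidate knot configuration during posterior model search, which is what makes the extension beyond Gaussian regression tractable. A mixture of g-priors would additionally integrate (or sample) over g rather than fixing it.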
