Bayesian Conditional Transformation Models
Recent developments in statistical regression methodology establish flexible relationships between all parameters of the response distribution and the covariates. This shift away from pure mean regression is further intensified by conditional transformation models (CTMs), which infer the entire conditional distribution directly by applying a transformation function that maps the response, conditionally on a set of covariates, to a simple log-concave reference distribution. CTMs thus allow not only the variance, kurtosis and skewness but the complete conditional distribution function to depend on the explanatory variables. In this article, we propose a Bayesian notion of conditional transformation models (BCTMs) for discrete and continuous responses in the presence of random censoring. Rather than relying on simple polynomials, we implement a spline-based parametrization for monotonic effects that are supplemented with smoothness penalties. Furthermore, we benefit from the Bayesian paradigm directly via easily obtainable credible intervals and other quantities without relying on large-sample approximations. A simulation study demonstrates the competitiveness of our approach against its likelihood-based counterpart, most likely transformations (MLTs), and Bayesian additive models of location, scale and shape (BAMLSS). Three applications illustrate the versatility of BCTMs in problems involving real-world data.
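A minimal sketch of the mechanism described above, not the authors' implementation: a monotone spline transformation h(y|x) is pushed through a standard normal reference, giving the conditional CDF F(y|x) = Phi(h(y|x)). The basis construction, the covariate shift term beta*x, and all names below are illustrative assumptions made for this example.

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.stats import norm

def monotone_spline(y, knots, coef_raw, degree=3):
    """Evaluate an increasing spline h(y). Monotonicity is enforced by building
    the basis coefficients as a cumulative sum of positive increments; centring
    the coefficients only shifts h and preserves monotonicity."""
    coef = np.cumsum(np.exp(coef_raw))
    coef -= coef.mean()
    # Clamped knot vector for a degree-`degree` B-spline basis.
    t = np.r_[[knots[0]] * degree, knots, [knots[-1]] * degree]
    return BSpline(t, coef, degree)(y)

# Toy transformation model: h(y | x) = h0(y) - beta * x, so the whole
# conditional distribution F(y | x) = Phi(h(y | x)) shifts with x.
knots = np.linspace(-3.0, 3.0, 7)
coef_raw = np.zeros(9)          # 9 basis functions for 7 knots, degree 3
beta = 0.8

y_grid = np.linspace(-3.0, 3.0, 201)
for x in (0.0, 1.0):
    h = monotone_spline(y_grid, knots, coef_raw) - beta * x
    F = norm.cdf(h)             # conditional CDF on the y grid
    median = y_grid[np.argmin(np.abs(F - 0.5))]
    print(f"x = {x}: approximate conditional median = {median:.2f}")
```

In a Bayesian treatment, the increment parameters of such a monotone spline would receive smoothness priors and be sampled, yielding credible intervals for any functional of F(y|x) without large-sample approximations.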