Principled Bayesian Minimum Divergence Inference
When it is acknowledged that all candidate parameterised statistical models are misspecified relative to the data generating process, the decision maker must concern themselves with the KL-divergence minimising parameter in order to maintain principled statistical practice (Walker, 2013). However, it has long been known that the KL-divergence places a large weight on correctly capturing the tails of the data generating process. As a result, traditional inference can be very non-robust. In this paper we advance recent methodological developments in general Bayesian updating (Bissiri, Holmes and Walker, 2016) to propose a statistically well-principled Bayesian updating of beliefs targeting the minimisation of any statistical divergence. We improve both the motivation and the statistical foundations of existing Bayesian minimum divergence estimation (Hooker and Vidyashankar, 2014; Ghosh and Basu, 2016), for the first time allowing the well-principled Bayesian to target predictions from the model that are close to the data generating process in terms of some alternative divergence measure to the KL-divergence. We argue that defining this divergence measure forms an important, subjective part of any statistical analysis. We illustrate our method using a broad array of divergence measures. We then compare the performance of the different divergence measures for conducting simple inference tasks on both simulated and real data sets, and discuss how our methods might apply to more complicated, high-dimensional models.
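To give a flavour of the general idea, the following is a minimal sketch, not the paper's implementation: a general Bayesian update in the style of Bissiri, Holmes and Walker (2016), where the loss is a density power divergence objective (as in Ghosh and Basu, 2016) for a normal location model with known scale. The grid approximation, the prior, the learning rate `w`, and the tuning parameter `beta` are all illustrative assumptions, not choices made in the paper.

```python
# A minimal sketch (not the paper's implementation): general Bayesian updating
# pi(mu | x) proportional to pi(mu) * exp(-w * loss(mu; x)), with the loss taken
# to be the empirical density power divergence objective for N(mu, sigma^2).
# beta, w, the prior, and the grid are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def dpd_loss(mu, x, sigma=1.0, beta=0.5):
    """Cumulative density power divergence loss for N(mu, sigma^2), known sigma."""
    # Closed-form integral of f^(1+beta) for the normal density.
    integral = 1.0 / ((2 * np.pi) ** (beta / 2) * sigma ** beta * np.sqrt(1 + beta))
    dens = norm.pdf(x, loc=mu, scale=sigma)
    return len(x) * integral - (1 + 1 / beta) * np.sum(dens ** beta)

def general_bayes_posterior(x, mu_grid, w=1.0, beta=0.5):
    """Grid approximation to the divergence-based general Bayesian posterior."""
    log_prior = norm.logpdf(mu_grid, loc=0.0, scale=10.0)  # illustrative prior
    log_post = log_prior - w * np.array([dpd_loss(m, x, beta=beta) for m in mu_grid])
    log_post -= log_post.max()                             # numerical stability
    post = np.exp(log_post)
    return post / (post.sum() * (mu_grid[1] - mu_grid[0]))  # normalise on the grid

# Data with a contaminating component: the density power divergence loss
# downweights the outlying observations relative to a KL/likelihood update.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])
mu_grid = np.linspace(-3.0, 3.0, 601)
post = general_bayes_posterior(x, mu_grid)
print("posterior mean of mu:", np.sum(mu_grid * post) * (mu_grid[1] - mu_grid[0]))
```

Swapping `dpd_loss` for an estimate of another divergence gives the corresponding general Bayesian update; the choice of divergence is the subjective modelling decision the abstract refers to.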