Multi-model mimicry for model selection according to generalised goodness-of-fit criteria

11/21/2019
by Lachlann McArthur, et al.

Selecting between candidate models is at the core of statistical practice. As statistical modelling techniques evolve rapidly, the methods used to select between candidate models must evolve in step. With Akaike's Information Criterion (AIC) and the Bayesian Information Criterion (BIC) not applicable to all sets of candidate models, and likelihood not the only criterion on which one might wish to compare models, a flexible model selection technique called model mimicry has emerged. Using simulation to compare models on generalised goodness-of-fit measures, model mimicry can be applied to any pair of candidate models, so long as (1) each model can be fitted to the observed data, (2) new datasets can be simulated under each model, and (3) a metric exists by which a dataset's goodness-of-fit to a model can be calculated. In this manuscript, a variation of model mimicry for the simultaneous comparison of multiple models (multi-model mimicry) is outlined and placed in its historical context, and its effectiveness is demonstrated on simple examples. Classification techniques for model selection using the output of multi-model mimicry are also explored.
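The three requirements above can be illustrated with a minimal sketch. This is not the authors' implementation: the candidate models (a normal and an exponential distribution), the choice of the Kolmogorov–Smirnov statistic as the goodness-of-fit metric, and all function names are illustrative assumptions. Each candidate supplies the three required operations — fit, simulate, and score — and the mimicry step builds, for each model, the simulated distribution of goodness-of-fit vectors against which the observed vector can later be classified.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical candidate models. Each provides the three operations that
# model mimicry requires: fit to data, simulate new data, and a
# goodness-of-fit metric (here, the Kolmogorov-Smirnov statistic).
models = {
    "normal": {
        "fit": lambda x: stats.norm.fit(x),
        "simulate": lambda p, n: stats.norm.rvs(*p, size=n, random_state=rng),
        "gof": lambda x, p: stats.kstest(x, "norm", args=p).statistic,
    },
    "exponential": {
        "fit": lambda x: stats.expon.fit(x),
        "simulate": lambda p, n: stats.expon.rvs(*p, size=n, random_state=rng),
        "gof": lambda x, p: stats.kstest(x, "expon", args=p).statistic,
    },
}

def gof_vector(data):
    """Fit every candidate model to `data` and return the goodness-of-fit
    of `data` under each fitted model (smaller KS statistic = better fit)."""
    return {name: m["gof"](data, m["fit"](data)) for name, m in models.items()}

def multi_model_mimicry(observed, n_sim=200):
    """For each candidate generating model, fit it to the observed data,
    simulate `n_sim` datasets from the fit, and record the goodness-of-fit
    vector of each simulated dataset under all candidates. The observed
    GoF vector can then be classified against these simulated distributions."""
    n = len(observed)
    sims = {}
    for name, m in models.items():
        params = m["fit"](observed)
        sims[name] = [gof_vector(m["simulate"](params, n)) for _ in range(n_sim)]
    return gof_vector(observed), sims
```

A simple classification rule, in the spirit of the abstract's final sentence, would assign the observed data to whichever generating model's simulated GoF vectors lie closest to the observed vector (e.g. by a nearest-mean or nearest-neighbour rule); more elaborate classifiers can be trained on the simulated vectors directly.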
