Empirical bias-reducing adjustments to estimating functions

01/11/2020
by Ioannis Kosmidis, et al.

We develop a novel, general framework for the asymptotic reduction of the bias of M-estimators from unbiased estimating functions. The framework relies on additive, empirical adjustments to the estimating functions that depend only on the first two derivatives of the contributions to the estimating functions. The new estimation method has markedly broader applicability than previous bias-reduction methods: it applies to models that are only partially specified, and to models whose likelihood is intractable or expensive to compute and for which a surrogate objective is employed instead. The method also lends itself to easy, general implementations for arbitrary models through automatic differentiation. This is in contrast to other popular bias-reduction methods, which require either resampling or the evaluation of expectations of products of log-likelihood derivatives. If M-estimation proceeds by maximizing an objective function, then reduced-bias M-estimation can be achieved by maximizing an appropriately penalized version of that objective. That penalized objective relates closely to information criteria based on the Kullback-Leibler divergence, establishing, for the first time, a strong link between reduction of estimation bias and model selection. The reduced-bias M-estimators are found to have the same asymptotic distribution, and hence the same asymptotic efficiency properties, as the original M-estimators, and we discuss inference and model selection with reduced-bias M-estimates. The properties of reduced-bias M-estimation are illustrated in widely used, important modelling settings of varying complexity.
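As a purely illustrative sketch (not taken from the paper), the snippet below shows how automatic differentiation makes this kind of penalized M-estimation mechanical in practice. It assumes per-observation contributions l_i(theta) to an objective and adds a trace-type penalty built from the summed outer products of the contribution gradients and the summed negative contribution Hessians, in the spirit of the Kullback-Leibler/TIC-style trace term alluded to above. The toy normal location-scale model, the exact form and sign of the penalty, and all function names are assumptions for illustration only, not the paper's formulas.

# Hedged sketch: reduced-bias M-estimation via a penalized objective in JAX.
# The per-observation contribution, the penalty's exact form, and the data
# below are illustrative assumptions, not the paper's specification.
import jax
import jax.numpy as jnp
from jax.scipy.optimize import minimize

def contribution(theta, y):
    # Toy per-observation objective: normal log-likelihood with mean theta[0]
    # and log standard deviation theta[1] (a hypothetical parameterization).
    mu, log_sigma = theta
    return -log_sigma - 0.5 * ((y - mu) / jnp.exp(log_sigma)) ** 2

def penalized_objective(theta, y):
    # First and second derivatives of each contribution via automatic differentiation.
    grads = jax.vmap(jax.grad(contribution), in_axes=(None, 0))(theta, y)
    hessians = jax.vmap(jax.hessian(contribution), in_axes=(None, 0))(theta, y)
    e = grads.T @ grads                      # sum of outer products of contribution gradients
    j = -jnp.sum(hessians, axis=0)           # sum of negative contribution Hessians
    objective = jnp.sum(jax.vmap(contribution, in_axes=(None, 0))(theta, y))
    # Assumed trace-type penalty, reminiscent of the TIC trace term mentioned above.
    penalty = 0.5 * jnp.trace(jnp.linalg.solve(j, e))
    return -(objective + penalty)            # minimize the negative penalized objective

y = jnp.array([0.8, -0.3, 1.2, 0.5, -1.0, 0.1])
theta0 = jnp.zeros(2)
result = minimize(lambda th: penalized_objective(th, y), theta0, method="BFGS")
print(result.x)                              # estimates under the assumptions stated above

Because the penalty is itself an ordinary differentiable function of theta, the same automatic-differentiation machinery that produces the gradients and Hessians also handles its optimization, which is the practical point the abstract makes about easy, general implementations.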
