On minimum Bregman divergence inference

08/16/2020
by Soumik Purkayastha, et al.

In this paper, a new family of minimum divergence estimators based on the Bregman divergence is proposed. The popular density power divergence (DPD) class of estimators is a sub-class of the Bregman divergences. We propose and study a new sub-class of Bregman divergences called the exponentially weighted divergence (EWD). Like the minimum DPD estimator, the minimum EWD estimator is recognised as an M-estimator. This characterisation is useful when discussing the asymptotic behaviour as well as the robustness properties of this class of estimators. The performances of the two classes are compared, both through simulations and through real-life examples. We develop an estimation procedure not only for independent and homogeneous data, but also for non-homogeneous data. General tests of parametric hypotheses based on Bregman divergences are also considered. We establish the asymptotic null distribution of our proposed test statistic and explore its behaviour when applied to real data. The inference procedures generated by the new EWD appear to be competitive with, or better than, the DPD-based procedures.
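The abstract does not give the form of the EWD objective, but the DPD sub-class it generalises is well documented (Basu et al., 1998). As a hedged illustration of how such a minimum divergence estimator is computed in practice, the sketch below fits a normal model by minimising the empirical DPD objective, ∫ f_θ^{1+α} dx − (1 + 1/α) · (1/n) Σ f_θ(X_i)^α, on data with a small fraction of outliers. The tuning value α = 0.5 and the contamination scheme are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    """Empirical minimum-DPD objective for a normal model N(mu, sigma^2).

    For the normal density, the integral term has the closed form
    (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha).
    """
    mu, log_sigma = params
    sigma = np.exp(log_sigma)          # optimise log(sigma) to keep sigma > 0
    integral = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    empirical = np.mean(norm.pdf(x, mu, sigma) ** alpha)
    return integral - (1 + 1 / alpha) * empirical

# Simulated data: 95% from N(0, 1), 5% gross outliers near 10.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])

res = minimize(dpd_objective, x0=[np.median(x), 0.0],
               args=(x, 0.5), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)
```

Because the terms f_θ(X_i)^α downweight observations with small model density, the fitted mean stays near 0 despite the outliers, whereas the sample mean (the MLE) is pulled towards 0.5. The EWD estimators studied in the paper are M-estimators of the same general type, obtained from a different Bregman generating function.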
