Tunable robustness in power-law inference

01/13/2023
by Qianying Lin, et al.

Power-law probability distributions arise often in the social and natural sciences. Statistics have been developed for estimating the exponent parameter as well as for gauging goodness of fit to a power law. Yet, paradoxically, many famous power laws, such as the distributions of wealth and earthquake magnitudes, have not found good statistical support when tested with modern methods. We show that measurement errors such as quantization and noise bias both maximum-likelihood estimators and goodness-of-fit measures. We address this issue using logarithmic binning and the corresponding discrete reference distribution for maximum-likelihood estimators and Kolmogorov-Smirnov statistics. Using simulated errors, we validate that binning attenuates bias in parameter estimates and recalibrates goodness of fit to a power law by removing small errors from consideration. These benefits come at a modest cost in statistical power, which can be compensated for with larger sample sizes. We reanalyse three empirical cases, concerning wealth, earthquake magnitudes and wildfire area, and show that binning reverses the statistical conclusions, bringing the results into line with historical and scientific expectations. Through these cases, we explain how routine measurement errors lead to incorrect conclusions and why more robust methods are needed.
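
To make the approach concrete, here is a minimal sketch (in Python, not the authors' code) of log-binned maximum-likelihood estimation and a binned Kolmogorov-Smirnov statistic for a continuous power law. The bin ratio c, the closed-form geometric-distribution MLE for the bin counts, and all function names are illustrative assumptions; the paper's exact discrete reference distribution may differ.

```python
# Minimal sketch (not the paper's code) of log-binned power-law inference.
# Assumptions: continuous power law p(x) ~ x^(-alpha) for x >= xmin, and
# logarithmic bins [xmin*c^k, xmin*c^(k+1)), under which the bin index
# follows a geometric distribution with q = c^(1 - alpha).
import numpy as np

def sample_power_law(n, alpha, xmin, rng):
    # Inverse-transform sampling from F(x) = 1 - (x/xmin)^(1 - alpha).
    u = rng.random(n)
    return xmin * (1.0 - u) ** (-1.0 / (alpha - 1.0))

def mle_continuous(x, xmin):
    # Standard continuous MLE (Hill estimator): alpha = 1 + n / sum(log(x/xmin)).
    x = x[x >= xmin]
    return 1.0 + x.size / np.sum(np.log(x / xmin))

def mle_log_binned(x, xmin, c=2.0):
    # Bin index k = floor(log_c(x/xmin)); under the power law,
    # P(bin k) = (1 - q) * q^k with q = c^(1 - alpha), so the geometric
    # MLE q = kbar / (1 + kbar) gives alpha = 1 - log(q) / log(c).
    k = np.floor(np.log(x[x >= xmin] / xmin) / np.log(c))
    q = k.mean() / (1.0 + k.mean())
    return 1.0 - np.log(q) / np.log(c)

def ks_log_binned(x, xmin, alpha, c=2.0):
    # KS distance between empirical and fitted CDFs, evaluated only at the
    # bin edges, so discrepancies smaller than a bin are ignored.
    x = np.sort(x[x >= xmin])
    kmax = int(np.ceil(np.log(x[-1] / xmin) / np.log(c)))
    edges = xmin * c ** np.arange(1, kmax + 1)
    f_emp = np.searchsorted(x, edges, side="right") / x.size
    f_fit = 1.0 - (edges / xmin) ** (1.0 - alpha)
    return np.max(np.abs(f_emp - f_fit))

rng = np.random.default_rng(0)
x = sample_power_law(50_000, alpha=2.5, xmin=1.0, rng=rng)
x_quant = np.floor(2.0 * x) / 2.0  # crude stand-in for quantization error

print("continuous MLE, clean data:", mle_continuous(x, 1.0))
print("continuous MLE, quantized :", mle_continuous(x_quant, 1.0))  # biased
a_bin = mle_log_binned(x_quant, 1.0)
print("log-binned MLE, quantized :", a_bin)  # bias attenuated
print("binned KS at fitted alpha :", ks_log_binned(x_quant, 1.0, a_bin))
```

In this toy setup, the 0.5-grid quantization collapses many values near xmin, pulling the continuous MLE away from the true exponent, while the binned estimator is unaffected because the grid never moves a value across a power-of-two bin edge. The binned estimate has higher variance, which illustrates the abstract's point that the robustness costs some statistical power, compensable with larger samples.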
