A Bootstrap-Assisted Self-Normalization Approach to Inference in Cointegrating Regressions

04/04/2022
by Karsten Reichold, et al.

Traditional inference in cointegrating regressions requires tuning parameter choices to estimate a long-run variance parameter. Even when these choices are "optimal", the tests are severely size distorted. We propose a novel self-normalization approach that leads to a nuisance-parameter-free limiting distribution without estimating the long-run variance parameter directly. This makes our self-normalized test tuning-parameter-free and considerably less prone to size distortions, at the cost of only small power losses. Combined with an asymptotically justified vector autoregressive sieve bootstrap to construct critical values, the self-normalization approach yields further improvements in small to medium samples when error serial correlation or regressor endogeneity is strong. We illustrate the usefulness of the bootstrap-assisted self-normalized test in empirical applications by analyzing the validity of the Fisher effect in Germany and the United States.
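
For intuition only, here is a minimal sketch of the self-normalization principle in the simplest setting of a sample mean. It is a generic textbook-style construction, not the authors' statistic for cointegrating regressions: the point is that the long-run variance scales both the numerator and the self-normalizer and therefore cancels in the limit, so no kernel or bandwidth choice is needed.

% Generic self-normalization sketch for testing H0: E[X_t] = mu_0
% (assumed illustrative example, not the paper's cointegration statistic)
\[
  \bar{X}_n = \frac{1}{n}\sum_{t=1}^{n} X_t, \qquad
  V_n = \frac{1}{n^{2}} \sum_{t=1}^{n}
        \Bigl( \sum_{s=1}^{t} \bigl( X_s - \bar{X}_n \bigr) \Bigr)^{2}, \qquad
  G_n = \frac{n \bigl( \bar{X}_n - \mu_0 \bigr)^{2}}{V_n}.
\]
% Under weak dependence, sqrt(n)(X_bar - mu_0) converges to omega*B(1) and
% V_n converges to omega^2 * \int_0^1 (B(r) - r B(1))^2 dr, with B a standard
% Brownian motion and omega^2 the long-run variance. Hence
% G_n => B(1)^2 / \int_0^1 (B(r) - r B(1))^2 dr, a pivotal (nonstandard)
% limit that requires no long-run variance estimate.

Critical values of such nonstandard limits can be tabulated by simulation or, as in the paper, approximated via a bootstrap scheme such as the vector autoregressive sieve bootstrap mentioned in the abstract.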
