Valid Inference Corrected for Outlier Removal
Ordinary least squares (OLS) estimation of a linear regression model is well known to be highly sensitive to outliers. It is common practice to first identify and remove outliers by looking at the data, and then to fit OLS and form confidence intervals and p-values on the remaining data as if it were the originally collected data. We show in this paper that this "detect-and-forget" approach can lead to invalid inference, and we propose a framework that properly accounts for outlier detection and removal to provide valid confidence intervals and hypothesis tests. Our inferential procedures apply to any outlier removal procedure that can be characterized by a set of quadratic constraints on the response vector, and we show that several of the most commonly used outlier detection procedures are of this form. Our methodology is built upon recent advances in selective inference (Taylor & Tibshirani 2015), which focus on inference corrected for variable selection. We conduct simulations to corroborate the theoretical results, and we apply our method to two classic data sets considered in the outlier detection literature to illustrate how our inferential results can differ from the traditional detect-and-forget strategy. A companion R package, outference, implements these new procedures with an interface that matches the functions commonly used for inference with lm in R.
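To make the contrast concrete, below is a minimal R sketch of the "detect-and-forget" workflow the abstract critiques, using only base R (lm, cooks.distance, confint); the data, the planted outlier, and the 4/n Cook's distance cutoff are illustrative conventions, not taken from the paper. The commented-out outference call at the end is a hypothetical rendering of the corrected interface the abstract describes; the exact function name and arguments are assumptions, since the package's signature is not given here.

```r
## Detect-and-forget: remove apparent outliers, then do inference as usual.
## (Illustrative only; cutoff 4/n is a common rule of thumb, not the paper's.)
set.seed(1)
d <- data.frame(x = rnorm(50))
d$y <- 1 + 2 * d$x + rnorm(50)
d$y[1] <- d$y[1] + 10                       # plant one outlier

fit  <- lm(y ~ x, data = d)                 # initial fit on all data
keep <- cooks.distance(fit) <= 4 / nrow(d)  # flag high-influence points
refit <- lm(y ~ x, data = d[keep, ])        # refit on the retained rows
confint(refit)                              # naive intervals: the removal
                                            # step is ignored, which the
                                            # paper shows can be invalid

## Hypothetical sketch of the corrected analysis with the companion package
## (function name and arguments are assumptions based on the abstract):
# library(outference)
# cfit <- outference(y ~ x, data = d, method = "cook", cutoff = 4)
# confint(cfit)                             # selection-corrected intervals
```

Since the abstract says outference matches the inference functions commonly used with lm, generics such as summary() and confint() on the fitted object are the natural points of contact, as sketched above.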