Convergence Rates for Bayesian Estimation and Testing in Monotone Regression
Shape restrictions, such as monotonicity of functions, often arise naturally in statistical modeling. We consider a Bayesian approach to the estimation of a monotone regression function and to testing for monotonicity. We construct a prior distribution using piecewise constant functions. For estimation, a prior imposing monotonicity on the heights of these steps is sensible, but the resulting posterior is harder to analyze theoretically. We therefore consider a “projection-posterior” approach, in which a conjugate normal prior is used and the monotonicity constraint is imposed on posterior samples by a projection map onto the space of monotone functions. We show that the resulting posterior contracts at the optimal rate n^{-1/3} under the L_1-metric and at a nearly optimal rate under the empirical L_p-metrics for 0 < p ≤ 2. The projection-posterior approach is also computationally more convenient. We further construct a Bayesian test for the hypothesis of monotonicity based on the posterior probability of a shrinking neighborhood of the set of monotone functions. We show that the resulting test has a universal consistency property and obtain the separation rate that ensures the power function approaches one.
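To make the projection-posterior mechanics concrete, here is a minimal Python sketch under simplifying assumptions not stated in the abstract: equal-width bins on [0, 1], a known error variance sigma2, independent N(0, tau2) priors on the step heights (so the posterior is conjugate normal), and a weighted L2 (pool-adjacent-violators) projection standing in for the paper's metric projection. The names pava and projection_posterior and the threshold constant in eps_n are hypothetical illustration choices, not the authors' code.

```python
import numpy as np

def pava(y, w):
    """Pool Adjacent Violators: weighted L2 projection onto nondecreasing sequences."""
    means, weights, sizes = [], [], []
    for yi, wi in zip(y, w):
        means.append(float(yi)); weights.append(float(wi)); sizes.append(1)
        while len(means) > 1 and means[-2] > means[-1]:   # merge violating blocks
            m2, w2, s2 = means.pop(), weights.pop(), sizes.pop()
            m1, w1, s1 = means.pop(), weights.pop(), sizes.pop()
            wt = w1 + w2
            means.append((w1 * m1 + w2 * m2) / wt)
            weights.append(wt); sizes.append(s1 + s2)
    return np.repeat(means, sizes)

def projection_posterior(x, y, J, sigma2=1.0, tau2=10.0, ndraws=1000, seed=0):
    """Conjugate normal posterior on step heights; each draw is projected by PAVA."""
    rng = np.random.default_rng(seed)
    bins = np.minimum((np.asarray(x) * J).astype(int), J - 1)  # equal bins on [0, 1]
    n_j = np.bincount(bins, minlength=J)
    s_j = np.bincount(bins, weights=y, minlength=J)
    prec = n_j / sigma2 + 1.0 / tau2              # posterior precision per height
    mean, sd = (s_j / sigma2) / prec, np.sqrt(1.0 / prec)
    raw = rng.normal(mean, sd, size=(ndraws, J))  # unconstrained posterior draws
    w = np.maximum(n_j, 1)                        # avoid zero weights in empty bins
    proj = np.array([pava(d, w) for d in raw])    # projected (monotone) draws
    return raw, proj

# Toy data from a monotone truth; choosing J ~ n^{1/3} matches the n^{-1/3} rate.
rng = np.random.default_rng(1)
n = 300
x = rng.uniform(size=n)
y = x ** 2 + 0.3 * rng.normal(size=n)
raw, proj = projection_posterior(x, y, J=max(2, round(n ** (1 / 3))))
f_hat = proj.mean(axis=0)                         # projected posterior mean heights

# Test sketch: posterior probability of a shrinking neighborhood of monotonicity,
# using the average distance between each raw draw and its monotone projection.
eps_n = 0.5 * n ** (-1 / 3)                       # hypothetical threshold constant
post_prob_near_monotone = (np.abs(raw - proj).mean(axis=1) < eps_n).mean()
```

The computational convenience claimed in the abstract is visible here: sampling is exact from a normal posterior, and the monotonicity constraint is handled entirely by the cheap post hoc projection rather than by a constrained sampler.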