Bayesian Inverse Problems with Heterogeneous Variance

10/15/2019
by Natalia Bochkina, et al.

We consider inverse problems in Hilbert spaces contaminated by Gaussian noise, and use a Bayesian approach to find a regularised smooth solution. We work in the so-called conjugate diagonal setting, where the covariance operators of the noise and of the prior are diagonalisable in the orthogonal bases associated with the forward operator of the inverse problem. Firstly, we derive the minimax rate of convergence in such problems with a known noise covariance operator, showing that in the case of heterogeneous variance the ill-posed inverse problem can become self-regularised in some cases when the eigenvalues of the variance operator decay to zero, achieving the parametric rate of convergence; as far as we are aware, this is a striking novel result that has not been observed before in nonparametric problems. Secondly, for a given prior distribution, we give a general expression for the rate of contraction of the posterior distribution when the noise covariance operator is known and the noise level is small. We also investigate when this contraction rate coincides with the optimal rate in the minimax sense, which is typically used as a benchmark for studying posterior contraction rates. As an example, we apply our results to known variance operators with polynomially decreasing or increasing eigenvalues. We also discuss when plugging in an estimator of the eigenvalues of the noise covariance operator does not affect the rate of contraction of the posterior distribution of the signal. We show that plugging in the maximum marginal likelihood estimator of the prior scaling parameter leads to the optimal posterior contraction rate, adaptively. The effect of the choice of the prior parameters on the contraction in such models is illustrated on simulated data with the Volterra operator.
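To make the conjugate diagonal setting concrete, the following is a minimal simulation sketch of the sequence-space form of such a model, Y_i = kappa_i * theta_i + eps * sigma_i * xi_i with a Gaussian prior theta_i ~ N(0, tau^2 * lambda_i); the coordinate-wise Gaussian posterior follows from standard normal-normal conjugacy. The decay rates chosen for sigma, lambda and the test signal are illustrative assumptions, not values from the paper; kappa uses the known singular values of the Volterra operator on L^2[0,1].

import numpy as np

# Sequence-space model (illustrative):
#   Y_i = kappa_i * theta_i + eps * sigma_i * xi_i,   xi_i ~ N(0, 1)
# Gaussian prior: theta_i ~ N(0, tau^2 * lambda_i)

rng = np.random.default_rng(0)

n = 500                               # number of retained coefficients
i = np.arange(1, n + 1)

kappa = 2.0 / ((2 * i - 1) * np.pi)   # singular values of the Volterra operator
sigma = i ** (-0.5)                   # heterogeneous noise sd's, decaying to zero (assumed)
lam = i ** (-2.0)                     # prior eigenvalues, polynomial decay (assumed)
tau = 1.0                             # prior scaling parameter
eps = 0.01                            # noise level

theta = i ** (-1.5) * np.sin(i)       # illustrative "true" signal coefficients
y = kappa * theta + eps * sigma * rng.standard_normal(n)

# Coordinate-wise conjugate posterior: precision is the sum of the
# likelihood precision kappa_i^2 / (eps * sigma_i)^2 and the prior
# precision 1 / (tau^2 * lambda_i).
post_var = 1.0 / (kappa**2 / (eps * sigma) ** 2 + 1.0 / (tau**2 * lam))
post_mean = post_var * kappa * y / (eps * sigma) ** 2

print(f"posterior-mean MSE: {np.mean((post_mean - theta) ** 2):.3e}")

In this sketch, letting sigma_i decay means the high-frequency coordinates are observed with less noise, which is the mechanism behind the self-regularisation phenomenon described in the abstract; the specific rates at which this yields the parametric rate are derived in the paper.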

