Probabilistic learning inference of boundary value problem with uncertainties based on Kullback-Leibler divergence under implicit constraints

02/10/2022
by Christian Soize

In the first part, we present a mathematical analysis of a general methodology for probabilistic learning inference that estimates a posterior probability model for a stochastic boundary value problem from a prior probability model. The given targets are statistical moments for which the underlying realizations are not available. Under these conditions, the Kullback-Leibler divergence minimum principle is used to estimate the posterior probability measure. A statistical surrogate model of the implicit mapping that represents the constraints is introduced. The MCMC generator and the necessary numerical elements are given to facilitate the implementation of the methodology in a parallel computing framework.

In the second part, an application is presented that illustrates the proposed theory and is also, in itself, a contribution to the three-dimensional stochastic homogenization of heterogeneous linear elastic media in the case of non-separation of the microscale and the macroscale. For the construction of the posterior probability measure by probabilistic learning inference, in addition to the constraints defined by given statistical moments of the random effective elasticity tensor, the second-order moment of the random normalized residue of the stochastic partial differential equation is added as a constraint. This constraint guarantees that the algorithm seeks to bring the statistical moments closer to their targets while keeping the residue small.
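For orientation, the Kullback-Leibler divergence minimum principle under moment constraints admits a classical closed-form solution, which the following sketch assumes carries over here in the usual way (the symbols p_prior, g, b, and lambda are ours, not the paper's). Minimizing KL(p || p_prior) subject to the normalization \int p(x)\,dx = 1 and the moment constraints \int g(x)\,p(x)\,dx = b yields the exponentially tilted posterior density

    p_{\mathrm{post}}(x) = p_{\mathrm{prior}}(x)\, \exp\!\big( -\lambda_0 - \langle \lambda, g(x) \rangle \big),

where \lambda_0 normalizes p_{\mathrm{post}} and the Lagrange multipliers \lambda are adjusted so that the target moments b are satisfied.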
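A minimal Metropolis-Hastings sketch of sampling such a tilted posterior is given below. It illustrates the generic construction only, not the paper's MCMC generator or its parallel implementation; all names (log_p_prior, g, lam, metropolis_hastings) are hypothetical.

import numpy as np

def log_p_post_unnorm(x, log_p_prior, g, lam):
    """Unnormalized log-density of the tilted posterior:
    log p_prior(x) - <lam, g(x)>; lambda_0 cancels in MH ratios."""
    return log_p_prior(x) - lam @ g(x)

def metropolis_hastings(x0, log_p_prior, g, lam, n_steps=10000, step=0.1, rng=None):
    """Random-walk Metropolis chain targeting the tilted posterior."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    logp = log_p_post_unnorm(x, log_p_prior, g, lam)
    samples = np.empty((n_steps, x.size))
    for i in range(n_steps):
        x_prop = x + step * rng.standard_normal(x.size)   # symmetric proposal
        logp_prop = log_p_post_unnorm(x_prop, log_p_prior, g, lam)
        if np.log(rng.uniform()) < logp_prop - logp:      # accept/reject
            x, logp = x_prop, logp_prop
        samples[i] = x
    return samples

# Demo: standard normal prior, second-moment constraint g(x) = sum(x^2),
# with a fixed multiplier (in practice lam is tuned to hit the targets b).
if __name__ == "__main__":
    log_prior = lambda x: -0.5 * np.sum(x**2)
    g = lambda x: np.array([np.sum(x**2)])
    lam = np.array([0.5])
    chain = metropolis_hastings(np.zeros(2), log_prior, g, lam, n_steps=5000)
    print("empirical second moment:", np.mean(np.sum(chain[1000:]**2, axis=1)))

In practice the multipliers lam would be iterated (for example by a stochastic-approximation update on the constraint residuals) until the sampled moments match the targets; the demo fixes lam only to keep the sketch self-contained.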
