Preconditioners for model order reduction by interpolation and random sketching of operators

04/25/2021
by Oleg Balabanov, et al.

The performance of projection-based model order reduction methods for solving parameter-dependent systems of equations depends strongly on the properties of the operator, which can be improved by preconditioning. In this paper we present strategies to construct a parameter-dependent preconditioner by interpolating the operator's inverse. The interpolation is obtained by minimizing a discrepancy between the (preconditioned) operator and the matrix defining the metric of interest. The discrepancy measure is chosen such that its minimization can be performed efficiently online, for each parameter value, by solving a small least-squares problem. Furthermore, we show how to tune the discrepancy measure to improve the quality of the Petrov-Galerkin projection or of residual-based error estimation. This paper also addresses preconditioning for the randomized model order reduction methods from [Balabanov and Nouy 2019, Part I]. Our methodology can be readily used for the efficient and stable solution of ill-conditioned parametric systems and for effective error estimation/certification, without the need to estimate expensive stability constants. The proposed approach entails heavy computations in both the offline and online stages, which we circumvent by random sketching. The norms of high-dimensional matrices and vectors are estimated by the l2-norms of their low-dimensional images, called sketches, obtained through random embeddings. For this we extend the framework from [Balabanov and Nouy 2019, Part I] to random embeddings of operators.
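The online least-squares fit is easy to illustrate on a toy problem. The following minimal NumPy sketch is not the paper's implementation: the affine operator A(mu) = A0 + mu*A1, the dimensions, and the plain two-sided Gaussian embeddings are all illustrative assumptions (the paper develops more refined operator embeddings). It interpolates inverses computed offline at a few sample parameters and, online, fits the interpolation coefficients by minimizing the sketched discrepancy ||P(mu) A(mu) - I||_F:

import numpy as np

rng = np.random.default_rng(0)
n, k, s = 200, 5, 60   # full dimension, interpolation basis size, sketch size (all illustrative)

# Hypothetical parametric operator A(mu) = A0 + mu*A1 (affine form assumed for illustration)
A0 = np.diag(np.logspace(0, 4, n))        # ill-conditioned diagonal part
A1 = rng.standard_normal((n, n)) / n      # small non-symmetric perturbation
A = lambda mu: A0 + mu * A1

# Offline: invert the operator at a few sample parameters -> interpolation basis {P_i}
mus = np.linspace(0.0, 1.0, k)
P = [np.linalg.inv(A(mu)) for mu in mus]

# Two-sided Gaussian embedding of operators: ||W_L M W_R||_F estimates ||M||_F,
# so an n-by-n operator is represented by its small s-by-s sketch W_L @ M @ W_R.
W_L = rng.standard_normal((s, n)) / np.sqrt(s)
W_R = rng.standard_normal((n, s)) / np.sqrt(s)

PL = [W_L @ Pi for Pi in P]               # left sketches of the P_i, precomputed offline
target = (W_L @ W_R).ravel()              # sketch of the identity: W_L @ I @ W_R

def coeffs(mu):
    """Online stage: fit P(mu) = sum_i lam_i P_i by minimizing the sketched
    discrepancy ||P(mu) A(mu) - I||_F, a small k-column least-squares problem."""
    AR = A(mu) @ W_R                      # the only n-dimensional product done online
    G = np.column_stack([(PLi @ AR).ravel() for PLi in PL])   # s^2-by-k matrix
    lam, *_ = np.linalg.lstsq(G, target, rcond=None)
    return lam

mu = 0.37
Pmu = sum(l * Pi for l, Pi in zip(coeffs(mu), P))
print("cond(A(mu))      :", np.linalg.cond(A(mu)))
print("cond(P(mu) A(mu)):", np.linalg.cond(Pmu @ A(mu)))

Note that the small least-squares problem has only k unknowns and s^2 sketched equations, independent of n; when A(mu) admits an affine parametric decomposition, the products with the s columns of W_R can also be precomputed offline, so the online cost is independent of the full dimension.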
