

Section: New Results

Uncertainty Quantification methods

Acceleration of Domain Decomposition Methods for Stochastic Elliptic Equations

João F. Reis, Olivier P. Le Maître, Pietro M. Congedo, Paul Mycek

We propose a Monte Carlo-based method to compute statistics of the solution of a stochastic elliptic equation. Each sampled solution is computed with an iterative solver, and we present a parallel construction of a robust stochastic preconditioner to accelerate the iterative scheme. This preconditioner is built before the sampling, at an offline stage, from a decomposition of the geometric domain. Once constructed, a realisation of the preconditioner is generated for each sample and applied within an iterative method to solve the corresponding deterministic linear system. The approach is not tied to a single iterative method and can be adapted to different iterative techniques. We demonstrate its efficiency with extensive numerical results on two examples. The first is a one-dimensional equation, whose low dimension allows the construction of global operators and, consequently, an extensive analysis of the convergence and stability properties of the proposed approach. The second is an analogous two-dimensional version. We demonstrate the performance of the proposed preconditioner by comparison with deterministic preconditioners based on the median of the coefficient field. An article on this topic is under preparation.
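The baseline idea used in the comparison above, a deterministic preconditioner built from the median of the coefficient field, can be sketched in a few lines. The snippet below is a minimal numpy illustration on a 1D diffusion problem, not the stochastic, domain-decomposition-based preconditioner of the paper itself; the grid, the log-normal field, and all names are illustrative assumptions.

```python
import numpy as np

def diffusion_matrix(kappa, h):
    """Finite-volume discretisation of -(kappa u')' = f on (0,1), Dirichlet BCs.
    kappa holds the n+1 interface values of the coefficient field."""
    n = len(kappa) - 1
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = (kappa[i] + kappa[i + 1]) / h**2
        if i > 0:
            A[i, i - 1] = -kappa[i] / h**2
        if i + 1 < n:
            A[i, i + 1] = -kappa[i + 1] / h**2
    return A

def pcg(A, b, M_solve, tol=1e-10, maxit=1000):
    """Preconditioned conjugate gradient; returns the solution and the iteration count."""
    x = np.zeros_like(b)
    r = b.copy()
    z = M_solve(r)
    p = z.copy()
    rz = r @ z
    for k in range(1, maxit + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k
        z = M_solve(r)
        rz, rz_old = r @ z, rz
        p = z + (rz / rz_old) * p
    return x, maxit

n = 100
h = 1.0 / (n + 1)
xs = np.linspace(0.0, 1.0, n + 1)            # interface coordinates (illustrative grid)
rng = np.random.default_rng(0)

# one realisation of a log-normal coefficient field whose median is kappa = 1
xi = rng.standard_normal(2)
kappa = np.exp(xi[0] * np.sin(2 * np.pi * xs) + xi[1] * np.cos(2 * np.pi * xs))

A = diffusion_matrix(kappa, h)               # sample-dependent operator
M = diffusion_matrix(np.ones(n + 1), h)      # median-field preconditioner
M_inv = np.linalg.inv(M)                     # fine at this size; factorise in practice
b = np.ones(n)

x_plain, it_plain = pcg(A, b, lambda r: r)            # plain CG
x_prec, it_prec = pcg(A, b, lambda r: M_inv @ r)      # median-preconditioned CG
```

Because the median field is fixed, M can be factorised once offline and reused for every sample; the preconditioned solve then converges in far fewer iterations than plain CG on this problem.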

Clustering based design of experiments for Systems of Gaussian processes

F. Sanson, O.P. Le Maitre, P.M. Congedo

Multi-physics problems in engineering can often be modeled using a System of Solvers (SoS), that is, a set of solvers coupled together. Evaluating an SoS can be computationally expensive, for example in parametric studies, uncertainty quantification, or sensitivity analysis, which typically requires the construction of a global surrogate model of the SoS. One recurrent strategy in the literature consists of building a system of surrogate models, where each solver is approximated by a local surrogate. This approach can be efficient if good training sets can be generated for each surrogate, in particular on the intermediate variables (the outputs of an upstream solver and the inputs of a downstream one), which are a priori unknown. In this work, we propose a novel strategy to construct efficient training sets for the intermediate variables, using clustering-based techniques formulated for a system of Gaussian processes (SoGP). In this way, improved coverage of the intermediate spaces is attained compared to randomly generated training sets. The performance of this approach is assessed on several test cases, showing that the clustering training strategy is systematically more efficient than randomly sampled training points [19].
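The clustering idea above can be sketched with numpy alone: propagate cheap input samples through the upstream stage, cluster the induced intermediate samples, and keep one representative per cluster as the training set for the downstream surrogate. Here a cheap analytic map stands in for the upstream solver, and a plain k-means replaces whatever clustering technique the SoGP formulation actually uses; every name is illustrative.

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means (Lloyd's algorithm), returning the cluster centres."""
    rng = np.random.default_rng(seed)
    centres = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest centre
        labels = np.argmin(((points[:, None] - centres[None]) ** 2).sum(-1), axis=1)
        # move each centre to the mean of its members (keep it if the cluster is empty)
        centres = np.array([points[labels == j].mean(axis=0)
                            if np.any(labels == j) else centres[j]
                            for j in range(k)])
    return centres

rng = np.random.default_rng(1)
# stand-in for the upstream solver: maps inputs to intermediate variables
upstream = lambda x: np.column_stack([np.sin(3 * x[:, 0]), x[:, 0] * x[:, 1]])

x = rng.uniform(-1, 1, size=(2000, 2))       # cheap input samples
y = upstream(x)                              # induced intermediate samples
centres = kmeans(y, k=10)                    # cluster the intermediate space

# training set: the actual intermediate samples nearest to each centre,
# so the downstream surrogate is trained where the intermediate data really lives
idx = np.argmin(((y[:, None] - centres[None]) ** 2).sum(-1), axis=0)
train_set = y[idx]
```

The point of clustering in the intermediate space, rather than sampling it uniformly, is that the reachable region of the intermediate variables is unknown in advance; the clusters adapt to wherever the upstream solver actually sends the inputs.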

Extension of AK-MCS for the efficient computation of very small failure probabilities

N. Razaaly, P.M. Congedo

We consider the problem of estimating a failure probability p_f, defined as the volume of the excursion set of a complex scalar performance function J (e.g. the output of an expensive-to-run finite element model) below a given threshold, under a probability measure that can be recast as a multivariate standard Gaussian law through an isoprobabilistic transformation. We propose a method able to deal with multiple failure regions, possibly very small failure probabilities p_f (say 10^-6 to 10^-9), and a limited number of evaluations of J. The present work extends the popular Kriging-based active learning algorithm known as AK-MCS to very low failure probabilities. The key idea consists in replacing the Monte Carlo sampling, used in the original formulation to propose candidates and evaluate the failure probability, by a centered isotropic Gaussian sampling in the standard space, whose standard deviation is iteratively tuned. This extreme AK-MCS (eAK-MCS) inherits the multi-point enrichment algorithm of AK-MCS, which allows several points to be added at each iteration, and provides an estimated failure probability based on the Gaussian nature of the Kriging surrogate. Both the efficiency and the accuracy of the proposed method are showcased through its application to two- to eight-dimensional analytic examples characterized by very low failure probabilities. Numerical experiments conducted with unfavorable initial Designs of Experiments suggest the ability of the proposed method to detect failure domains [30].
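Why a widened isotropic Gaussian can see failure probabilities far below Monte Carlo reach is easy to show numerically: sampling in the standard space from N(0, sigma^2 I) with sigma > 1 and reweighting by the density ratio gives an unbiased estimate of p_f. The sketch below applies this reweighting to a cheap analytic J with a known p_f of about 3.4e-6 (here the true J, where eAK-MCS would use the Kriging surrogate, and with a fixed sigma rather than the iteratively tuned one); beta, sigma, and the sample size are illustrative.

```python
import numpy as np
from math import erfc, sqrt

# toy performance function in the standard Gaussian space (d = 2);
# failure when J(x) < 0, i.e. x1 > beta, so p_f = Phi(-beta) is known
beta, d = 4.5, 2
J = lambda x: beta - x[:, 0]

rng = np.random.default_rng(0)
sigma, n = 2.0, 200_000
x = sigma * rng.standard_normal((n, d))      # centred isotropic proposal, sigma > 1

# importance weights: standard-normal density over the proposal density,
# phi(x) / q(x) = sigma^d * exp(-|x|^2 (1 - 1/sigma^2) / 2)
w = sigma**d * np.exp(-0.5 * (1.0 - 1.0 / sigma**2) * (x**2).sum(axis=1))
pf_hat = np.mean((J(x) < 0) * w)

pf_true = 0.5 * erfc(beta / sqrt(2.0))       # Phi(-4.5), about 3.4e-6
```

A crude Monte Carlo estimate at this level would need on the order of 10^8 samples to see even a handful of failures; the widened proposal places a few percent of its samples in the failure region and the weights restore unbiasedness.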

An Efficient Kriging-Based Extreme Quantile Estimation suitable for expensive performance function

N. Razaaly, P.M. Congedo

We propose a method for the fast estimation of quantiles associated with very small levels of probability, where the scalar performance function J is complex (e.g. the output of an expensive-to-run finite element model), under a probability measure that can be recast as a multivariate standard Gaussian law through an isoprobabilistic transformation. A surrogate-based approach (Gaussian processes) combined with adaptive experimental designs allows the accuracy of the surrogate to be iteratively increased while keeping the overall number of evaluations of J low. Since a direct Monte Carlo simulation, even on the surrogate model, would be too expensive, the key idea consists in using an importance sampling method based on an isotropic centered Gaussian with a large standard deviation, permitting a cheap estimation of the quantiles of the surrogate. Similarly to the strategy presented by Schobi and Sudret (2016), the surrogate is adaptively refined using a parallel infill refinement algorithm suitable for very small failure probabilities. We finally elaborate a multi-quantile selection approach to further exploit high-performance computing architectures. We illustrate the performance of the proposed method on several two- and six-dimensional cases; accurate results are obtained with fewer than 100 evaluations of J. An article on this topic is under preparation.
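The importance sampling step above turns an extreme quantile into a weighted order statistic: sample from a wide isotropic Gaussian, weight by the density ratio, and find the smallest threshold whose weighted exceedance probability drops below the target level. The sketch below does this on a cheap analytic J whose (1 - 10^-5)-quantile is known to be about 6.03 (here the true J, where the method would query the Gaussian process surrogate); sigma, the level p, and the sample size are illustrative.

```python
import numpy as np

# toy performance function in the standard space; J ~ N(0, 2), so the
# (1 - p)-quantile is sqrt(2) * Phi^{-1}(1 - p), about 6.03 for p = 1e-5
J = lambda x: x[:, 0] + x[:, 1]

rng = np.random.default_rng(0)
sigma, n, p, d = 3.0, 200_000, 1e-5, 2
x = sigma * rng.standard_normal((n, d))      # wide centred isotropic proposal

# importance weights: standard-normal density over the proposal density
w = sigma**d * np.exp(-0.5 * (1.0 - 1.0 / sigma**2) * (x**2).sum(axis=1))

# weighted quantile: smallest t such that the weighted estimate of P(J > t) <= p
j = J(x)
order = np.argsort(j)[::-1]                  # sort J values in descending order
tail = np.cumsum(w[order]) / n               # weighted exceedance probability estimate
q_hat = j[order][np.searchsorted(tail, p)]
```

Because the cumulative weights, not the raw sample counts, define the exceedance probability, the estimator resolves levels far beyond 1/n, which is exactly what makes the quantile of the surrogate cheap to evaluate at each refinement step.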