

Section: New Results

Uncertainty Quantification methods for uncertainty propagation

Kriging-sparse Polynomial Dimensional Decomposition surrogate model with adaptive refinement

P.M. Congedo, A. Cortesi, G. El Jannoun

In this work, an algorithm for the construction of a low-cost and accurate metamodel is proposed, having in mind computationally expensive applications. It has two main features. First, Universal Kriging is coupled with sparse Polynomial Dimensional Decomposition (PDD) to build a metamodel with improved accuracy: the polynomials selected by the adaptive PDD representation are used as a sparse basis to build a Universal Kriging surrogate model. Second, a numerical method, derived from anisotropic mesh adaptation, is formulated to adaptively insert a fixed number of new training points into an existing Design of Experiments. The convergence of the proposed algorithm is analyzed and assessed on different test functions with an increasing size of the input space. Finally, the algorithm is used to propagate uncertainties in two high-dimensional real problems related to atmospheric reentry.
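As a rough illustration of the first ingredient, the sketch below implements Universal Kriging with a hand-picked sparse polynomial trend in one dimension. The basis {1, x, x^3}, the Gaussian kernel, and its fixed length scale are illustrative assumptions standing in for the terms retained by the adaptive PDD selection; the adaptive point-insertion step is not shown.

# Minimal sketch of Universal Kriging with a sparse polynomial trend basis,
# assuming a 1-D input, a Gaussian kernel with fixed hyperparameters, and a
# hand-picked basis standing in for the adaptively selected PDD polynomials.
import numpy as np

def gaussian_kernel(X1, X2, length=0.3):
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def trend(X):
    # Sparse polynomial basis {1, x, x^3}; a real implementation would use
    # the terms selected by the adaptive PDD representation instead.
    return np.column_stack([np.ones_like(X), X, X ** 3])

def fit_universal_kriging(X, y, nugget=1e-10):
    K = gaussian_kernel(X, X) + nugget * np.eye(len(X))
    L = np.linalg.cholesky(K)
    F = trend(X)
    Kinv_F = np.linalg.solve(L.T, np.linalg.solve(L, F))
    Kinv_y = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # Generalized least-squares estimate of the trend coefficients.
    beta = np.linalg.solve(F.T @ Kinv_F, F.T @ Kinv_y)
    # Kriging weights for the residual (stochastic) part.
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y - F @ beta))
    return beta, alpha

def predict(X_new, X, beta, alpha):
    k = gaussian_kernel(X_new, X)
    return trend(X_new) @ beta + k @ alpha

X = np.linspace(0.0, 1.0, 8)
y = np.sin(4 * X) + X ** 3          # toy stand-in for an expensive model
beta, alpha = fit_universal_kriging(X, y)
print(predict(np.array([0.5]), X, beta, alpha))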

Novel algorithm using Active Metamodel Learning and Importance Sampling: Application to multiple failure regions of low probability

P.M. Congedo, N. Razaaly

The calculation of tail probabilities is of fundamental importance in several domains, such as risk assessment. One major challenge is the computation of low failure probabilities in cases characterized by multiple failure regions, especially when an unbiased estimation of the error is required. Methods developed in the literature rely mostly on the construction of an adaptive surrogate, tackling problems such as the metamodel-building criterion and the global computational cost, at the price of a generally biased estimation of the failure probability. In this work, we propose a novel algorithm suitable for low failure probabilities and multiple failure regions, which both builds an accurate metamodel and provides a statistically consistent error. Indeed, an importance sampling technique is used, which is quasi-optimal since, by exploiting the knowledge of the metamodel, it provides two unbiased estimators of the failure probability. Additionally, a Gaussian mixture-based importance sampling technique is proposed, which drastically reduces the computational cost when estimating reference values, or the failure probability directly from the metamodel. Several numerical examples are carried out, showing the very good performance of the proposed method with respect to the state of the art in terms of accuracy and computational cost. A physical test case, focused on the numerical simulation of non-ideal gas turbine cascades, is also investigated to illustrate the capabilities of the method on an industrial case.
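To fix ideas, here is a minimal sketch of metamodel-based importance sampling on a toy 2-D limit state. The limit-state function, the scikit-learn GP surrogate, and the single-Gaussian importance density are illustrative assumptions; the method described above uses a quasi-optimal density and a Gaussian-mixture variant, and yields two unbiased estimators.

# Minimal sketch of metamodel-based importance sampling for a small failure
# probability, assuming a toy 2-D limit state g(x) <= 0 and a single-Gaussian
# importance density (not the quasi-optimal / Gaussian-mixture densities of
# the actual method).
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def g(x):                           # failure when g(x) <= 0
    return 5.0 - x[:, 0] - x[:, 1]

# Train the metamodel on a small design of experiments.
X_train = rng.normal(size=(40, 2)) * 2.0
gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
gp.fit(X_train, g(X_train))

# Importance density shifted toward the (metamodel-predicted) failure region.
f_X = multivariate_normal(mean=[0.0, 0.0])      # nominal input density
q = multivariate_normal(mean=[2.5, 2.5])        # importance density
samples = q.rvs(size=5000, random_state=1)
w = f_X.pdf(samples) / q.pdf(samples)           # importance weights

# Cheap estimate from the metamodel alone (biased by metamodel error).
print("surrogate IS estimate:", np.mean(w * (gp.predict(samples) <= 0.0)))

# Reference estimate on a subset using the true limit state (unbiased).
subset, w_s = samples[:500], w[:500]
print("reference IS estimate:", np.mean(w_s * (g(subset) <= 0.0)))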

Uncertainty propagation framework for systems of codes

P.M. Congedo, F. Sanson, O. Le Maitre

The simulation of complex multi-physics phenomena often requires the use of coupled solvers, modelling different physics (fluids, structures, chemistry, etc.) with largely differing computational complexities. We call a System of Solvers (SoS) a set of interdependent solvers in which the output of an upstream solver can be the input of a downstream solver. In this work we restrict ourselves to weakly coupled problems. A system of solvers typically encapsulates a large number of uncertain input parameters, challenging classical Uncertainty Quantification (UQ) methods, such as spectral expansions and Gaussian process models, which are affected by the curse of dimensionality. In this work, we develop an original mathematical framework, based on Gaussian Processes (GP), to construct a global metamodel of the uncertain SoS that can be used to solve forward and backward UQ problems. The key idea of the proposed approach is to determine a local GP model for each solver of the SoS. These local GP models are built adaptively to satisfy criteria based on the global output error estimation, which can be decomposed (following an ANOVA-like decomposition) into contributions from the individual GP models. This decomposition enables one to select the local GP models that need to be refined in order to efficiently reduce the global error, using computer experiment design methods or Bayesian optimization. This framework is then applied to a space-object reentry problem.
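As a schematic illustration of the local-GP idea, the sketch below emulates a two-solver chain with one GP per solver and composes them for forward UQ. The toy scalar solvers, the training designs, and the use of raw predictive standard deviations as a refinement indicator are assumptions for illustration, not the ANOVA-based error decomposition of the actual framework.

# Minimal sketch of a weakly coupled two-solver chain emulated by local GP
# models; the toy solvers f1, f2 and the refinement indicator are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def f1(x):                          # upstream solver (toy)
    return np.sin(3.0 * x)

def f2(y):                          # downstream solver, consumes f1's output
    return y ** 2 + 0.1 * y

# One local GP per solver, each trained on its own design of experiments.
x1 = rng.uniform(0, 1, size=(15, 1))
gp1 = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(x1, f1(x1).ravel())
y1 = rng.uniform(-1, 1, size=(15, 1))
gp2 = GaussianProcessRegressor(kernel=RBF(0.5), normalize_y=True).fit(y1, f2(y1).ravel())

# Forward UQ: propagate input uncertainty through the composed metamodel.
x = rng.uniform(0, 1, size=(20000, 1))
z = gp2.predict(gp1.predict(x).reshape(-1, 1))
print("mean, std of SoS output:", z.mean(), z.std())

# Per-solver predictive uncertainty suggests which local GP to refine next
# (a crude stand-in for the global-error decomposition described above).
_, s1 = gp1.predict(x[:100], return_std=True)
_, s2 = gp2.predict(gp1.predict(x[:100]).reshape(-1, 1), return_std=True)
print("mean local GP std (solver 1, solver 2):", s1.mean(), s2.mean())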