

Section: New Results

Uncertainty Quantification and robust design optimization

  • Participants: Andrea Cortesi, Pietro Marco Congedo, Nassim Razaaly, Francois Sanson

  • Corresponding member: Pietro Marco Congedo

Concerning Uncertainty Quantification techniques, we have worked in three main directions. First, we developed novel techniques for building efficient, low-cost surrogate models. In [43], two main contributions are introduced. The first is a technique coupling Universal Kriging with a sparse Polynomial Dimensional Decomposition (PDD) to build a metamodel with improved accuracy: the polynomials selected by the adaptive PDD representation are used as a sparse basis to build a Universal Kriging surrogate model. The second is a strategy, derived from anisotropic mesh adaptation, to adaptively add a fixed number of new training points to an existing Design of Experiments. Moreover, we have explored in [44] how active subspaces can be used to find a low-dimensional dependence structure in the input-to-output map of the forward numerical solver. Surrogate models built on the active variables are then used to accelerate both the forward uncertainty propagation by Monte Carlo sampling and the Markov Chain Monte Carlo sampling of the posterior distribution for Bayesian inversion. The forward and backward methodologies are applied to the simulation of a hypersonic flow around a cylinder, in conditions for which experimental data are available, revealing new insights into the potential exploitation of heat flux data for freestream rebuilding.
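
As a rough illustration of the dimension-reduction step, the Python sketch below identifies an active subspace from gradients of a scalar quantity of interest and propagates Monte Carlo samples through a surrogate defined on the active variables. The function names, the two-dimensional truncation, and the availability of gradients are assumptions made for this sketch; it is not the implementation used in [44].

    import numpy as np

    def active_subspace(gradients, n_active=2):
        # gradients: (n_samples, n_inputs) array of dQoI/dx at sampled inputs
        C = gradients.T @ gradients / gradients.shape[0]    # uncentered gradient covariance
        eigvals, eigvecs = np.linalg.eigh(C)                # eigendecomposition (ascending order)
        order = np.argsort(eigvals)[::-1]                   # reorder by decreasing eigenvalue
        W1 = eigvecs[:, order[:n_active]]                   # basis of the active subspace
        return eigvals[order], W1

    def propagate(surrogate, W1, x_samples):
        # Forward propagation: the forward solver is replaced by a surrogate g(y)
        # defined on the active variables y = W1^T x.
        y = x_samples @ W1
        return surrogate(y)

A sharp drop in the returned eigenvalues indicates how many active variables are worth keeping before building the surrogate.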

The second action has been oriented towards the development of efficient techniques for estimating low failure probabilities. In [50], we have proposed a novel algorithm permitting both to build an accurate metamodel and to provide a statistically consistent error estimate. It relies on a novel metamodel-building strategy which aims to refine all the branches of the limit-state region "equally", even in the case of multiple failure regions, together with a robust stopping criterion for the construction. Additionally, an importance sampling technique is proposed, permitting to drastically reduce the computational cost when estimating reference values, or when a very small failure probability must be estimated directly from the metamodel.
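
The sketch below illustrates, under simplifying assumptions, how a trained metamodel can stand in for the expensive limit-state function inside an importance-sampling estimate of a small failure probability P[g(X) <= 0]. The standard Gaussian nominal density and the auxiliary density simply shifted towards the failure region are choices made for this illustration only; they are not the specific scheme of [50].

    import numpy as np
    from scipy import stats

    def failure_probability_is(metamodel, dim, shift, n_samples=100000, seed=0):
        # Estimate P[g(X) <= 0] with X ~ N(0, I); 'shift' places the auxiliary
        # density near the failure region (assumed known here for simplicity).
        rng = np.random.default_rng(seed)
        nominal = stats.multivariate_normal(mean=np.zeros(dim))
        auxiliary = stats.multivariate_normal(mean=shift)
        x = auxiliary.rvs(size=n_samples, random_state=rng)
        indicator = metamodel(x) <= 0.0                              # failure indicator from the metamodel
        weights = np.exp(nominal.logpdf(x) - auxiliary.logpdf(x))    # likelihood ratios
        contributions = indicator * weights
        p_hat = contributions.mean()
        std_err = contributions.std(ddof=1) / np.sqrt(n_samples)     # Monte Carlo standard error
        return p_hat, std_err

Because every limit-state evaluation is performed on the metamodel, the sample size can be made very large at negligible cost, which is what makes very small probabilities tractable.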

Third, we have worked on the propagation of uncertainties through systems of solvers [25]. A System of Solvers (SoS) is a set of interdependent solvers where an output of an upstream solver can be the input of downstream solvers. In this work, we restrict ourselves to directed SoS with one-way dependencies between solvers. Performing Uncertainty Quantification (UQ) analysis in a SoS is challenging because it typically encapsulates a large number of uncertain input parameters, and classical UQ methods, such as spectral expansions and Gaussian process models, are affected by the curse of dimensionality. In this work, we develop an original mathematical framework, based on Gaussian Process (GP) models, to construct a global surrogate model of the uncertain SoS, which can be used to solve forward and backward UQ problems. The key idea of the proposed approach is to determine a local GP model for each solver constituting the SoS. These local GP models are built adaptively to satisfy criteria based on the global output error estimation. The error estimate can be decomposed into contributions from the individual GP models, enabling one to select which GP models to refine in order to efficiently reduce the global error. The framework is first tested on several analytical problems and subsequently applied to space object reentry simulations.
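
A minimal sketch of the idea, for a chain of two scalar solvers and with a deliberately crude refinement rule based on average predictive standard deviations, could look as follows; the actual error decomposition and adaptive criteria of [25] are more elaborate than this illustration.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def fit_local_gps(solver_1, solver_2, x_train):
        # x_train: (n, d) input samples for the upstream solver
        y1 = np.asarray(solver_1(x_train))                               # (n,) upstream output
        y2 = np.asarray(solver_2(y1))                                    # (n,) downstream output
        gp1 = GaussianProcessRegressor(normalize_y=True).fit(x_train, y1)
        gp2 = GaussianProcessRegressor(normalize_y=True).fit(y1.reshape(-1, 1), y2)
        return gp1, gp2

    def predict_and_select(gp1, gp2, x_test):
        m1, s1 = gp1.predict(x_test, return_std=True)                    # local surrogate of solver_1
        m2, s2 = gp2.predict(m1.reshape(-1, 1), return_std=True)         # composed prediction for the SoS
        # Crude selection rule: refine the local GP whose predictive
        # uncertainty dominates on the test set.
        to_refine = "gp1" if s1.mean() > s2.mean() else "gp2"
        return m2, to_refine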

Concerning optimization under uncertainties, we have worked on the formulation of a novel framework to perform multi-objective optimization [24] when an error affects the objective functions. In many engineering optimization problems, the objective functions are affected by an error arising from the model employed to compute them. For example, in uncertainty-based optimization the objective functions are statistics of a performance of interest, which is uncertain due to the variability of the system input variables; these estimated objectives are affected by an error that can be modeled with a confidence interval. The framework proposed here is general and aims at dealing with any error affecting a given objective function. The strategy is based on the extension of the Bounding-Box concept to the Pareto optima, where the error is represented as an interval (in one-dimensional problems) or a Bounding-Box (in multi-dimensional problems) around the estimated value. This allows the computation of an approximated Pareto front, whose accuracy strongly depends on the acceptable computational cost, as illustrated in the sketch below. The approach is then supplemented by the construction of an evolutive surrogate model of the objective functions, iteratively refined during the optimization process, which ultimately allows a further reduction of the cost of computing the Pareto front by approximating the objective functions at a negligible cost.

Regarding optimization, we have also worked on the formulation of a novel optimization-under-uncertainty framework for the definition of optimal shapes for morphing airfoils, applied here to advancing/retreating 2D airfoils. In particular, the morphing strategy is conceived with the intent of changing the shape at a given frequency to enhance aerodynamic performance. The optimization of morphing airfoils presented here only takes into account the aerodynamic performance. The paper [5] is thus focused on an aerodynamic optimization that sets the optimal shape with respect to performance, where technological aspects are introduced through geometrical constraints.
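
For illustration only, the following sketch encodes one possible reading of the Bounding-Box dominance idea for minimization problems, where each candidate design's objectives are known only within per-objective intervals; the exact dominance definition and front-construction procedure of [24] may differ.

    import numpy as np

    def box_dominates(box_a, box_b):
        # box = (lower_bounds, upper_bounds) per objective (minimization).
        # A dominates B when A's worst case is at least as good as B's best
        # case in every objective, and strictly better in at least one.
        _, hi_a = box_a
        lo_b, _ = box_b
        hi_a, lo_b = np.asarray(hi_a), np.asarray(lo_b)
        return bool(np.all(hi_a <= lo_b) and np.any(hi_a < lo_b))

    def approximate_pareto_front(boxes):
        # boxes: list of (lower, upper) bound pairs, one per candidate design;
        # returns the indices of the designs whose boxes are not dominated.
        return [i for i, b in enumerate(boxes)
                if not any(box_dominates(other, b)
                           for j, other in enumerate(boxes) if j != i)]

Tightening the confidence intervals (i.e. spending more computational effort per design) shrinks the boxes and thins the approximated front towards the true Pareto front.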