Section: Application Domains
Application: uncertainty management
Our theoretical work is motivated by, and finds natural applications to, real-world problems in a general framework usually referred to as uncertainty management, which we now describe.
Over the last few decades, modeling has played an increasing role in the design of complex systems in various industrial fields such as the automotive, aeronautics and energy industries. Industrial design involves several levels of modeling, from behavioural models in preliminary design to finite-element models aiming at an accurate representation of physical phenomena. Nowadays, the fundamental challenge of numerical simulation is to design physical systems while reducing the number of experimentation steps.
For example, at the early design stage in aeronautics, numerical simulation aims at exploring the design parameter space and setting the global variables so that target performances are met. This iterative procedure requires fast multiphysics models. These simplified models are usually calibrated against high-fidelity models or experiments. At each of these levels, modeling requires controlling the uncertainties due to model simplifications, numerical errors, data imprecision, variability of the surrounding conditions, etc.
One dilemma of simulation-based design is that many crucial choices are made very early, when uncertainties are greatest, and that these choices have a decisive impact on the final performance.
Classically, this variability is handled by calibrating the model against experiments and adding fixed margins to the model response. With regard to technical and economic performance, it seems wiser to replace these fixed margins by a rigorous analysis and control of risk. This can be achieved through a probabilistic treatment of uncertainties, which provides decision criteria adapted to the unpredictability inherent in design issues.
The particular case of aircraft design brings out several general aspects of uncertainty management in simulation. Probabilistic decision criteria, which translate decision making into mathematical/probabilistic terms, require the following three steps [50] (a schematic sketch follows the list):
- build a probabilistic description of the fluctuations of the model's parameters (Quantification of uncertainty sources);
- deduce the implications of these probability distributions on the model's response (Propagation of uncertainties);
- determine the specific influence of each uncertainty source on the variability of the model's response (Sensitivity Analysis).
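As a purely illustrative sketch of these three steps (not the method of [50]), the Python fragment below quantifies two hypothetical uncertain inputs, propagates them through a toy model by Monte Carlo sampling, and computes crude first-order sensitivity indices; the input distributions, the function `model` and the sample size are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo sample size (assumption)

# Step 1 -- Quantification of uncertainty sources:
# describe the fluctuations of the model parameters by probability laws
# (hypothetical choices, purely for illustration).
x1 = rng.normal(loc=300.0, scale=5.0, size=n)   # e.g. an uncertain temperature
x2 = rng.uniform(low=0.8, high=1.2, size=n)     # e.g. an uncertain material property

def model(x1, x2):
    """Hypothetical fast model standing in for the simulation code."""
    return x2 * np.sqrt(x1) + 0.1 * x1

# Step 2 -- Propagation of uncertainties:
# push the input distributions through the model and study the response law.
y = model(x1, x2)
print("mean response:", y.mean(), "std:", y.std())
print("95% quantile :", np.quantile(y, 0.95))   # a possible decision criterion

# Step 3 -- Sensitivity Analysis (crude first-order indices):
# estimate the share of the response variance explained by each input,
# via a brute-force conditional-mean estimator (illustration only).
def first_order_index(sample, n_bins=50):
    bins = np.quantile(sample, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(sample, bins) - 1, 0, n_bins - 1)
    cond_means = np.array([y[idx == k].mean() for k in range(n_bins)])
    counts = np.bincount(idx, minlength=n_bins)
    var_cond_mean = np.average((cond_means - y.mean()) ** 2, weights=counts)
    return var_cond_mean / y.var()

print("S1 ~", first_order_index(x1))
print("S2 ~", first_order_index(x2))
```

The 95% quantile printed in the propagation step is one example of a probabilistic decision criterion that could replace a fixed safety margin.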
This analysis now constitutes the standard framework for a general study of uncertainties. It is used in industrial contexts where uncertainties can be represented by random variables (the unknown temperature of an external surface, physical properties of a given material, ... at a given fixed time). However, for numerical models to describe a phenomenon with high fidelity, the relevant uncertainties must generally depend on time or space variables. Consequently, the following issues have to be addressed:
- How can the distribution law of time- (or space-) dependent parameters be captured when no data are directly accessible? The probability distribution of continuous-time (or continuous-space) uncertainty sources must describe the links between variations at neighboring times (or points). Local and global regularity are important parameters of these laws, since they describe how fluctuations at a given time (or point) induce fluctuations at nearby times (or points); a sketch of such a prescribed-regularity model is given after this list. The continuous equations governing the studied phenomena should help to propose models for the laws of the random fields. Note that interactions between the various levels of modeling might also be used to derive probability distributions at the lowest level.
- Navigating between models of different natures calls for a kind of metric that would mathematically capture the notion of granularity or fineness of the models. Local regularity will naturally play a part in such a mathematical definition.
- All levels of design, from preliminary design to high-fidelity modeling, require calibration against experiments in order to reduce model errors. This calibration issue has long been present in this setting, especially in a deterministic optimization context. The probabilistic modeling of uncertainty calls for a systematic approach to calibration. The specific difficulties here are statistical estimation with few data and the estimation of a function of continuous variables from only a discrete set of values; a minimal calibration sketch is also given below.
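Returning to the first point above, the following toy sketch shows one way to prescribe the regularity of a time-dependent uncertainty source: the source is modeled as a fractional Brownian motion whose Hurst exponent H controls how fluctuations at one time propagate to nearby times. The choice of fractional Brownian motion, the time grid and the values of H are assumptions made for illustration, not a model of any specific industrial quantity.

```python
import numpy as np

def fbm(n, hurst, T=1.0, rng=None):
    """Sample a fractional Brownian motion on [0, T] at n+1 equally spaced times,
    via a Cholesky factorisation of its covariance
    Cov(B_s, B_t) = 0.5 * (s^{2H} + t^{2H} - |t - s|^{2H})."""
    rng = np.random.default_rng() if rng is None else rng
    t = np.linspace(0.0, T, n + 1)[1:]               # strictly positive times
    s, u = np.meshgrid(t, t, indexing="ij")
    cov = 0.5 * (s**(2*hurst) + u**(2*hurst) - np.abs(s - u)**(2*hurst))
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))  # small jitter for stability
    path = L @ rng.standard_normal(n)
    return np.concatenate(([0.0], path))             # B_0 = 0

rng = np.random.default_rng(1)
rough  = fbm(500, hurst=0.3, rng=rng)   # low H: very irregular trajectories
smooth = fbm(500, hurst=0.8, rng=rng)   # high H: much smoother trajectories

# Empirical check that H controls how fluctuations at one time induce
# fluctuations at nearby times: increments over a lag d scale like d^H.
for name, path, h in [("rough", rough, 0.3), ("smooth", smooth, 0.8)]:
    d1 = np.std(np.diff(path))
    d4 = np.std(path[4:] - path[:-4])
    print(name, "empirical scaling exponent ~", np.log(d4 / d1) / np.log(4), "(target", h, ")")
```

Lower values of H give rougher trajectories, so prescribing H is a simple way of encoding the local and global regularity discussed above.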
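For the calibration point, a common way to estimate a function of a continuous variable from a few discrete observations, together with a measure of the remaining model error, is Gaussian-process (kriging) regression. The sketch below is a minimal version of this idea; the kernel, its hyper-parameters, the noise level and the toy data are all assumptions chosen for the example.

```python
import numpy as np

# A few hypothetical discrete observations of an unknown continuous response.
x_obs = np.array([0.0, 0.2, 0.5, 0.7, 1.0])
y_obs = np.array([1.1, 1.4, 0.9, 0.6, 1.0])
noise = 1e-2                      # assumed observation-noise variance

def rbf(a, b, length=0.2, var=1.0):
    """Squared-exponential covariance kernel (hyper-parameters are assumptions)."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

# Gaussian-process posterior mean and variance on a fine grid.
x_new = np.linspace(0.0, 1.0, 101)
K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
K_s = rbf(x_new, x_obs)
mean = K_s @ np.linalg.solve(K, y_obs)
cov = rbf(x_new, x_new) - K_s @ np.linalg.solve(K, K_s.T)
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))

# The posterior mean interpolates the few data points; the posterior standard
# deviation quantifies the remaining model error between observations.
print("estimate at x=0.35:", mean[35], "+/-", 2 * std[35])
```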
Moreover, these questions arise in a multi-physics context. The design of complex systems most often sits at the interface between several disciplines. In that case, modeling relies on coupling several models of the various phenomena, and design becomes a multidisciplinary optimization problem. Under uncertainty, the real challenge becomes robust optimization, so as to manage technical and economic risks (risk of not meeting technical specifications, cost control).
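As a toy illustration of what robust optimization can mean here, the sketch below compares a nominal design (minimizing the cost at the nominal operating condition) with a robust design (minimizing the mean plus twice the standard deviation of the cost under input uncertainty). The cost function, the uncertainty model and the risk weight are all hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
xi = rng.normal(loc=0.0, scale=0.3, size=2000)   # assumed environmental uncertainty

def cost(design, xi):
    """Hypothetical coupled-model cost whose minimum shifts with the perturbation xi."""
    return (design - 1.0 - xi) ** 2 + 0.5 * np.abs(xi) * design ** 2

def nominal(design):
    return cost(design, 0.0)                 # ignore uncertainty altogether

def robust(design, k=2.0):
    c = cost(design, xi)                     # propagate uncertainty through the model
    return c.mean() + k * c.std()            # mean + k*std: a simple risk criterion

d_nom = minimize_scalar(nominal, bounds=(0.0, 2.0), method="bounded").x
d_rob = minimize_scalar(robust, bounds=(0.0, 2.0), method="bounded").x
print("nominal design:", d_nom, " robust design:", d_rob)
print("cost std at nominal design:", cost(d_nom, xi).std())
print("cost std at robust design :", cost(d_rob, xi).std())
```

The robust design typically accepts a slightly worse nominal cost in exchange for a smaller spread of the cost, which is precisely the trade-off behind replacing fixed margins by risk control.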
We take part in the uncertainty-management community through several collaborative research projects (ANR and Pôle SYSTEM@TIC), as well as through our involvement in the MASCOT-NUM research group (a GDR of CNRS). In addition, we consider probabilistic models as phenomenological models to cope with uncertainties in the DIGITEO ANIFRAC project. As explained above, we focus on essentially irregular phenomena, for which irregularity is a relevant quantity for capturing variability (e.g. certain biomedical signals, terrain modeling, financial data, etc.). These will be modeled by stochastic processes with prescribed regularity.