2024 Activity Report - Project-Team PLATON
RNSR: 202023682J - Research center: Inria Saclay Centre
- In partnership with: CNRS
- Team name: Uncertainty Quantification in Scientific Computing and Engineering
- In collaboration with: Centre de Mathématiques Appliquées (CMAP)
- Domain: Applied Mathematics, Computation and Simulation
- Theme: Numerical schemes and simulations
Keywords
Computer Science and Digital Science
- A3.4.1. Supervised learning
- A3.4.2. Unsupervised learning
- A3.4.5. Bayesian methods
- A3.4.7. Kernel methods
- A6. Modeling, simulation and control
- A6.1. Methods in mathematical modeling
- A6.1.1. Continuous Modeling (PDE, ODE)
- A6.1.2. Stochastic Modeling
- A6.1.4. Multiscale modeling
- A6.1.5. Multiphysics modeling
- A6.2. Scientific computing, Numerical Analysis & Optimization
- A6.2.1. Numerical analysis of PDE and ODE
- A6.2.4. Statistical methods
- A6.2.6. Optimization
- A6.2.7. High performance computing
- A6.3. Computation-data interaction
- A6.3.1. Inverse problems
- A6.3.2. Data assimilation
- A6.3.3. Data processing
- A6.3.4. Model reduction
- A6.3.5. Uncertainty Quantification
- A6.5.1. Solid mechanics
- A6.5.2. Fluid mechanics
Other Research Topics and Application Domains
- B3. Environment and planet
- B3.3. Geosciences
- B4. Energy
- B4.2. Nuclear Energy Production
- B4.3. Renewable energy production
- B4.3.3. Wind energy
- B4.3.4. Solar Energy
- B5. Industry of the future
- B5.2. Design and manufacturing
- B5.2.1. Road vehicles
- B5.2.2. Railway
- B5.2.3. Aviation
- B5.2.4. Aerospace
- B5.5. Materials
- B5.7. 3D printing
- B5.9. Industrial maintenance
1 Team members, visitors, external collaborators
Research Scientists
- Pietro Marco Congedo [Team leader, INRIA, Senior Researcher]
- Enora Denimal Goy [INRIA, Researcher]
- Olivier Le Maitre [CNRS, Senior Researcher]
PhD Students
- Meryem Benmahdi [DASSAULT SYSTEMES, CIFRE]
- Michele Capriati [INSTITUT VKI, until Jan 2024]
- Erwan Dehillerin [INRIA, from Nov 2024]
- Hugo Dornier [ONERA]
- Marius Duvillard [CEA]
- Sanae Janati Idrissi [CEA]
- Zachary Jones [INRIA]
- Omar Kahol [ECOLE POLY PALAISEAU]
- Hugo Masson [UNIV GUSTAVE EIFFEL]
- Hugo Nicolas [INRIA]
- Christos Papagiannis [CNRS]
Technical Staff
- Hanane Khatouri [INRIA, Engineer]
Interns and Apprentices
- Lin Xi Li [INRIA, Intern, until Jan 2024]
- Vittorio Piro [INRIA, Intern, from May 2024 until Jul 2024]
Administrative Assistant
- Anna Dib [INRIA]
Visiting Scientist
- Marylou Gabrie [ECOLE POLY PALAISEAU]
External Collaborators
- Michele Capriati [INSTITUT VKI, from Feb 2024]
- Vittorio Piro [von Karman Institute for Fluid Dynamics, from Aug 2024]
2 Overall objectives
Computational approaches in science and engineering rely on numerical tools to produce effective, robust, and high fidelity predictions through the simulation of complex physical systems. The design and development of simulation tools encompass numerous aspects, ranging from the initial mathematical formulation of the problem to its actual numerical resolution, including the design of numerical algorithms suited to computational architectures of modern supercomputers, in particular massively parallel machines.
To fully achieve the promise of numerical simulations in science and engineering, it is essential to continuously assess and improve their predictive capabilities. Obvious improvements concern the modeling aspects (higher fidelity) and numerical efficiency (to enable higher resolution). However, as computational capabilities progress, it is becoming more and more evident that accounting for the various uncertainties involved in the simulation process is critical. The reason is that the accurate simulation of a complex system has practical utility if, and only if, one can prescribe the system investigated with sufficient precision. In other words, obtaining high-fidelity predictions of a system different from the one targeted is of limited interest. The problem here is that, except for purely academic situations, specifying precisely all the properties and forcings applied to a complex system is impossible. Whether the precise definition of the system is impossible because of inherent variabilities, lack of knowledge, or imprecise calibration procedures (experimental setups and measurements are inherently inexact), completely eliminating uncertainty sources is not an option. As a result, the simulation should account for these uncertainties and quantify their impact on the predictions (similarly to the characterization of experimental errors) in order to assess the truthfulness of the simulation objectively and enable fully informed decision making. As a matter of fact, reliable numerical predictions require both sophisticated physical models and the systematic, comprehensive treatment of inherent uncertainties, including the calibration and validation procedures. Coarsely, prediction errors result from physical simplifications in the mathematical model, numerical errors arising from the discretization and numerical methods (solvers), and uncertainties in the definition of the model to be solved (input uncertainties).
Uncertainty management procedures are often tailored to the particular problem and application considered. In our experience, it is hard to conceive a systematic a priori approach suitable for all problems. Most often, the UQ analysis consists in the gradual (re)definition and extension of its objectives, which can be somewhat vague initially. It is, therefore, crucial to have a large portfolio of diverse numerical methods in order to quickly propose and apply suitable treatments in response to the evolving understanding and needs as they emerge during the analysis.
The global objective of the research proposed within Platon is to develop advanced numerical methods and practices in simulations, integrating uncertainty management as much as possible. Here, uncertainty management encompasses multiple uncertainty tasks: a) uncertainty characterization (the construction and identification of uncertainty models), b) uncertainty propagation (computation of the model-based prediction uncertainty), c) uncertainty reduction (by inference, data assimilation, design of new experiments, either physical or numerical, ...) and d) uncertainty treatment in decision-making processes (sensitivity analysis, risk management, robust optimization, ...). Note that one should not perceive these different uncertainty tasks as reflecting an ordered sequence of analysis steps. On the contrary, our vision and experience value a strong interaction between all these tasks, which, ideally, must be visited in an order dictated by the initial information, the progress of the analysis, and the resources available.
Progressing on all these tasks constitutes a significant challenge, as they involve a diversity of thematics and skills. This difficulty is prominent in the context of large scale simulations, where practitioners and researchers tend to be highly specialized in specific aspects (modeling, numerical schemes, parallel computing, ...). Further, more massive simulations are often confused with better predictions, and they overshadow the importance of uncertainties. At the same time, high simulation costs usually prevent straightforward uncertainty analyses since, for a fixed budget, one often prefers a simulation at the highest affordable resolution rather than an uncertainty analysis involving possibly less resolved simulations. However, this preference is most often not based on an objective assessment of the situation. In contrast, we believe that using complex models and exploiting fairly the predictions of large scale simulations requires suitable uncertainty management procedures. Further, we are convinced of the importance of a research effort encompassing as much as possible all uncertainty tasks, to ensure the coherence and mutual relevance of the methods developed. Such an effort, focusing on uncertainty management rather than on a particular application, will be critical to improving the predictive capabilities of simulation tools and addressing industrial and societal needs.
Therefore, the main objectives of the team will be:
- Propose new methods and approaches for uncertainty management.
- Develop these methods into numerical tools applicable to large scale simulations.
- Apply and demonstrate the impact of uncertainty management in real applications with industrial and academic partners.
To achieve these objectives, we rely on the expertise and past research of the permanent members, which cover most of the uncertainty tasks (propagation, inference, reduction, optimization, ...), although not yet in a comprehensive way. The development of new predictive simulation tools also relies on collaborations, mainly within the international academic network that we have established over the past 15 years and within the Centre de Mathématiques Appliquées de l'École Polytechnique. The development of useful uncertainty management frameworks applicable to large scale simulations demands constant interaction with end-users (engineers, practitioners, researchers); we rely on our current network of industrial partners and EPICs and extend it progressively.
3 Research program
The Team's approach to research will be bottom-up: starting from new ideas and concepts to address both existing (known) and emerging (anticipated or not) problems. The latter point, concerning emerging problems, is particularly important in a quickly evolving research area with constantly improving methodological and computational capabilities. The research thrust will be structured along two principal directions: a methodological axis and an applications axis.
3.1 UQ methodologies and tools
The Team will continuously work on developing original UQ representations and algorithms to deal with complex and large scale models having high-dimensional input parameters with complex influences. We plan to organize our core research activities along different methodological UQ developments related to the challenges discussed above.
3.1.1 Surrogate modeling for UQ
Challenges. Surrogate models are crucial to enable the solution of both forward and backward UQ problems. Several alternative approaches, such as Polynomial Chaos, Gaussian Processes, and tensor format approximations, have been proposed and developed over the last decades. These approaches have been successfully applied to many different domains. Still, surrogate models for UQ management face many remaining limitations that require significant research work to handle large scale simulation-based studies and account for complex dependencies. These limitations concern multiple aspects, including the complexity related to the dimensionality of the input parameters, the definition of suitable basis representations, the complexity of the surrogate construction, and the control of the surrogate error.
Proposed actions. Platon will pursue long-term efforts in the continuity of previous developments, such as the improvement of advanced sparse grid methods, sparsity-promoting strategies, and low-rank methods. Besides these generic developments, a first research axis will focus on the construction of surrogates for multi-physics problems (fluids, structures, chemistry, ...) simulated by a system of coupled solvers. Classical surrogate methods consider the system of solvers as a single entity, and their construction requires complete simulations, with a high cost as a result. In contrast, we propose a divide-to-simplify strategy, using a surrogate of each constitutive solver, which reduces the input dimensionality of the local models and enables parallel construction and more flexible control of the computational effort. We will have to derive suitable error estimates of the contributions of the individual solvers and procedures to decide the new computer experiments that reduce the error optimally. A second research axis on surrogate models will concern complexity reduction using transformation methods. Transformations can act on the input or output spaces of the model. In the first case, dimensionality reduction is achieved by finding low dimensional subspaces of the input space that convey most of the output variability. Platon will extend these methodologies to incorporate non-linear subspaces and alternative importance measures, in particular to account for the surrogate's final usage (goal-oriented reduction). For the reduction of the output, we will consider generalizations of the preconditioning approach, which transforms the model output into a form admitting a much simpler surrogate and the implicit enforcement of physical constraints. Here, the main challenges will be the automatic selection of the transformation among a dictionary and the design of computer experiments in this context (see below).
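To illustrate the divide-to-simplify idea, the following minimal Python sketch builds one cheap surrogate per constitutive solver on its local inputs and composes them to emulate the coupled chain; the two "solvers", their coupling variable, and the polynomial fits are purely illustrative assumptions, not Platon's implementation.

import numpy as np

# Toy coupled system: solver_a produces a coupling variable u consumed by solver_b.
rng = np.random.default_rng(0)

def solver_a(x1):                 # hypothetical solver A: x1 -> coupling variable u
    return np.sin(3.0 * x1) + 0.5 * x1

def solver_b(x2, u):              # hypothetical solver B: (x2, u) -> quantity of interest
    return np.exp(-x2**2) * u + 0.1 * u**2

# Surrogate of A: 1D polynomial least squares on samples of its local input x1 only.
x1_train = rng.uniform(-1.0, 1.0, 40)
coef_a = np.polyfit(x1_train, solver_a(x1_train), deg=5)
surr_a = lambda x1: np.polyval(coef_a, x1)

# Surrogate of B: 2D polynomial least squares on samples of its local inputs (x2, u) only.
x2_train = rng.uniform(-1.0, 1.0, 80)
u_train = rng.uniform(solver_a(x1_train).min(), solver_a(x1_train).max(), 80)
powers = [(i, j) for i in range(5) for j in range(5 - i)]
design = lambda x2, u: np.column_stack([x2**i * u**j for i, j in powers])
coef_b, *_ = np.linalg.lstsq(design(x2_train, u_train), solver_b(x2_train, u_train), rcond=None)
surr_b = lambda x2, u: design(x2, u) @ coef_b

# Composed surrogate of the coupled chain, checked on fresh samples of the global inputs.
x1_test, x2_test = rng.uniform(-1.0, 1.0, (2, 500))
y_ref = solver_b(x2_test, solver_a(x1_test))
y_sur = surr_b(x2_test, surr_a(x1_test))
print("relative error of the composed surrogate:",
      np.linalg.norm(y_sur - y_ref) / np.linalg.norm(y_ref))

Each local fit only sees its solver's own inputs, so the training samples can be generated independently (and in parallel) for each solver, which is the source of the dimensionality and cost reduction mentioned above.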
3.1.2 Uncertainty model, information theory and inference
Challenges. Uncertainty management in simulation can still be considered to be in its infancy, and the control of the whole process, from the definition of the uncertainty model to the design of new simulations or experiments for uncertainty reduction, still faces multiple challenges. Most past works on UQ have focused on forward propagation and inverse problems when, in contrast, input uncertainty models and uncertainty reduction strategies in general have received much less attention.
Proposed actions. The uncertainty model directly affects the conclusions of UQ analyses (e.g., sensitivity analyses, estimation of failure probabilities, rare events). Therefore, it is crucial to propose uncertainty models that consistently and objectively integrate all available information and expert knowledge. Platon will explore the application of the maximum entropy principle, likelihood maximization, and moment matching methods for the construction of uncertainty models in engineering problems.
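As a generic illustration of the first of these principles (a textbook formulation, with illustrative statistics $\phi_k$ and moments $m_k$, not a problem-specific model), the maximum entropy approach selects, among all densities consistent with the available moment information, the least informative one:
\[
p^\star \;=\; \arg\max_{p}\; -\int p(x)\,\log p(x)\,dx
\quad \text{s.t.} \quad \int p(x)\,dx = 1, \qquad \int \phi_k(x)\,p(x)\,dx = m_k, \quad k=1,\dots,K,
\]
whose solution takes the exponential-family form $p^\star(x) \propto \exp\big(\sum_{k=1}^{K} \lambda_k\,\phi_k(x)\big)$, the Lagrange multipliers $\lambda_k$ being adjusted to satisfy the moment constraints.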
For the inverse problem, the Team will continue its efforts in Bayesian inference toward better treatment of the model error in the calibration procedure.
Concerning uncertainty reduction, a central question is the prediction of the improvement toward the specific objective brought by a new simulation (computer experiment). Platon will investigate different strategies of design of experiment (DoE) based on measures of the improvement, such as entropy reduction, besides the classical reduction of variance.
The DoE in inference consists in proposing new physical experiments to reduce the posterior uncertainty optimally. Optimizing the information gain leads to expensive numerical procedures, and suitable model error and noise models are critical to ensure the robustness of these optimal DoE procedures when applied to real-life data. Platon will work on approximation and reduction methods for optimal DoE to enable applications in large-scale engineering problems, and on extending the optimization to the reduction of uncertainty in general model predictions, not just in the model parameters.
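To make the notion of information gain concrete, here is a minimal Python sketch of a standard nested Monte Carlo estimator of the expected information gain of a candidate design; the forward model g, the Gaussian prior, and the noise level are illustrative assumptions, not one of Platon's applications.

import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(1)
sigma = 0.1                                    # observation noise standard deviation (assumed)

def g(theta, d):                               # hypothetical forward model g(theta; design d)
    return np.exp(-d * theta)

def log_lik(y, theta, d):
    return -0.5 * ((y - g(theta, d)) / sigma) ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))

def expected_information_gain(d, n_outer=2000, n_inner=500):
    theta = rng.normal(1.0, 0.3, n_outer)                   # outer prior samples
    y = g(theta, d) + sigma * rng.standard_normal(n_outer)  # simulated observations
    theta_in = rng.normal(1.0, 0.3, n_inner)                # inner prior samples for the evidence
    ll_inner = log_lik(y[:, None], theta_in[None, :], d)    # shape (n_outer, n_inner)
    log_evidence = logsumexp(ll_inner, axis=1) - np.log(n_inner)
    return np.mean(log_lik(y, theta, d) - log_evidence)

# The optimal design maximizes the expected information gain over the candidates.
for d in (0.1, 0.5, 1.0, 2.0):
    print(f"design d={d:.1f}  EIG estimate = {expected_information_gain(d):.3f}")

The nested structure (an inner evidence estimate for every simulated data set) is precisely what makes optimal DoE expensive and motivates the approximation and reduction methods mentioned above.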
3.1.3 Multi-fidelity, Multi-level and optimization under uncertainty
Challenges. Multi-fidelity and Multi-level (MF&L) methods have been proposed to reduce the cost of surrogate model construction or of statistics estimation, by relying on simulators of different complexity (in the modeled physics, the discretization, or both). Although these methods have proved effective, particularly in the context of expensive simulations, existing algorithms must be adapted to other tasks. MF&L strategies are also missing in Robust Optimization (RO) and Reliability-Based Optimization (RBO), where one has to evaluate the objective accurately, typically some statistics of the model output (moments, quantiles, ...).
Proposed actions. Platon will explore MF&L approaches and the design of computer experiments to obtain the best estimation at the lowest cost (or for a prescribed computational budget) for nontrivial goals, specifically optimization and reliability problems where the accuracy needed is not uniform, is possibly unknown a priori, and has to be estimated as the construction proceeds.
In RO and RBO, our research will focus on the estimation of robustness and reliability measures with tunable fidelity, to adapt the convergence of the statistics to the advancement of the optimization procedure. Platon will include MF&L in the so-called bounding-box approach to track the level of error in the statistical estimates. Another research axis will focus on alternative estimation methods, e.g., quantile Bayesian regression, to include MF&L features.
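For background on the multi-level idea, the following minimal Python sketch implements a plain multilevel Monte Carlo estimator on a toy problem: the level hierarchy (a quadrature refined by powers of two), the random integrand, and the fixed sample allocation are illustrative assumptions only.

import numpy as np

rng = np.random.default_rng(2)

def Q_level(xi, level):
    # level-l approximation of Q(xi) = integral of sin(xi*x) on [0,1], midpoint rule with 2**l cells
    n = 2 ** level
    x = (np.arange(n) + 0.5) / n
    return np.mean(np.sin(np.outer(xi, x)), axis=1)

L = 6
N = [4000, 2000, 1000, 500, 250, 125, 60]        # more samples on the cheap coarse levels
estimate = 0.0
for level in range(L + 1):
    xi = rng.uniform(0.0, 4.0, N[level])          # samples of the random input
    if level == 0:
        Y = Q_level(xi, 0)                        # coarsest level alone
    else:
        Y = Q_level(xi, level) - Q_level(xi, level - 1)   # correction between consecutive levels
    estimate += Y.mean()

# Reference: single-level Monte Carlo at the finest level with many samples.
xi_ref = rng.uniform(0.0, 4.0, 200000)
print("MLMC estimate:", estimate, " fine-level MC reference:", Q_level(xi_ref, L).mean())

In the MF&L setting targeted here, the open questions are precisely how to allocate such samples adaptively when the accuracy requirement is nonuniform and evolves with the optimization or reliability analysis.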
3.1.4 HPC and UQ problems
Challenges. Both intrusive and non-intrusive UQ methods are associated with large computational costs, ranging from several to millions of times the cost of a deterministic solution, depending on the problem and task considered. This situation is a significant obstacle to the deployment of UQ analysis in large scale simulations, and computational aspects have been central for a long time. However, works concerning the exploitation of High-Performance Computing platforms with massive parallelism are still scarce, besides the trivial parallelism of some sampling methods (e.g., Monte Carlo). Further, past efforts have concerned the formulation of the stochastic problem and relied on existing advanced solution methods (e.g., domain decomposition, linear algebra libraries, parallelism). However, few works have fully exploited stochastic structures and HPC aspects to design novel computational strategies dedicated to UQ problems.
Proposed actions. Platon will continue to develop solvers for the resolution of multiple large systems resulting from the discretization of sampled stochastic problems. In particular, we shall focus on linear and non-linear (Newton-like) solvers, exchanging information (Krylov spaces) between successive solves to improve the convergence rate of iterative methods. Besides the extension to non-linear problems, the work will focus on implementation aspects and consider communication strategies when several instances of the random system are solved in parallel.
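The simplest form of information reuse across sampled solves can be illustrated with the minimal Python sketch below, which warm-starts a conjugate gradient solver with the previous sample's solution on a toy 1D diffusion operator; the operator, the coefficient sampling, and the reuse strategy are illustrative assumptions, and the Krylov-subspace exchange targeted by Platon is more elaborate.

import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

rng = np.random.default_rng(3)
n = 400
b = np.ones(n)

def operator(kappa):
    # 1D finite-difference diffusion matrix with a sampled diffusivity kappa (illustrative)
    main = 2.0 * kappa * np.ones(n)
    off = -kappa * np.ones(n - 1)
    return diags([off, main, off], [-1, 0, 1], format="csr")

def solve(A, x0):
    iters = [0]
    x, info = cg(A, b, x0=x0, callback=lambda xk: iters.__setitem__(0, iters[0] + 1))
    return x, iters[0]

x_prev = np.zeros(n)
for sample in range(5):
    kappa = 1.0 + 0.05 * rng.standard_normal()     # sampled random coefficient
    A = operator(kappa)
    _, it_cold = solve(A, np.zeros(n))             # cold start, for comparison
    x_prev, it_warm = solve(A, x_prev)             # warm start with the previous solution
    print(f"sample {sample}: cold-start iterations = {it_cold}, warm-start iterations = {it_warm}")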
Platon will continue to develop specific domain decomposition methods for stochastic problems, and to propose effective stochastic preconditioners exploiting the independence of the local (uncertain) sub-problems. An additional but critical point concerns the association of adaptive mesh refinement (AMR, in space) with multiple resolution analysis (MRA, in the parameters) methods. Few works have solved UQ problems with deterministic AMR, and combining the two adaptive approaches within a parallel framework remains challenging; progress in this direction would enable efficient intrusive solvers for conservation laws.
4 Application domains
In this section, we provide some examples of UQ problems with industrial interests. We believe they are illustrative of how we envision interactions and knowledge transfer with industrial partners. These examples involve industrial and academic partnerships, with active projects and contracts.
4.1 Simulation of space objects
Challenges. The French aerospace industry is facing enormous technological challenges in a highly competitive market. We focus on two relevant problems: the design of the booster of the Ariane 6 launch vehicle and the atmospheric reentry of space vehicles or satellites. The launch vehicle's structure sustains severe mechanical and thermal stresses during the ignition stage, which are challenging to model accurately. Therefore, the design still relies heavily on experimental measurements and safety margins, whereas a better account of model uncertainty would help improve the design procedures. Concerning atmospheric reentry, recent regulations require that the reentry of a human-made, end-of-life space object be accompanied by a rigorous assessment of the risk for human assets. The risk evaluation requires sequences of complex numerical simulations accounting for the multi-physics phenomena occurring during the reentry of a space object, e.g., fluid-structure interactions and heat transfer. Further, these simulations are inaccurate because they rely on overly simplified models (e.g., a reliable model of fragmentation is not available yet) and on partial knowledge of the reentry conditions.
4.2 Predictive simulation of complex flows in nuclear reactors
Challenges. In the nuclear field, a systematic issue is that the calibration and validation of the mathematical model use experimental data measured on devices that are scaled versions of the actual design. One expects the scaled models to exhibit the same physics as the actual design, although the two operate in different conditions. Because of prohibitive computational costs, only parts of the reactor can be simulated with computational fluid dynamics (CFD) models. An open question is then how to accurately estimate the global prediction error associated with the resulting numerical model. The long-term objective in this field is to perform a so-called up-scaling approach, integrating simulations of different parts of the reactor and available experiments on scaled and actual designs, to improve the global predictive capability of the simulation and support decisions regarding new experiments.
4.3 Robust design for renewable energy sources
ORC turbines – Challenges. Organic Rankine Cycles (ORCs) are of key importance in renewable energy systems. The thermodynamic properties of organic fluids present technological advantages for low-grade heat sources, e.g., geothermal, solar, or industrial waste heat. The use of these systems in different physical locations worldwide and with different heat source conditions implies a large variability in the turbine's operating conditions. For this reason, ORC manufacturers are highly interested in evaluating the variability in the system efficiency and, eventually, in the robust design of the turbines. Moreover, the molecular complexity of organic fluids requires sophisticated thermodynamic models. Nevertheless, the scarcity of experimental data makes the calibration of both thermodynamic models and parameters (acentric factor, among other critical properties) difficult, as well as the inference of a suitable turbulence model.
Wind Turbines – Challenges. With the anticipated increase in the number of wind farms in the coming years, ensuring the structural integrity of each turbine while minimizing human intervention to reduce maintenance costs is crucial. A key challenge lies in the detection, localization, and quantification of faults in wind turbines using data collected during operation, along with available numerical models. Although modeling capabilities have improved in recent years, the multi-physics nature of the problem, combined with its structural complexity, has limited prediction accuracy. Many unknown or uncertain properties within the system, particularly related to material characteristics and geometric configurations, contribute to discrepancies between model simulations and actual turbine behavior, making prediction and model calibration difficult.
4.4 Uncertainty and inference in geosciences
Challenges. Uncertainty and inference are crucial in geosciences, where all predictions are affected by lack of knowledge, imprecise calibration, and model error. It is essential to make the best use of the available information and to objectively account for the actual state of knowledge. Besides, depending on the application, experimental observations can be very scarce or highly abundant, and models can be crude or highly sophisticated, such that different methods are needed to adapt to the context. Further, these methods should ideally consider all sources of error (data error, calibration uncertainty, model error, numerical error) globally, to balance them and ensure that resources are properly allocated to improve the prediction. For these reasons, Platon will continue to work on methodologies for applications in geosciences.
4.5 UQ Methods for the Design and Monitoring of Vibrating Structures
Challenges. Uncertainty management is crucial in the design and monitoring of vibrating structures, such as aircraft, turbines, space applications, and nuclear systems. Indeed, all predictions are affected by unknown, varying operating conditions, manufacturing variability, modeling errors, etc. This issue becomes even more critical when nonlinear behaviours are involved. These vibrating systems exhibit complex dynamic behaviours, with wide frequency ranges, mode coupling, and multiple solutions, for which the impact of uncertainties is not yet fully understood or controlled. For these reasons, industry is highly interested in studying dynamic variability due to uncertainties, with the goal of proposing robust designs and reliable calibration indicators. The complexity and computational cost of numerical solvers, combined with the large scale of the models, make it essential to develop efficient, dedicated tools for robust optimization and uncertainty propagation. Future challenges will also focus on integrating multi-scale and multi-physics solvers.
4.6 Robust Topology Optimization for Efficient Additive Manufacturing
Challenges. The design of mechanical structures (in fields such as transportation, energy, aerospace, space exploration, etc.) is undergoing significant changes with the advent of additive manufacturing. The ability to produce complex shapes and geometries with additive manufacturing enables the design of lightweight and efficient structures. With this technology, topology optimization has gained popularity as an effective method for identifying optimal geometries. However, these optimal shapes are highly specialized (in terms of efficiency) and are often very thin with numerous holes, making them inherently sensitive to uncertainties related to manufacturing, environmental factors, and modelling errors. Addressing uncertainties in robust topology optimization is critical and requires advanced numerical tools due to the high computational cost and the problem's high dimensionality.
4.7 Research plan
Most of the actions proposed above are either initiated or planned to start shortly. They are organized and structured around Ph.D. and Post-Doc research activities and will not exceed the duration of the project. Apart from these actions, we will continuously conduct more exploratory research activities to improve, for instance, the treatment of (structural) model errors in uncertainty management, assess the potential application of machine learning algorithms to UQ, and advance toward holistic management of uncertainties.
5 Social and environmental responsibility
5.1 Impact of research results
Pollution reduction in commercial aircrafts
In EASA's 2019 annual report, in-flight icing was identified as a priority 1 issue for large aeroplanes. Therefore, to comply with certification rules, airframe and engine manufacturers must demonstrate safe operation under icing conditions, which leads to significant costs before a new product is put into service. Wind tunnel tests and flight tests in icing conditions are usually required due to the low confidence that certification authorities place in simulations, given the complexity of the icing process.
A breakthrough, leading to a reduction of time-to-market and certification costs, would be obtained by creating a consensus among certification authorities about the reliability of simulation tools for predicting in-flight ice accretion and the operation of ice protection systems (IPS).
TRACES is a European Joint Doctorate network whose main goal is to provide high-level training in the field of in-flight icing, to deliver a new generation of high-achieving Doctoral Researchers (DR) in the diverse disciplines necessary for mastering the complexity of ice accretion and its mitigation in aircraft and aeroengines. In TRACES, Platon is developing novel methods to assess the calibration procedure by detecting potential inaccuracies of the icing model, and to perform uncertainty quantification studies by systematically propagating the posterior distribution of each model parameter.
Renewable energy sources
Platon is involved in the development of advanced numerical tools to simulate Organic Rankine Cycles (ORCs), which are of key importance in renewable energy systems. Specifically, we are working on the inference of thermodynamic model parameters for complex molecular compounds, using experimental data from the world-first facility at Politecnico di Milano. Secondly, we are developing a robust optimization framework for the shape design of ORC turbines.
6 Highlights of the year
- Kick-off of the Matritime project in September 2024.
- New funded projects: ANR JCJC, MEDITWIN, PREMYOM.
- Organization of the Winter of Codes (TRACES EU project), Inria Saclay Centre, December 9-14, 2024.
7 New software, platforms, open data
7.1 New software
7.1.1 Stocholm
- Name: Stocholm
- Keyword: Uncertainty quantification
- Functional Description: Stocholm is a numerical library that allows the team to respond swiftly to potential partners and to draft UQ solutions addressing new questions. It includes Polynomial Chaos construction, manipulation, and algebra; adaptive sparse grid methods for integration, interpolation, and projection in high dimension; stochastic multi-resolution analysis tools with error estimators; advanced regression methods with regularization techniques and Gaussian process modeling; sampling methods with LHS, QMC, and Markov Chain Monte Carlo algorithms; a Bayesian inference framework and fast density estimation methods; Bayesian optimization algorithms with robust and multi-objective strategies; ...
- Release Contributions: We will continue integrating existing tools and new ones into the Stocholm library (C++), the most general one, to allow for maximum interoperability of the constitutive utilities. Having a unique library shared by the whole group also presents some interest for students and new researchers joining the Team, as they can benefit from the others' experience.
- Contact: Olivier Le Maitre
- Partners: CNRS, Ecole Polytechnique
8 New results
8.1 Research axis 1: Uncertainty Quantification and Inference
Participants: P.M. Congedo, O. Le Maître, M. Capriati, M. Duvillard, M. Benmahdi, S. Idrissi, O. Kahol, H. Khatouri.
Project-team positioning
Many research groups around the world and in France are presently working on Uncertainty Quantification (UQ) and inference problems. For instance, the US has created and continues to expand large multi-disciplinary groups to address UQ challenges in the energy and military domains through its national laboratories (Sandia, Oak Ridge, LLNL, ...). These groups aim at providing generic methods and tools (mostly software) for the resolution of UQ problems (for example, the Dakota code from Sandia-Albuquerque) faced by other research groups from diverse application domains. Other countries support smaller initiatives, including the CEA (civil and military) in France. Several large industrial groups, such as Bosch, EADS, or EDF, are also deploying UQ methodologies and tools (for example, the OpenTurns code from EADS/EDF) through dedicated R&D units or services, responding to the demands of other services. These UQ activities have often emerged in well-established groups working in specific application domains (e.g., fluid dynamics, solid mechanics, electromagnetics, chemistry, material sciences, earth sciences, life sciences, ...), in response to some UQ aspects related to these particular domains. We cite G. Iaccarino (Uncertainty Quantification Lab within the Center for Turbulence Research, Stanford University), Y. Marzouk (Aerospace Computational Design Laboratory, MIT), and K. Willcox (Institute for Computational Engineering and Sciences, University of Texas). The situation is globally similar in applied mathematics, where several groups develop advanced UQ methods within a broader research area (e.g., stochastic numerics, statistics, numerical analysis, ...), sometimes with only a distant connection to engineering domains. For example, we can mention the research groups of M. Giles (Oxford), I. Bilionis (Purdue University), J. Garnier (Ecole Polytechnique), and R. Abgrall (University of Zurich).
The objective of Platon is to bring together participants whose main interest is the development of UQ methodologies. While primarily targeting our current applications, our objective is to propose new applications through collaborations and progressive team development, while maintaining UQ as the project's identity. This strategy gives the Team a somewhat unique position within the national and international research landscapes. As far as computational mechanics and engineering are concerned, no group has been created with UQ management as its principal working area.
The identity of Platon is thus to be contrasted with initiatives, including within Inria, which may have a UQ component, but within different methodological contexts and not as a central activity. For instance, some teams (e.g., SIERRA, TAO, SELECT, MODAL) develop statistical methods for data analysis, machine learning, and the treatment of large databases. Overall, the problems targeted in Platon are usually too costly, with high parametric dimension and few experimental data, so existing statistical methods cannot be reused "as is" and require dedicated approaches.
On the application side, there are already Inria teams working on CFD applications, some even incorporating uncertainty quantification and sensitivity analysis activities. We mention here AIRSEA, which focuses on oceanic and atmospheric flows, CARDAMOM on free-surface hydraulics, and ACUMES on unsteady models in traffic flow and biology. In contrast to our project, all these efforts primarily address challenges in their respective application areas.
Scientific achievements
Our research activity features two main axes. The first is related to methodological developments, while the second is oriented toward UQ problems of industrial interest.
The first contribution concerns a computer model calibration technique accounting for model error. Building upon our previous work [11], we formulate a new methodology called the Complete Maximum a Posteriori (CMP) method. The improvement is embodied in two key properties. First and foremost, we have removed the point-mass distribution hypothesis upon which the KOH and FMP methods rely. Examples and calculations show that such a hypothesis is never valid. The reformulated hypothesis is more general and less stringent; for this reason, we expect the method to provide a better approximation of the parameters' posterior distribution in a wider range of cases. The second improvement is the inclusion of the Hessian in the estimated parameters' posterior. In the case where the posterior has a single mode, both KOH and FMP correctly identify this mode but tend to underestimate the weight of the tails, leading to a false-certitude effect. We have also illustrated that the correction introduced by the CMP method becomes larger as the residuals grow, showing that this can lead to an undesired waterbed effect that assigns more mass to the peaks instead of the tails. When the posterior has multiple modes, the correction allows recovering the correct weight of these modes.
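For context, and in generic notation rather than the specific CMP derivation, calibration with model error typically starts from a discrepancy decomposition of the observations,
\[
y_i \;=\; m(x_i,\theta) \;+\; \delta(x_i) \;+\; \epsilon_i ,
\]
where $m$ is the computer model with parameters $\theta$, $\delta$ the model-error (discrepancy) term, and $\epsilon_i$ the measurement noise. The parameter posterior then involves marginalizing the discrepancy, $p(\theta \mid y) \propto p(\theta)\int p(y \mid \theta,\delta)\,p(\delta)\,d\delta$; the KOH, FMP, and CMP methods discussed above differ in the hypotheses used to approximate this marginalization, the CMP method relaxing the point-mass hypothesis and adding a Hessian-based correction.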
The second contribution concerns the construction of globally accurate surrogates for Bayesian inference [8]. We present an iterative data-driven algorithm for the construction of polynomial chaos surrogates whose accuracy is localized in regions of high posterior probability. The method is applied to the identification of the source of an oil spill. Two synthetic oil spill experiments, in which the construction of prior-based surrogates is not feasible, are conducted to assess the performance of the proposed algorithm in estimating five source parameters. The algorithm successfully provided a good approximation of the posterior distribution and accelerated the estimation of the oil spill source parameters and their uncertainties roughly 100-fold.
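The general mechanism can be illustrated with the minimal Python sketch below (a toy one-parameter problem with a polynomial surrogate and a random-walk Metropolis sampler; the forward model, prior, and enrichment rule are illustrative assumptions and do not reproduce the algorithm of [8]): the surrogate is refit after each enrichment of the design with points drawn from the current approximate posterior, so that its accuracy concentrates where the posterior mass lies.

import numpy as np

rng = np.random.default_rng(4)
truth, sigma = 0.7, 0.05

def model(theta):                        # hypothetical expensive forward model
    return np.sin(4.0 * theta) + theta ** 2

data = model(truth) + sigma * rng.standard_normal()

def log_post(theta, surrogate):          # flat prior on [0, 1], Gaussian likelihood
    lp = -0.5 * ((data - surrogate(theta)) / sigma) ** 2
    return np.where((theta < 0.0) | (theta > 1.0), -np.inf, lp)

def metropolis(surrogate, n=5000, step=0.1):
    chain, x = [], 0.5
    lp = log_post(x, surrogate)
    for _ in range(n):
        y = x + step * rng.standard_normal()
        lpy = log_post(y, surrogate)
        if np.log(rng.uniform()) < lpy - lp:
            x, lp = y, lpy
        chain.append(x)
    return np.array(chain[n // 2:])       # discard burn-in

# Start from a coarse prior-based design, then enrich it with posterior samples.
design = rng.uniform(0.0, 1.0, 6)
for it in range(4):
    coef = np.polyfit(design, model(design), deg=min(5, design.size - 1))
    surrogate = lambda t, c=coef: np.polyval(c, t)
    chain = metropolis(surrogate)
    new_points = rng.choice(chain, size=3)          # new runs of the true model, placed
    design = np.concatenate([design, new_points])   # in the high-posterior region
    print(f"iteration {it}: design size = {design.size}, surrogate posterior mean = {chain.mean():.3f}")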
The third contribution [10] introduces a cost-efficient surrogate-based methodology employing an unequal allocation scheme to model a stochastic solver. The method is then applied to estimate material properties at the mesoscopic level using the PuMA software, from an uncertainty quantification perspective. An additional contribution of this work is the uncertainty propagation and sensitivity analysis of the material properties, which also yields a systematic assessment of the choice of the voxel resolution for both the fibers and the domain. Precisely, the convergence of the quantities of interest can be monitored, thus identifying the minimal reference elementary volume.
The last contribution concerns the application of advanced UQ methodologies to a problem of interest in aerospace applications. In particular, we propose a holistic methodology [7] to determine the quantities of interest in an optimal manner for an under-expanded high-enthalpy jet, using both experimental measurements and high-fidelity flow simulations. Given the high computational cost of the high-fidelity simulations needed to describe the flow, we built an adaptive/multi-fidelity surrogate model to replace the costly computer solver. A Bayesian inference method then allowed characterizing an experiment carried out in the von Karman Institute's Plasmatron facility, for which no robust methodology currently exists. We show that the reservoir pressure and temperature and the nitrogen catalytic recombination coefficient of the copper probes can be accurately determined from the available measurements. In contrast, the test conditions do not allow us to estimate the oxygen catalytic recombination coefficient. Finally, the characterized uncertainties are propagated through the numerical solver, yielding an uncertainty-based high-fidelity representation of the variability of the hypersonic flow's structure.
Collaborations
For many years, we have maintained long-term partnerships with KAUST, the von Karman Institute for Fluid Dynamics (VKI), Politecnico di Milano, and CEA.
With KAUST, we are working on new stochastic particle tracking methods to identify and track oil spills in open waters, combining satellite images and uncertainties in predicted currents. We also develop new assimilation schemes, inference methods for fractional diffusion models, and the selection and reduction of observations. There are several joint publications and exchanges of students.
With VKI, we work on UQ methods and inverse problems for atmospheric re-entry and ablation problems. In terms of production, there are several joint publications and one joint PhD (M. Capriati).
With Politecnico di Milano, we have several activities in the aeronautical and energy fields. We work on the characterization of thermodynamic models with Bayesian approaches, uncertainty in turbulence models for RANS aerodynamic simulations, and multi-fidelity approaches. We are currently involved in the EU TRACES project. We also have a strong collaboration with Giulio Gori, a former member of Platon and now Assistant Professor at Politecnico di Milano.
With CEA Saclay, we have had a long-term collaboration for four years. Nicolas Leoni defended his thesis last year, and another student (Sanae Idrissi) is doing her PhD.
External support
- MSCA Doctoral Network TRACES Project (2022-2026)
- Industrial contracts with CEA
- Industrial contract with 3DS
Self assessment
In addition to developing methods-oriented research, we proposed UQ methods tailored to specific applications in collaboration with other academic and industrial partners. This action has allowed us to position ourselves with high-impact papers in many application areas.
A weakness may be finding a balance between two different axes. The first axis concerns the development of high-level research from a methodological point of view, while the second one involves collaborations with industrial partners within research contracts and European projects. We think that the team's current size is not well suited, in the long term, to sustaining this double effort. For this reason, recruiting new members seems mandatory to keep a good balance between these two main axes of research.
8.2 Research axis 2: Solvers, Numerical Schemes and HPC
Personnel
Participants: P.M. Congedo, O. Le Maître, E. Denimal Goy, M. Duvillard, H. Dornier, H. Masson, C. Papagiannis.
Project-team positioning
Research on solvers, numerical schemes, and HPC algorithms specifically dedicated to UQ problems is scarce. Indeed, advanced sampling and stochastic estimation procedures, the subject of intensive ongoing research, rely on state-of-the-art deterministic solvers to generate the solution samples. To our knowledge, there is no research group (within or outside Inria) focusing entirely on the computational aspects of UQ problems. Groups producing computational utilities for UQ (e.g., Sandia's Dakota, OpenTurns) focus on the sampling part (statistical treatment), and the efficient generation of the samples is left to the user. In recent years, a few works have concerned Galerkin solvers, their preconditioning, and the adaptation of domain decomposition methods (DDM) for (usually elliptic) stochastic PDEs. We can mention some activities in Manchester (preconditioning), Munich and Lausanne (DDM), and Bath (solvers for multi-level methods). In Platon, we try to exploit the structure of the stochastic problems to propose new strategies for their resolution (Galerkin method) or for the generation of solution samples. These strategies can consist in adapting deterministic solvers to factorize the computational effort over multiple samples or, on the contrary, in defining entirely new solution procedures to better exploit parallel methods in stochastic problems, beyond the independent resolution of independent samples. Our objective is to produce parallel and scalable methods for large-scale stochastic problems.
It becomes more and more critical to devise solution methods tailored to the stochastic problem as the numerical complexity of the underlying deterministic problem increases. For elliptic problems, the availability of highly efficient deterministic solvers has somewhat limited the research on stochastic solvers. The situation is different for models based on fractional diffusion operators (in space or time), where the numerical difficulty of solving these operators has virtually prevented any work on problems with stochastic fractional and diffusion coefficients. A few years ago, KAUST (Omar Knio) and KFUPM (Kassem Mustapha) initiated a research program on fractional diffusion models. Platon is involved in this program to deal with the stochastic extensions. Several new numerical schemes and algorithms to solve deterministic fractional diffusion equations have been designed. These schemes are suitable for an extension to stochastic problems (e.g., allowing for spatially variable coefficients and achieving efficient scalability, enabling sampling methods and inverse problems).
Scientific achievements
When numerically solving partial differential equations, for a given problem and operating condition, adaptive mesh refinement (AMR) has proven its efficiency in automatically building a discretization that achieves a prescribed accuracy at low cost. However, with continuously varying operating conditions, such as those encountered in uncertainty quantification, adapting a mesh for each evaluated condition becomes complex and computationally expensive. To enable more effective error and cost control, we introduce a novel approach to mesh adaptation [23]. The method consists in building a unique adapted mesh that aims at minimizing the average error over a continuous set of operating conditions. In the proposed implementation, this unique mesh is built iteratively, informed by an estimate of the local average error over a reduced set of sample conditions. The effectiveness and performance of the method are demonstrated on a one-dimensional Burgers equation and a two-dimensional Euler scramjet shocked-flow configuration.
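Schematically, and in generic notation of ours rather than that of [23], the unique mesh targets the average error over operating conditions $\mu$ distributed according to a density $\rho$,
\[
h^\star \;=\; \arg\min_{h\in\mathcal{H}_N}\; \mathbb{E}_{\mu\sim\rho}\big[\eta(h,\mu)\big]
\;\approx\; \arg\min_{h\in\mathcal{H}_N}\; \frac{1}{M}\sum_{m=1}^{M}\eta(h,\mu_m),
\]
where $\eta(h,\mu)$ denotes a local error estimate for the mesh $h$ at condition $\mu$, $\mathcal{H}_N$ the set of admissible meshes for a prescribed budget, and $\{\mu_m\}_{m=1}^{M}$ the reduced set of sample conditions driving the iterative construction.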
Secondly, a preconditioning strategy is proposed for the iterative solution of the large number of linear systems with variable matrix and right-hand side that arise during the computation of solution statistics of stochastic elliptic partial differential equations with random variable coefficients sampled by Monte Carlo [26]. Building on the assumption that a truncated Karhunen-Loève expansion of a known transform of the random variable coefficient is available, we introduce a compact representation of the random coefficient in the form of a Voronoi quantizer. The number of Voronoi cells is set to the prescribed number of preconditioners. Upon sampling the random variable coefficient, the linear system assembled with a given realization of the coefficient is solved with the preconditioner whose centroidal variable coefficient is closest to the realization. We consider different ways to define and obtain the centroidal variable coefficients, and we investigate the properties of the induced preconditioning strategies in terms of the average number of solver iterations for sequential simulations, and of load balancing for parallel simulations. Another approach, based on deterministic grids in the stochastic coordinates of the truncated representation of the random variable coefficient, is proposed, with a stochastic dimension that increases with the number P of preconditioners. This approach bypasses the need for preliminary computations to determine the optimal stochastic dimension of the truncated approximation of the random variable coefficient for a given number of preconditioners.
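The online selection step can be illustrated with the minimal Python sketch below, in which each sampled coefficient is represented by its truncated KL coordinates and the solve uses the preconditioner attached to the nearest of P precomputed centroids; the dimension, the Lloyd-type clustering, and the omitted preconditioner construction are illustrative assumptions, not the implementation of [26].

import numpy as np

rng = np.random.default_rng(5)
d, P = 4, 8                                    # truncated KL dimension, number of preconditioners

# Offline stage: choose P centroids in the KL-coordinate space (Lloyd-type iterations here).
train = rng.standard_normal((2000, d))         # training samples of the KL coordinates
centroids = train[rng.choice(train.shape[0], P, replace=False)]
for _ in range(20):
    labels = np.argmin(((train[:, None, :] - centroids[None, :, :]) ** 2).sum(-1), axis=1)
    centroids = np.array([train[labels == k].mean(0) if np.any(labels == k) else centroids[k]
                          for k in range(P)])
# ... one preconditioner would be built and stored per centroidal coefficient (omitted).

# Online stage: for each Monte Carlo sample, reuse the preconditioner of the closest centroid.
def preconditioner_index(xi):
    return int(np.argmin(((centroids - xi) ** 2).sum(-1)))

samples = rng.standard_normal((10, d))
print("selected preconditioners:", [preconditioner_index(xi) for xi in samples])

Only the nearest-centroid assignment is shown; the questions studied in [26] concern how the centroidal coefficients are defined and how evenly the samples distribute over the P cells.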
The third achievement focuses on accelerating numerical simulations when complex physical models need to be systematically evaluated [13]. We focus on a hypersonic planetary reentry problem whose simulation involves coupling fluid dynamics and chemical reactions. Simulating the chemical reactions takes most of the computational time but cannot be avoided if accurate predictions are to be obtained. We face a trade-off between cost-efficiency and accuracy: the numerical scheme has to be sufficiently efficient to be used in an operational context, but accurate enough to predict the phenomenon faithfully. To tackle this trade-off, we design a hybrid numerical scheme coupling a traditional fluid dynamics solver with a neural network approximating the chemical reactions. We rely on the accuracy and dimension-reduction power of neural networks applied in a big-data context, and on the efficiency stemming from their matrix-vector structure, to achieve important acceleration factors (×10 to ×18.6). The paper explains how to design such cost-effective hybrid numerical schemes in practice. Above all, we describe methodologies to ensure accuracy guarantees, allowing us to go beyond traditional surrogate modeling and to use these schemes as references.
Another contribution is a comprehensive review of recent advancements in modelling approaches, design strategies, and testing techniques applied to friction damping in turbomachinery [14]. It critically evaluates experimental testing, design processes, and optimisation studies, along with the latest developments in numerical modelling techniques. The review begins with an overview of vibration mitigation methods and the historical development of friction dampers for bladed disk systems. Subsequent sections explore research efforts aimed at enhancing numerical and simulation modelling capabilities, encompassing contact friction models, reduced-order modelling methods, and numerical solvers suitable for real-world applications and industrial high-fidelity models. The paper also delves into available testing rigs for the experimental validation and characterisation of various friction damper types, as well as the literature on uncertainty quantification in friction damping. It concludes by highlighting recent trends in novel concepts, modelling techniques, and testing technologies shaping the design of next-generation friction dampers.
The final contribution concerns a novel approach to applying data assimilation techniques to particle-based simulations using the Ensemble Kalman Filter [24]. While data assimilation methods have been effectively applied to Eulerian simulations, their application to Lagrangian solution discretizations has not been properly explored. We introduce two specific methodologies to address this gap. The first methodology employs an intermediary Eulerian transformation that combines a projection with a remeshing process. The second is a purely Lagrangian scheme designed for situations where remeshing is not appropriate. These methods are evaluated using a one-dimensional advection-diffusion model with periodic boundaries. Performance benchmarks for the one-dimensional scenario are conducted against a grid-based assimilation filter. Subsequently, the assimilation schemes are applied to a non-linear two-dimensional incompressible flow problem, solved via the Vortex-In-Cell method. The results demonstrate the feasibility of applying these methods in more complex scenarios, highlighting their effectiveness in both the one-dimensional and two-dimensional contexts.
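For reference, the analysis step of a stochastic Ensemble Kalman Filter is sketched below in its generic textbook form (the state dimension, observation operator H, noise covariance R, and random ensemble are illustrative placeholders; the particle-based projection and remeshing ingredients of [24] are not reproduced here).

import numpy as np

rng = np.random.default_rng(6)
n_state, n_obs, n_ens = 50, 5, 40

H = np.zeros((n_obs, n_state))                       # observe every 10th state component
H[np.arange(n_obs), np.arange(0, n_state, 10)] = 1.0
R = 0.1 ** 2 * np.eye(n_obs)                         # observation-error covariance
ensemble = rng.standard_normal((n_state, n_ens))     # forecast ensemble (columns = members)
y_obs = rng.standard_normal(n_obs)                   # observation vector (illustrative)

def enkf_analysis(E, y, H, R):
    n = E.shape[1]
    A = E - E.mean(axis=1, keepdims=True)            # state anomalies
    HA = H @ A                                       # observed anomalies
    Pyy = HA @ HA.T / (n - 1) + R                    # innovation covariance
    Pxy = A @ HA.T / (n - 1)                         # state-observation covariance
    K = np.linalg.solve(Pyy.T, Pxy.T).T              # Kalman gain K = Pxy Pyy^{-1}
    Yp = y[:, None] + np.linalg.cholesky(R) @ rng.standard_normal((len(y), n))
    return E + K @ (Yp - H @ E)                      # update with perturbed observations

analysis = enkf_analysis(ensemble, y_obs, H, R)
print("forecast spread:", ensemble.std(), " analysis spread:", analysis.std())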
Collaborations
With KAUST, we worked with Omar Knio on numerical schemes for fractional diffusion equations and their extension to the stochastic case.
With CEA-CESTA, we worked on scientific machine learning techniques within the thesis of Paul Novello.
External support
- Industrial contracts with CEA
- Industrial contract with 3DS
- Industrial contract with Framatome
Self assessment
Concerning the fractional diffusion models, we are already engaged in the extension of the hierarchical matrix method to solve the spatial stochastic fractional diffusion equation with a Galerkin method, and in the development of sparse storage strategies to reduce the complexity of the stochastic time-fractional problem. This is promising and very original research. We (Platon) depend on the collaboration for access to some of the numerical utilities (H-matrices).
8.3 Research axis 3: Optimization under uncertainty
Personnel
Participants: P.M. Congedo, O. Le Maître, E. Denimal Goy, Z. Jones, H. Nicolas.
Project-team positioning
Optimization under uncertainty is an important research axis, due to both the ever-growing computational power available and the need for efficiency, reliability, and cost optimality. The presence of uncertainty can make the solution of a deterministic optimization problem suboptimal or even infeasible. Since this behavior can strongly impact design performance, both academia and industry have focused their efforts on developing optimization-under-uncertainty methodologies. Optimization under uncertainty is a broad domain including several modeling paradigms, such as stochastic programming, Reliability-Based Design Optimization (RBDO, which deals with probabilistic and worst-case feasibility constraints), and Robust Design Optimization (RDO, where the deterministic objectives are replaced with averaged or worst-case ones, possibly in a multi-objective context such as classical Taguchi optimization).
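In generic textbook form (not a specification of Platon's formulations), the last two paradigms can be written as
\[
\text{RDO:}\;\; \min_{x}\ \Big(\mathbb{E}_{\xi}\big[f(x,\xi)\big],\ \mathrm{Var}_{\xi}\big[f(x,\xi)\big]\Big),
\qquad
\text{RBDO:}\;\; \min_{x}\ \mathbb{E}_{\xi}\big[f(x,\xi)\big]
\;\;\text{s.t.}\;\; \mathbb{P}_{\xi}\big[g(x,\xi)\le 0\big]\ \ge\ 1-\varepsilon,
\]
where $x$ denotes the design variables, $\xi$ the uncertain inputs, $f$ the performance, $g$ a constraint, and $\varepsilon$ a prescribed failure probability; the cost of estimating the statistics of $f$ and $g$ at each candidate design is what motivates the surrogate-based and multi-fidelity strategies discussed below.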
Note that most of the groups active in optimization under uncertainty also have strong activities in uncertainty quantification, so there is an overlap with the state of the art presented in Section 8.1. The Optimization & Uncertainty Quantification Group of Sandia-Albuquerque aims at providing advanced methods for the resolution of optimization-under-uncertainty problems. We also mention optimization-under-uncertainty activities that have emerged in well-established groups working in specific application domains, such as the Aerospace Computational Design Laboratory at MIT and the Institute for Computational Engineering and Sciences at the University of Texas. In France, we can mention the OQUAIDO Chair (Optimization and QUAntification of Uncertainties), hosted by the École des Mines de Saint-Étienne from 2016 to 2021, which aimed to bring together academic and industrial partners to solve problems related to uncertainty quantification, inversion, and optimization.
In the context of optimization under uncertainty, Platon is devoted to developing novel methods to tackle constrained multi-objective optimization, with specific attention to cost-efficient and mainly derivative-free strategies. Specifically, we look for an optimal trade-off between computational cost and accuracy in problems involving complex and expensive numerical solvers. Platon is also exploring dedicated representations and the design of computer experiments to obtain the best estimation at the lowest cost (or for a prescribed computational budget) for nontrivial goals, specifically optimization and reliability problems where the accuracy needed is not uniform, possibly unknown a priori, and to be estimated as the construction proceeds. More recently, we have also worked on sample average approximation methods using risk-averse stochastic programming formulations.
Several Inria teams have optimization as a core activity, for example BONUS, EDGE, INOCS, POLARIS, and RANDOPT. The main difference is that we are not interested in working on generic optimization algorithms, as mentioned before. In our past and current works, we use standard optimization algorithms, mainly for continuous optimization. We focus our attention on dedicated representations to efficiently estimate uncertainty-based metrics within an optimization problem. The Inria teams POLARIS and INOCS work on innovative methods for stochastic optimization that are quite different from those proposed by Platon.
Scientific achievements
The first contribution concerns a computational framework for controlling the location of isolated response curves, i.e., responses that are not connected to the main solution branch and form a closed curve in parameter space [12]. The methodology relies on bifurcation tracking to follow the evolution of fold bifurcations in a codimension-2 parameter space. Singularity theory is used to distinguish points of isola formation and merger from codimension-2 bifurcations, and an optimization problem is formulated to delay or advance the onset or merger of isolated response curves, or to control their position in the state/parameter space. We illustrate the methodology on three examples: a finite element model of a cantilever beam with a cubic nonlinearity at its tip, a two-degree-of-freedom oscillator with asymmetry, and a two-degree-of-freedom base-excited oscillator exhibiting multiple isolas. Our results show that the location of points of isola formation and merger can effectively be controlled through structural optimization.
The second contribution illustrates a framework for the robust optimization of the heat flux distribution of an anti-ice electrothermal ice protection system under uncertain conditions [9]. The considered uncertainty regards a lack of knowledge concerning the characteristics of the cloud, i.e., the liquid water content and the median volume diameter of the water droplets, and the accuracy of measuring devices, i.e., the static temperature probe. Uncertain parameters are modeled as random variables, and two sets of bounds are investigated. A forward uncertainty propagation analysis is carried out using a Monte Carlo approach exploiting surrogate models. The optimization framework relies on a gradient-free algorithm (mesh adaptive direct search), and two different objective functions are considered, namely the 95% quantile of the freezing mass rate and the statistical frequency of the fully evaporative operating regime. The framework is applied to a reference test case, revealing a potential to improve the heat flux distribution of the baseline design. A new heat flux distribution is proposed; it presents a more efficient use of the thermal power, increasing flight safety even at non-nominal environmental conditions.
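The kind of robust objective used above can be illustrated with the minimal Python sketch below, in which sampled uncertain inputs are propagated through a cheap surrogate and a high quantile of the output is returned as the quantity to minimize; the surrogate, the input bounds, and the candidate designs are illustrative assumptions, not the model of [9].

import numpy as np

rng = np.random.default_rng(7)

def surrogate(design, lwc, mvd, temp):       # hypothetical surrogate of the icing solver
    return np.maximum(0.0, 1.0 - design) * lwc * mvd / (temp + 40.0)

def robust_objective(design, n_mc=20000, q=0.95):
    lwc = rng.uniform(0.2, 1.0, n_mc)        # liquid water content (illustrative bounds)
    mvd = rng.uniform(15.0, 40.0, n_mc)      # median volume diameter of the droplets
    temp = rng.normal(-10.0, 2.0, n_mc)      # static temperature
    output = surrogate(design, lwc, mvd, temp)   # e.g. a freezing mass rate
    return np.quantile(output, q)            # 95% quantile, to be minimized over the design

# A derivative-free optimizer (e.g. mesh adaptive direct search) would query this objective;
# here we simply compare a few candidate designs.
for design in (0.2, 0.5, 0.8):
    print(f"design = {design:.1f}  95%-quantile objective = {robust_objective(design):.3f}")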
Collaborations
We worked with DLR (German Aerospace Center) within the EU NEXTAIR project on robust optimization and within the thesis of Zachary Jones.
Several collaborations are with Politecnico di Milano within the TRACES project.
The collaboration with KAUST has brought specific stochastic optimization problems with structures that differ considerably from our other research (e.g., two-stage optimization, introduction of recourse, discrete optimization, ...). These problems also involve different risk mitigation approaches. Working on these problems, we have learned alternative formulations and uncertainty treatments that we plan to apply to engineering applications. Similarly, we have contributed sampling and uncertainty modelling strategies that are original for these types of problems.
External support
- MSCA Doctoral Network TRACES Project (2022-2026)
- EU NEXTAIR Project (2022-2026)
- Industrial contract with Bañulsdesign
Self assessment
Concerning the strong points, we have proposed advanced, state-of-the-art methods covering different aspects of optimization under uncertainty, topics of great interest in academia. At the same time, we have consolidated industrial collaborations that allow us to develop high-impact projects of clear societal relevance.
Concerning a potential weakness, we think it is particularly challenging, given the size of the team, to keep proposing innovative methods and, at the same time, to contribute to projects at the industrial and European scale. New recruitments seem necessary to ensure this twofold effort.
9 Bilateral contracts and grants with industry
Participants: P.M. Congedo, O. Le Maitre, M. Pocheau, M. Benmahdi, S. Idrissi, H. Khatouri.
9.1 Bilateral contracts with industry
9.1.1 3DS
Since 2022, the team benefits from a "contrat d'accompagnement" for the thesis of Meryem Benmahdi.
9.1.2 CEA
Since 2022, the team benefits from a "contrat d'accompagnement" for the thesis of Marius Duvillard, and from a "contrat d'accompagnement" for the thesis of Sanae Idrissi.
9.1.3 Framatome
Since 2023, the team benefits from a research contract in the context of the Pré-Defi project with Framatome.
10 Partnerships and cooperations
10.1 International initiatives
10.1.1 Inria associate team not involved in an IIL or an international program
HYPATIE
-
Title:
Numerical Methods for the Design and Monitoring of Vibrations in Nonlinear Structures in the presence of Uncertainty
-
Duration:
2024 ->
-
Coordinator:
Enora Denimal Goy (enora.denimal-goy@inria.fr)
-
Partners:
- Imperial College London, London (United Kingdom)
-
Inria contact:
Enora Denimal Goy
-
Summary:
Mechanical structures, such as aircraft, trains, space systems, etc., exhibit intrinsic non-linearities leading to complex dynamic phenomena that are rarely taken into account, in terms of design and monitoring, in industrial applications due to their complexity. Recent developments have made it possible to calculate or experimentally characterise these non-linear characteristics directly, opening up a wide range of possibilities for the design and monitoring of structures. However, these quantities are more sensitive to uncertainties. In this context, this project aims to develop a numerical framework for the robust design and monitoring of mechanical structures with non-linear dynamics. The new advanced algorithms to be developed will combine surrogate models, data-inferred stochastic modelling and uncertainty propagation through computer codes and experiments. The applications of interest will be non-linear dynamic mechanical structures of industrial interest, such as aero-engine blades.
10.2 International research visitors
10.2.1 Visits of international scientists
Other international visits to the team
Alberto Guardone
-
Status
Professor
-
Institution of origin:
Politecnico di Milano
-
Country:
Italy
-
Dates:
December 2024
-
Context of the visit:
Research Collaboration
-
Mobility program/type of mobility:
Research stay
Giulio Gori
-
Status
Assistant Professor
-
Institution of origin:
Politecnico di Milano
-
Country:
Italy
-
Dates:
December 2024
-
Context of the visit:
Research Collaboration
-
Mobility program/type of mobility:
Research stay
Omar Knio
-
Status
Professor
-
Institution of origin:
KAUST
-
Country:
Saudi Arabia
-
Dates:
September 2024
-
Context of the visit:
Research Collaboration
-
Mobility program/type of mobility:
Research stay
Thiago Ritto
-
Status
Professor
-
Institution of origin:
Universidade Federal do Rio de Janeiro (UFRJ)
-
Country:
Brazil
-
Dates:
October 2024
-
Context of the visit:
Research Collaboration
-
Mobility program/type of mobility:
Research stay
10.2.2 Visits to international teams
Research stays abroad
Enora Denimal Goy
-
Visited institution:
Imperial College London
-
Country:
United Kingdom
-
Dates:
July 2024
-
Context of the visit:
Research collaboration
-
Mobility program/type of mobility:
Research stay in the context of the Associate Team HYPATIE
Hugo Masson
-
Visited institution:
Imperial College London
-
Country:
United Kingdom
-
Dates:
December 2024
-
Context of the visit:
Research collaboration
-
Mobility program/type of mobility:
Research stay in the context of the Associate Team HYPATIE
10.3 European initiatives
10.3.1 Horizon Europe
NEXTAIR
NEXTAIR project on cordis.europa.eu
-
Title:
NEXTAIR - multi-disciplinary digital - enablers for NEXT-generation AIRcraft design and operations
-
Duration:
From September 1, 2022 to August 31, 2025
-
Partners:
- INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE (INRIA), France
- THE UNIVERSITY OF SHEFFIELD (USFD), United Kingdom
- IMPERIAL COLLEGE OF SCIENCE TECHNOLOGY AND MEDICINE, United Kingdom
- AIRBUS OPERATIONS SAS (AIRBUS OPERATIONS), France
- ETHNICON METSOVION POLYTECHNION (NATIONAL TECHNICAL UNIVERSITY OF ATHENS - NTUA), Greece
- SAFRAN SA, France
- UNIVERSITA DEGLI STUDI DI CAGLIARI (UNICA), Italy
- OFFICE NATIONAL D'ETUDES ET DE RECHERCHES AEROSPATIALES (ONERA), France
- DEUTSCHES ZENTRUM FUR LUFT - UND RAUMFAHRT EV (DLR), Germany
- FUNDACION CENTRO DE TECNOLOGIAS DE INTERACCION VISUAL Y COMUNICACIONES VICOMTECH (VICOM), Spain
- DASSAULT AVIATION, France
- OPTIMAD ENGINEERING SRL (Optimad srl), Italy
- IRT ANTOINE DE SAINT EXUPERY, France
- ERDYN CONSULTANTS SAS, France
- ROLLS-ROYCE PLC, United Kingdom
-
Inria contact:
Pietro Congedo
-
Coordinator:
Marco Carini (ONERA)
-
Summary:
Radical changes in aircraft configurations and operations are required to meet the target of climate-neutral aviation. To foster this transformation, innovative digital methodologies are of utmost importance to enable the optimisation of aircraft performances.
NEXTAIR will develop and demonstrate innovative design methodologies, data-fusion techniques and smart health-assessment tools enabling the digital transformation of aircraft design, manufacturing and maintenance. NEXTAIR proposes digital enablers covering the whole aircraft life-cycle, devoted to easing the maturation of breakthrough technologies, their flawless entry into service and smart health assessment. They will be demonstrated in 8 industrial test cases, representative of multi-physics industrial design and maintenance problems and of environmental challenges, and of direct interest to aircraft and engine manufacturers.
NEXTAIR will increase high-fidelity modelling and simulation capabilities to accelerate and derisk new disruptive configurations and breakthrough technologies design. NEXTAIR will also improve the efficiency of uncertainty quantification and robust optimisation techniques to effectively account for manufacturing uncertainty and operational variability in the industrial multi-disciplinary design of aircraft and engine components. Finally, NEXTAIR will extend the usability of machine learning-driven methodologies to contribute to aircraft and engine components' digital twinning for smart prototyping and maintenance.
NEXTAIR brings together 16 partners from 6 countries specialised in various disciplines: digital tools, advanced modelling and simulation, artificial intelligence, machine learning, aerospace design, and innovative manufacturing. The consortium includes 9 research organisations, 4 leading aeronautical industries providing digital-physical scaled demonstrator aircraft and engines, and 2 high-tech SMEs providing expertise in industrial scientific computing and data intelligence.
TRACES
TRACES project on cordis.europa.eu
-
Title:
TRAining the next generation of iCE researcherS
-
Duration:
From December 1, 2022 to November 30, 2026
-
Partners:
- ECOLE POLYTECHNIQUE (EP), France
- SAFRAN AEROSYSTEMS (SAFRAN AEROSYSTEMS SAS), France
- AIRBUS HELICOPTERS, France
- SAFRAN AIRCRAFT ENGINES, France
- INSTITUT POLYTECHNIQUE DE PARIS, France
- AIRBUS OPERATIONS SAS (AIRBUS OPERATIONS), France
- INSTITUT SUPERIEUR DE L'AERONAUTIQUE ET DE L'ESPACE (ISAE-Supaero), France
- AIRBUS DEFENCE AND SPACE GMBH, Germany
- TECHNISCHE UNIVERSITAET BRAUNSCHWEIG, Germany
- OFFICE NATIONAL D'ETUDES ET DE RECHERCHES AEROSPATIALES (ONERA), France
- ACCADEMIA EUROPEA DI BOLZANO (Eurac Research), Italy
- DASSAULT AVIATION, France
- POLITECNICO DI MILANO (POLIMI), Italy
- TECHNISCHE UNIVERSITAT DARMSTADT, Germany
- LEONARDO - SOCIETA PER AZIONI (LEONARDO), Italy
- GENERAL ELECTRIC DEUTSCHLAND HOLDING GMBH, Germany
-
Inria contact:
Pietro Congedo
-
Coordinator:
Alberto Guardone (Politecnico di Milano)
-
Summary:
In 2019, the European Aviation Safety Agency (EASA) identified in-flight icing as a priority 1 issue for large aeroplanes, with the aggregated European Risk Classification Scheme score being amongst the highest of all safety issues. In-flight icing can occur when an aircraft flies through clouds of supercooled droplets, namely drops of liquid water with a temperature below the freezing point, which freeze upon impact. Aircraft icing can lead to a reduction of visibility, damage due to ice shedding, blockage of probes and static vents, reduced flight performance, engine power loss, etc. In addition to safety concerns, in-service icing events can lead to major disruptions of air operations and aircraft maintenance. The more frequent occurrence of severe thunderstorms due to climate change results in more in-flight accidents, also at cruising altitudes, with more than 100 engine failures in recent years. Icing-related issues have recently been observed in newer, more efficient aircraft engines due to their lower operating temperature. The main goal of the TRACES EJD is to provide high-level training in the field of in-flight icing in order to deliver a new generation of high-achieving Early Stage Researchers in the diverse disciplines necessary for mastering the complexity of ice accretion and its mitigation in aircraft and aero-engines. This goal will be achieved by a unique combination of hands-on research training, non-academic placements at major EU aviation industries, and courses and workshops on scientific and complementary so-called soft skills, facilitated by the academic/non-academic composition of the consortium. Innovative Ice Detection and Ice Protection Systems based on disruptive technologies will be designed by the ESRs during the Project Working Groups. EASA will provide training on certification procedures and, together with major industries in the field, will assess the ESRs' projects during a team Design and Certify exercise.
10.4 National initiatives
10.4.1 ANR LabCom
MATritime:
-
Title:
Optimisation Robuste et Jumeaux Numériques pour la Transition Maritime - MATritime
-
Duration:
From April 1, 2023 to August 1, 2027
-
Partners:
- INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE (INRIA), France
- Bañulsdesign, France
-
Inria contact:
Olivier Le Maître
-
Coordinator:
INRIA
-
Summary:
The maritime sector faces significant challenges: imposed reductions in the energy footprint of maritime transport, the advent of new modes of propulsion (sail, hydrogen), automation, digitization, ... At the same time, the numerical/digital revolution in naval design processes requires mastering multiple complex domains specific to an uncertain environment made up of the sea, the atmosphere, and their interface. New advanced procedures are needed to meet the challenges of a more sustainable, greener, and more robust maritime industry. Meeting these challenges requires a considerable evolution of engineering practices, with the establishment of dedicated processes in Computational Science and Engineering (CSE) based on advanced numerical simulation technologies combining physical and statistical models.
Indeed, even as computing resources increase, the limitations of physical models and the cost of high-fidelity approaches restrict simulations to a few nominal configurations. However, concentrating the simulation effort on a nominal system may be insufficient if the real-world system differs from the simulated one (due to manufacturing tolerances, random intrinsic effects, model error, poorly known environments, ...). In these situations, it is crucial to objectively quantify the uncertainties of the numerical predictions induced by errors in the system's specification and in the model, and to account for all these uncertainties during analyses and decision-making. This characterization makes it possible to design more robust systems reaching better levels of performance in actual conditions.
The project proposes to develop a holistic approach to uncertainties by equipping numerical predictions with probability laws. Depending on the quality of the probabilistic representation, the computational overhead to estimate the prediction uncertainty can be very large. For example, Monte Carlo sampling methods require many simulations to estimate the variance of predictions, with a prohibitive cost when applied directly to detailed physical models. To overcome these limitations without giving up accurate physics, one has to resort to efficient approaches producing probabilistic predictions at an acceptable cost. For this, we plan to develop methodologies closely associating physical and statistical modeling (e.g., multi-fidelity methods, multilevel Monte Carlo, surrogate models, design of numerical experiments). All these methods, as opposed to purely statistical methods (such as artificial intelligence), incorporate physical simulations into the statistical processing that produces the prediction; in return, their development requires a great deal of interaction with physical-simulation experts. Our objective is to deploy these numerical approaches and to propose advanced uncertainty analyses, robust predictions, and design strategies for maritime applications. These complex applications will lead to developing research in robust multidisciplinary (subsystem-based) and multi-objective design strategies covering ship design, from component to system optimization. We will also set up a prototype of a ship's digital twin, integrating models and data to support the digitization of the maritime world and to prepare future tools for operational issues (optimization of missions, routes, maintenance operations, ...).
10.4.2 ANR JCJC
MeMoRa:
-
Title:
Advanced surrogate modelling methods for early damage detection in uncertain nonlinear rotors based on antiresonances
-
Duration:
From April 1, 2024 to August 1, 2028
-
Partners:
- INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE (INRIA), France
-
Inria contact:
Enora Denimal Goy
-
Coordinator:
INRIA
-
Summary:
The MeMoRa project aims to develop a predictive maintenance methodology for large nonlinear vibrating mechanical structures by exploiting their antiresonances to detect and localize the onset of defects at an early stage, while accounting for the numerous uncertainties in which the structure operates. To make such an approach numerically tractable, surrogate-modelling strategies and physics-enriched designs of experiments will be developed to reconstruct the spatial and frequency behaviour of the antiresonances of uncertain mechanical structures. The first-order frequency response functions (FRFs), and higher-order ones in the nonlinear case, can then be fully reconstructed at any point of the structure while accounting for modelling uncertainties. MeMoRa will thus make it possible to define new robust health indicators based on antiresonances, which are highly sensitive to defects unlike classical indicators, in order to detect and localize the early onset of defects in an uncertain context. These new indicators and the previously developed methods will be coupled with a robust sensor-placement optimization strategy to maximize defect detectability under uncertainty. The project will focus on the case of rotors with a breathing crack and is expected to extend to complex industrial structures (turbomachinery, wind turbines, bridges).
10.4.3 France 2030
MEDITWIN:
-
Title:
Jumeau virtuel pour le futur du soin médical - MEDITWIN
-
Duration:
From September 1, 2024 to August 1, 2029
-
Partners:
- INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE (INRIA), France
- Dassault Systèmes, France
-
Inria contact:
Pietro Marco Congedo
-
Coordinator:
INRIA
-
Summary:
The MEDITWIN project leverages the expertise of world-class partners in each of the fields it covers, bringing together the 14 founding members of the consortium: Dassault Systèmes, serving as the industrial leader of the consortium; seven University Hospital Institutes (IHUs) recognized for their medical and scientific excellence; the Nantes University Hospital via the Thorax Institute; startups inHEART, Codoc, Qairnel, and Neurometers; and Inria through 11 project teams actively engaged in the initiative.
Digital twins have become indispensable in the aerospace and mobility industries, where virtualization has led to significant advances in safety, quality, ecological impact, and economic competitiveness. MEDITWIN builds on the extensive experience of its partners in the domain of digital twins for healthcare, including Dassault Systèmes' Living Heart initiative, the Living Brain initiative, and projects within the Digital Health PEPR co-led by Inria and INSERM, to name a few examples.
MEDITWIN aims to industrialize, clinically validate, and standardize these initiatives so that these technologies can be deployed in a standardized manner and benefit the widest possible audience. The best standards of care will thus be codified into virtualized experiences, accessible worldwide, creating a new benchmark for healthcare quality and a pivotal learning platform for advancing medical science.
The benefits of digital twins will be assessed at the levels of medical teams, patients, and the healthcare system, focusing on improvements in care efficiency, the quality of multidisciplinary decision-making, and the efficacy and safety of medical practices and interventions.
PREMYOM:
-
Title:
Prise en charge et Ralentissement de l’Épidémie de Myopie par l’Optique Médicale - PREMYOM
-
Duration:
From September 1, 2024 to August 1, 2029
-
Partners:
- INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE (INRIA), France
- EssilorLuxottica, France
-
Inria contact:
Pietro Marco Congedo
-
Coordinator:
INRIA
-
Summary:
PREMYOM is a multidisciplinary consortium of 6 well-known partners from industry, healthcare and research, coordinated by EssilorLuxottica, bringing an unprecedented blend of technical, clinical, and digital expertise: Hôpital Fondation Adolphe de Rothschild, INRIA, InSimo, Institut Mines-Télécom, and Institut de la Vision.
The project has been selected for co-funding by the French Prime Minister's Secrétariat Général Pour l’Investissement (SGPI) and its operating agency Bpifrance as part of the France 2030 plan and i-Demo-2 State funding, highlighting the critical importance of addressing children's visual health as a major public health issue.
PREMYOM aims to slow the myopia epidemic by personalizing myopia control lenses and the care pathway for young myopic patients. The consortium will deliver a model of myopia progression by elucidating retinal mechanisms and leveraging real-world data from a unique cohort in Europe. A digital twin of the myopic eye, combined with innovative technologies for lenses, e-frames, and instruments, will provide an optimal individual solution, tested for both efficacy and user quality.
10.5 Public policy support
-
Title:
Action Exploratoire NO-BIF
-
Duration:
From December 1, 2022 to December 1, 2024
-
Partners:
- INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE (INRIA), France
- Imperial College London, UK
-
Inria contact:
Enora Denimal Goy
-
Coordinator:
Enora Denimal Goy
-
Summary:
Design methods for mechanical structures (aeronautics, space, transport, energy production, etc.) are changing drastically. New technologies such as 3D printing make it possible to build lighter and more efficient structures. The intrinsic presence of nonlinearities, which can lead to the appearance of bifurcations, makes the optimization of these mechanical structures complex. These bifurcations can have dramatic consequences (aircraft crashes) or be exploited in other contexts (MEMS sensors). The objective of the NO-BIF project is to develop numerical methods to suppress (or control) these bifurcation phenomena in industrial mechanical systems (transport, energy, space, etc.) through topology optimization (optimization of the material distribution), by integrating AI-type methods (surrogate modelling, statistical learning) and model-reduction methods.
11 Dissemination
Participants: P.M. Congedo, O. Le Maître, E. Denimal Goy.
11.1 Promoting scientific activities
11.1.1 Scientific events: organisation
- Enora Denimal Goy was a member of the organising and scientific committees of the GDR EX-MODELI 2024 workshop.
- Pietro Marco Congedo and Enora Denimal Goy organised and chaired the Winter of Codes of the TRACES EU project, Inria Saclay Centre, December 9-14, 2024.
11.1.2 Scientific events: selection
Member of the conference program committees
- Enora Denimal Goy and Pietro Marco Congedo co-organised a Mini-Symposium at the ECCOMAS 2024 conference.
- Olivier Le Maître is a member of the scientific committee of the UNCECOMP conference.
11.1.3 Journal
Member of the editorial boards
- Olivier Le Maître is a member of the editorial board of the International Journal for Uncertainty Quantification.
- Pietro Marco Congedo is an Editor of the journal Mathematics and Computers in Simulation (MATCOM), Elsevier.
Reviewer - reviewing activities
- Enora Denimal Goy was a reviewer for the following peer-reviewed journals: Journal of Engineering for Gas Turbines and Power, Mechanical Systems and Signal Processing (2), Mathematics and Computers in Simulation, Archives of Applied Mechanics, European Journal of Mechanics - A/Solids, Experimental Mechanics.
- Olivier Le Maître was a reviewer for the Journal of Computational Physics, Computer Methods in Applied Mechanics and Engineering, Probabilistic Engineering Mechanics, Computers and Fluids, International Journal for Uncertainty Quantification, and Journal of Scientific Computing, among others.
11.1.4 Invited talks
- Enora Denimal Goy gave an invited talk at the CSMA Junior 2024 workshop, Porquerolles, May 17, 2024.
- Pietro Marco Congedo gave an invited talk at the Opening Training School of the TRACES EU project, Politecnico di Milano, January 18, 2024.
- Pietro Marco Congedo gave an invited talk at the Second Training School of the TRACES EU project, TU Braunschweig, September 25, 2024.
- Pietro Marco Congedo gave an invited talk at the Advanced Computational Fluid Dynamics Methods for Hypersonic Flows - VKI LS and STO LS AVT-358, von Karman Institute for Fluid Dynamics, March 25, 2024.
- Pietro Marco Congedo gave an invited plenary talk at the NICFD conference, Ecole Centrale de Lyon, October 31, 2024.
- Pietro Marco Congedo gave an invited talk at the "Journées Scientifiques 2024 du réseau thématique RT Terre et Energies", Nouan-le-Fuzelier, November 4-8, 2024.
- Olivier Le Maître gave an invited talk at the International Workshop on Practical Data Assimilation and Uncertainty Quantification (PAUQ), Orléans, June 2024.
11.1.5 Leadership within the scientific community
- Enora Denimal Goy is a board member of the French GDR EX-MODELI.
- Enora Denimal Goy is an elected board member of the CSMA Junior (French Computational Structural Mechanics Association).
11.1.6 Research administration
- Pietro Marco Congedo is the Scientific Director of the Inria International Lab CWI-Inria.
- Olivier Le Maître is the Scientific Director of the MATritime Labcom.
- Enora Denimal Goy is the head of the Associate Team HYPATIE in collaboration with Imperial College London.
- Enora Denimal Goy is the PI of the ANR JCJC MeMoRa.
- Enora Denimal Goy was the PI of the Exploratory Action NO-BIF.
11.2 Teaching - Supervision - Juries
11.2.1 Teaching
- E. Denimal Goy, 2024: INSA Rennes, Graduate Level (10h/y), academic advisor of an apprentice.
- O. Le Maitre, 2024: Doctoral School SMEMAG, Graduate Level (22h/y), Course on Uncertainty Quantification Methods.
- O. Le Maître, 2024: Ecole Polytechnique, Dept. Applied Math, last year Engineering Degree, PC on Uncertainty and Risk Management (22h/y).
11.2.2 Supervision
- Pietro Marco Congedo was the co-advisor of the thesis of Michele Capriati, in collaboration with the von Karman Institute for Fluid Dynamics (Belgium), defended on January 24, 2024.
- Olivier Le Maître is the co-advisor of the thesis of Marius Duvillard in collaboration with CEA Cadarache.
- Olivier Le Maître is the co-advisor of the thesis of Nadège Polette in collaboration with CEA DAM.
- Pietro Marco Congedo and Olivier Le Maître are advisors of the thesis of Meryem Benmahdi, in collaboration with 3DS.
- Pietro Marco Congedo and Olivier Le Maître are advisors of the thesis of Sanae Idrissi Janati, in collaboration with CEA Saclay.
- Pietro Marco Congedo and Olivier Le Maître are advisors of the thesis of Zachary Jones.
- Pietro Marco Congedo and Olivier Le Maître are advisors of the thesis of Christos Papagiannis, in collaboration with LEGI Lab.
- Pietro Marco Congedo and Olivier Le Maître are advisors of the thesis of Hugo Dornier, in collaboration with ONERA.
- Pietro Marco Congedo and Olivier Le Maître are advisors of the thesis of Hugo Nicolas, in collaboration with Bañulsdesign (MATritime project).
- Pietro Marco Congedo, Olivier Le Maître and Enora Denimal Goy are advisors of the thesis of Omar Kahol.
- Enora Denimal Goy is the co-advisor of the thesis of Hugo Masson, in collaboration with Ecole des Ponts.
- Enora Denimal Goy is the co-advisor of the thesis of Nina Delette, in collaboration with IFPEN.
- Enora Denimal Goy is the co-advisor of the thesis of Erwan Dehillerin, in collaboration with Ecole Centrale de Lyon.
- Enora Denimal Goy has been the co-supervisor of the postdoc Adrien Mélot, in collaboration with Imperial College London.
- Enora Denimal Goy has been the co-supervisor of the postdoc Vincent Mahé, in collaboration with I4S (Inria Rennes).
- Enora Denimal Goy has been the co-supervisor of the MSc thesis of Diana Stancic, in collaboration with Imperial College London.
- Pietro Marco Congedo, Olivier Le Maître and Enora Denimal Goy are the supervisors of Hanane Khatouri, research engineer in the context of the Pré-DEFI project with Framatome.
11.2.3 Juries
- Enora Denimal Goy served as an examiner for the PhD defense of Lisa Fournier at ISAE-Supaero in December 2024.
- Pietro Marco Congedo served as a reviewer for the PhD defense of Piero Favaretti at Università di Trieste in April 2024.
- Pietro Marco Congedo served as a reviewer for the HdR defense of Maria Adela Puscas at Sorbonne Université in June 2024.
- Pietro Marco Congedo served as an examiner for the HdR defense of Maria Giovanna Rodio at Institut Polytechnique de Paris in June 2024.
- Pietro Marco Congedo served as the President of the Jury for the PhD defense of Paul Lartaud at the Institut Polytechnique de Paris in October 2024.
- Pietro Marco Congedo served as the President of the Jury for the PhD defense of Camille Matar at Sorbonne Université in October 2024.
- Pietro Marco Congedo served as the President of the Jury for the PhD defense of Nathalie Nouaime at Sorbonne Université in December 2024.
- Olivier Le Maître served as reviewer and Chair of the Jury for the PhD defence of Jérémy Briant at INP-Toulouse in November 2024.
- Olivier Le Maître served as President of the Jury for the Habilitation thesis of Frédéric Joly at Université Paris-Saclay in December 2024.
- Olivier Le Maître served as examiner for the PhD defence of Adrien Béguinet at Université Paris-Saclay.
11.2.4 Internal or external Inria responsibilities
- Enora Denimal Goy is an elected member of the "Comité de Centre" of the Inria Saclay research center.
- Enora Denimal Goy is a member of the Inria national committee on gender equality.
- Pietro Marco Congedo is a member of the BCEP (Bureau du comité des équipes-projets) of the Inria Center of Saclay.
- Pietro Marco Congedo participated in the Admission Jury for the DR2 Inria selection, June 6 2024.
- Pietro Marco Congedo coordinated the working group for the creation of the BOOST Inria team (Inria Center of Saclay).
- Pietro Marco Congedo is the coordinator of the Committee for the sustainable development of the Center Inria Saclay Île-de-France.
11.3 Popularization
11.3.1 Specific official responsibilities in science outreach structures
- Pietro Marco Congedo is the Coordinator of "Maths/Engineering" Program of the Labex Mathématiques Hadamard (IPP and Paris-Saclay University), since 2022.
- Olivier Le Maître is a member of the Conseil du Laboratoire du CMAP (Ecole Polytechnique, IPP).
- Olivier Le Maître is the Adjunct Director of the Ecole Doctorale de Mathématiques Hadamard (EDMH).
- Olivier Le Maître is member of math committee of the PhD Track Program of IP-Paris.
- Pietro Marco Congedo is the coordinator of the "Pôle Analyse" of CMAP Lab (Ecole Polytechnique, IPP).
- Olivier Le Maître is the corresponding member of the Inria SIF center with the French Agency for Math and Industry (AMIES).
12 Scientific production
12.1 Major publications
- [1] Holistic characterization of an under-expanded high-enthalpy jet under uncertainty. Physics of Fluids 36(6), June 2024. HAL, DOI.
- [2] Iterative data-driven construction of surrogates for an efficient Bayesian identification of oil spill source parameters from image contours. Computational Geosciences 28(4), May 2024, 681-696. HAL, DOI.
- [3] Bayesian calibration with adaptive model discrepancy. International Journal for Uncertainty Quantification 14(1), 2024, 19-41. HAL, DOI.
- [4] Control of isolated response curves through optimization of codimension-1 singularities. Computers & Structures, April 2024, 1-19. HAL, DOI.
- [5] Accelerating hypersonic reentry simulations using deep learning-based hybridization (with guarantees). Journal of Computational Physics 498, February 2024, 112700. HAL, DOI.
- [6] Friction damping for turbomachinery: A comprehensive review of modelling, design strategies, and testing capabilities. Progress in Aerospace Sciences 147, May 2024, 101018. HAL, DOI.
12.2 Publications of the year
International journals
- [7] Holistic characterization of an under-expanded high-enthalpy jet under uncertainty. Physics of Fluids 36(6), June 2024. HAL, DOI.
- [8] Iterative data-driven construction of surrogates for an efficient Bayesian identification of oil spill source parameters from image contours. Computational Geosciences 28(4), May 2024, 681-696. HAL, DOI.
- [9] Robust Optimization of a Thermal Anti-Ice Protection System in Uncertain Cloud Conditions. Journal of Aircraft 61(1), January 2024, 1-15. HAL, DOI.
- [10] Stochastic mesoscale characterization of ablative materials for atmospheric entry. Applied Mathematical Modelling 135, November 2024, 745-758. HAL, DOI.
- [11] Bayesian calibration with adaptive model discrepancy. International Journal for Uncertainty Quantification 14(1), 2024, 19-41. HAL, DOI.
- [12] Control of isolated response curves through optimization of codimension-1 singularities. Computers & Structures, April 2024, 1-19. HAL, DOI.
- [13] Accelerating hypersonic reentry simulations using deep learning-based hybridization (with guarantees). Journal of Computational Physics 498, February 2024, 112700. HAL, DOI.
- [14] Friction damping for turbomachinery: A comprehensive review of modelling, design strategies, and testing capabilities. Progress in Aerospace Sciences 147, May 2024, 101018. HAL, DOI.
International peer-reviewed conferences
- [15] Artificial Neural Networks for UQ and calibration of one-dimensional arterial hemodynamics. ECCOMAS 2024 - 9th European Congress on Computational Methods in Applied Sciences and Engineering, Lisbon, Portugal, June 2024. HAL.
- [16] Mean adaptive mesh refinement for efficient CFD simulations with operating conditions variability. ECCOMAS 2024 - 9th European Congress on Computational Methods in Applied Sciences and Engineering, Lisbon, Portugal, June 2024. HAL.
- [17] Sampling and Estimating the Set of Pareto Optimal Solutions in Stochastic Multi-Objective Optimization. ECCOMAS 2024 - 9th European Congress on Computational Methods in Applied Sciences and Engineering, Lisbon, Portugal, June 2024. HAL.
- [18] Nonlinear system identification with control-based continuation of bifurcation curves. ENOC 2024 - 11th European Nonlinear Dynamics Conference, Delft, Netherlands, July 2024, 1-2. HAL.
- [19] On the use of bifurcation curves for system identification and model updating purposes. ECCOMAS 2024 - 9th European Congress on Computational Methods in Applied Sciences and Engineering, Lisbon, Portugal, June 2024, 1-1. HAL.
- [20] Structural optimization for controlling isolated response curves. ENOC 2024 - 11th European Nonlinear Dynamics Conference, Delft, Netherlands, July 2024, 1-3. HAL.
National peer-reviewed Conferences
- [21] Contrôle de courbes de réponses isolées par optimisation structurelle. 16ème Colloque National en Calcul de Structures (CSMA 2024), Hyères, France, 2024. HAL.
Conferences without proceedings
- [22] How sailor morphology affects Olympic windfoil performance? Sports Physics 2024, Rennes, France, December 2024. HAL.
Reports & preprints
- [23] Mean Mesh Adaptation for Efficient CFD Simulations with Operating Conditions Variability. Preprint, November 2024. HAL.
- [24] Ensemble Data Assimilation for Particle-based Methods. Preprint, October 2024. HAL.
- [25] Change of Measure for Bayesian Field Inversion with Hierarchical Hyperparameters Sampling. Preprint, 2024. HAL, DOI.
- [26] Preconditioners based on Voronoi quantizers of random variable coefficients for stochastic elliptic partial differential equations. Preprint, March 2024. HAL.
Other scientific publications
- [27] Robust topology optimization accounting for uncertain micro-structural changes. Conférence pour les 50 ans du CMAP, Palaiseau, France, September 2024. HAL.