2023 Activity report
Project-Team AIRSEA
RNSR: 201521159N. Research center: Inria Centre at Université Grenoble Alpes
 In partnership with: Université Grenoble Alpes, CNRS
 Team name: Mathematics and computing applied to oceanic and atmospheric flows
 In collaboration with: Laboratoire Jean Kuntzmann (LJK)
 Domain: Digital Health, Biology and Earth
 Theme: Earth, Environmental and Energy Sciences
Keywords
Computer Science and Digital Science
 A3.1.8. Big data (production, storage, transfer)
 A3.4.1. Supervised learning
 A3.4.2. Unsupervised learning
 A3.4.5. Bayesian methods
 A3.4.6. Neural networks
 A3.4.7. Kernel methods
 A3.4.8. Deep learning
 A6.1.1. Continuous Modeling (PDE, ODE)
 A6.1.2. Stochastic Modeling
 A6.1.4. Multiscale modeling
 A6.1.5. Multiphysics modeling
 A6.2.1. Numerical analysis of PDE and ODE
 A6.2.4. Statistical methods
 A6.2.6. Optimization
 A6.2.7. High performance computing
 A6.3.1. Inverse problems
 A6.3.2. Data assimilation
 A6.3.4. Model reduction
 A6.3.5. Uncertainty Quantification
 A6.4.6. Optimal control
 A6.5.2. Fluid mechanics
 A6.5.4. Waves
Other Research Topics and Application Domains
 B3.2. Climate and meteorology
 B3.3.2. Water: sea & ocean, lake & river
 B3.3.4. Atmosphere
 B3.4.1. Natural risks
 B4.3.2. Hydroenergy
 B4.3.3. Wind energy
 B9.11.1. Environmental risks
1 Team members, visitors, external collaborators
Research Scientists
 Arthur Vidard [Team leader, INRIA, Researcher, from Jul 2023, HDR]
 Laurent Debreu [Team leader, INRIA, Senior Researcher, until Jun 2023, HDR]
 Eugene Kazantsev [INRIA, Researcher]
 Florian Lemarie [INRIA, Researcher]
 Gurvan Madec [CNRS, Researcher, HDR]
 Clémentine Prieur [INRIA, until Aug 2023, HDR]
 Olivier Zahm [INRIA, Researcher]
Faculty Members
 Elise Arnaud [UGA, Associate Professor]
 Éric Blayo [UGA, Professor, HDR]
 Christine Kazantsev [UGA, Associate Professor]
 Clémentine Prieur [UGA, Professor, from Sep 2023, HDR]
 Martin Schreiber [UGA, Associate Professor]
PostDoctoral Fellows
 Alexis Anagostakis [UGA, PostDoctoral Fellow]
 Valentin Breaz [INRIA, PostDoctoral Fellow, from Apr 2023]
 Hugo Brunie [UGA, PostDoctoral Fellow, from Nov 2023]
 Simon Clement [INRIA, PostDoctoral Fellow, from Mar 2023]
 Sanal Parameswaran [INRIA, PostDoctoral Fellow, until Apr 2023]
 Victor Trappler [INRIA, PostDoctoral Fellow, until Nov 2023]
PhD Students
 Adama Barry [IFPEN, CIFRE]
 Rishabh Bhatt [UGA, from Oct 2023]
 Rishabh Bhatt [INRIA, until Sep 2023]
 Qiao Chen [UGA]
 Gabriel Derrida [INRIA, from Oct 2023]
 Clément Duhamel [INRIA]
 Helene Henon [INRIA, from Nov 2023]
 Pierre Lozano [UGA]
 Exauce Luweh Adjim Ngarti [BULL, CIFRE, from Mar 2023]
 Antoine-Alexis Nasser [UGA, from Oct 2023 until Nov 2023]
 Antoine-Alexis Nasser [CNRS, from Feb 2023 until Sep 2023]
 Antoine-Alexis Nasser [INRIA / UGA / CNRS, until Jan 2023]
 Manolis Perrot [UGA]
 Katarina Radisic [INRAE]
 Emilie Rouzies [INRIA, until Feb 2023]
 Angelique Saillet [UGA, from Oct 2023]
 Robin Vaudry [CNRS]
 Romain Verdiere [INRIA]
 Ri Wang [CSC Scholarship]
 Benjamin Zanger [INRIA]
Technical Staff
 Céline Acary Robert [UGA, Engineer]
 Valentin Breaz [INRIA, Engineer, until Mar 2023]
 Maurice Brémond [Inria, Engineer]
 Gabriel Derrida [INRIA, Engineer, until Apr 2023]
 Hugues Lascombes De Laroussilhe [INRIA, Engineer, until Oct 2023]
 Sebastien Valat [INRIA, Engineer]
Interns and Apprentices
 Cheikh Gaye Coundoul [INRIA, Intern, from May 2023 until Jun 2023]
 Gabriel Derrida [INRIA, from May 2023 until Sep 2023]
 Helene Henon [INRIA, from Oct 2023 until Oct 2023]
 Gabriel Mouttapa [INRIA, Intern, from Apr 2023 until Oct 2023]
 Aime Tresor Ndayongeje [INRIA, Intern, from May 2023 until Jun 2023]
 Emma Ninucci [INRIA, Intern, from Mar 2023 until Jul 2023]
 Julien Remy [INRIA, Intern, from Jun 2023 until Sep 2023]
 Valentina Schuller [INRIA, Intern, until Apr 2023]
 Karen Tonini Dos Santos [UGA, from Feb 2023 until Jul 2023]
Administrative Assistant
 Luce Coelho [INRIA, from Sep 2023]
Visiting Scientists
 Joao Caldas Steinstraesser [UNIV SAO PAULO, from Mar 2023 until Mar 2023]
 Arthur Campos [Univ. Sao Paulo, from Sep 2023]
 Tiangang Cui [UNIV MONASH, from Jun 2023 until Aug 2023]
 Jose Rafael Leon Ramos [Universidad Central de Venezuela, until Jan 2023]
 Sergi-Enric Siso Godia [HARTREE CENTRE STFC, from Feb 2023 until Feb 2023]
External Collaborators
 Rémi Druilhe [ATOS, until Nov 2023]
 Lionel Vincent [ATOS, until Nov 2023]
 François Wellenreiter [BULL, until Nov 2023]
2 Overall objectives
The general scope of the AIRSEA project-team is to develop mathematical and computational methods for the modeling of oceanic and atmospheric flows. The mathematical tools used involve both deterministic and statistical approaches. The main research topics cover (a) modeling and coupling, (b) model reduction for sensitivity analysis, coupling and multiscale optimization, (c) sensitivity analysis, parameter estimation and risk assessment, and (d) algorithms for high performance computing. The range of applications extends from climate modeling to the prediction of extreme events.
3 Research program
Recent events have raised questions regarding the social and economic implications of anthropogenic alterations of the Earth system, i.e. climate change and the associated risks of increasingly frequent extreme events. The ocean and the atmosphere, coupled with other components (continents and ice), are the building blocks of the Earth system. A better understanding of the ocean-atmosphere system is a key ingredient for improving the prediction of such events. Numerical models are essential tools to understand processes, and to simulate and forecast events at various space and time scales. Geophysical flows generally have a number of characteristics that make them difficult to model, which justifies the development of specifically adapted mathematical methods:
 Geophysical flows are strongly nonlinear. Therefore, they exhibit interactions between different scales, and unresolved small scales (smaller than mesh size) of the flows have to be parameterized in the equations.
 Geophysical fluids are non-closed systems. They are open-ended in their scope for including and dynamically coupling different physical processes (e.g., atmosphere, ocean, continental water, etc.). Coupling algorithms are thus of primary importance to account for potentially significant feedbacks.
 Numerical models contain parameters which cannot be estimated accurately either because they are difficult to measure or because they represent some poorly known subgrid phenomena. There is thus a need for dealing with uncertainties. This is further complicated by the turbulent nature of geophysical fluids.
 The computational cost of geophysical flow simulations is huge, thus requiring the use of reduced models, multiscale methods and the design of algorithms ready for high performance computing platforms.
Our scientific objectives are divided into four major points. The first objective focuses on developing advanced mathematical methods for both the ocean and atmosphere, and the coupling of these two components. The second objective is to investigate the derivation and use of model reduction to face problems associated with the numerical cost of our applications. The third objective is directed toward the management of uncertainty in numerical simulations. The last objective deals with efficient numerical algorithms for new computing platforms. As mentioned above, the targeted applications cover oceanic and atmospheric modeling and related extreme events using a hierarchy of models of increasing complexity.
3.1 Modeling for oceanic and atmospheric flows
Current numerical oceanic and atmospheric models suffer from a number of well-identified problems. These problems are mainly related to lack of horizontal and vertical resolution, thus requiring the parameterization of unresolved (subgrid scale) processes and control of discretization errors in order to fulfill criteria related to the particular underlying physics of rotating and strongly stratified flows. Oceanic and atmospheric coupled models are increasingly used in a wide range of applications from global to regional scales. Assessment of the reliability of those coupled models is an emerging topic as the spread among the solutions of existing models (e.g., for climate change predictions) has not been reduced with the new generation models when compared to the older ones.
Advanced methods for modeling 3D rotating and stratified flows The continuous increase of computational power and the resulting finer grid resolutions have triggered renewed interest in numerical methods and their relation to physical processes. Going beyond present knowledge requires a better understanding of numerical dispersion/dissipation ranges and their connection to model fine scales. Removing the leading-order truncation error of numerical schemes is thus an active topic of research, and each mathematical tool has to adapt to the characteristics of three-dimensional stratified and rotating flows. Studying the link between discretization errors and subgrid scale parameterizations is also arguably one of the main challenges.
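The kind of dispersion analysis mentioned above can be illustrated with a modified wavenumber computation. The sketch below (an illustrative textbook analysis, not AIRSEA code) compares the effective wavenumber of 2nd- and 4th-order centered finite differences for the advection operator d/dx:

```python
import numpy as np

# Modified wavenumber analysis: the effective wavenumber k* "seen" by a
# centered finite-difference approximation of d/dx, as a function of the
# exact non-dimensional wavenumber k*h in [0, pi].
kh = np.linspace(0.0, np.pi, 200)
k2 = np.sin(kh)                                     # 2nd-order centered scheme
k4 = (8.0 * np.sin(kh) - np.sin(2.0 * kh)) / 6.0    # 4th-order centered scheme

# Both schemes are non-dissipative but dispersive: k* falls below the exact
# wavenumber at poorly resolved scales, which then propagate too slowly.
```

The 4th-order curve stays close to the exact wavenumber over a much wider range, while both schemes let the smallest resolved scales propagate too slowly; quantifying such ranges is what connects scheme design to subgrid-scale behavior.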
Complexity of the geometry, boundary layers, strong stratification and lack of resolution are the main sources of discretization errors in the numerical simulation of geophysical flows. This emphasizes the importance of the definition of the computational grids (and coordinate systems) both in horizontal and vertical directions, and the necessity of truly multi-resolution approaches. At the same time, the role of the small scale dynamics on large scale circulation has to be taken into account. Such parameterizations may be of deterministic as well as stochastic nature and both approaches are taken by the AIRSEA team. The design of numerical schemes consistent with the parameterizations is also arguably one of the main challenges for the coming years. This work is complementary and linked to that on parameters estimation described in 3.3.
Ocean-atmosphere interactions and formulation of coupled models State-of-the-art climate models (CMs) are complex systems under continuous development. A fundamental aspect of climate modeling is the representation of air-sea interactions. This covers a large range of issues: parameterizations of atmospheric and oceanic boundary layers, estimation of air-sea fluxes, time-space numerical schemes, non-conforming grids, coupling algorithms... Many developments related to these different aspects were performed over the last 10-15 years, but were in general conducted independently of each other.
The aim of our work is to revisit and enrich several aspects of the representation of air-sea interactions in CMs, paying special attention to their overall consistency with appropriate mathematical tools. We intend to work consistently on the physics and the numerics. Using the theoretical framework of global-in-time Schwarz methods, our aim is to analyze the mathematical formulation of the parameterizations from a coupling perspective. From this study, we expect improved predictability in coupled models (this aspect will be studied using techniques described in 3.3). Complementary work on space-time non-conformities and on the acceleration of convergence of Schwarz-like iterative methods (see 8.1.2) is also conducted.
3.2 Model reduction / multiscale algorithms
The high computational cost of the applications is a common and major concern to have in mind when deriving new methodological approaches. This cost increases dramatically with the use of sensitivity analysis or parameter estimation methods, and more generally with methods that require a potentially large number of model integrations.
Dimension reduction, using either stochastic or deterministic methods, is a way to significantly reduce the number of degrees of freedom, and therefore the computation time, of a numerical model.
Model reduction Reduction methods can be deterministic (proper orthogonal decomposition, other reduced bases) or stochastic (polynomial chaos, Gaussian processes, kriging), and both fields of research are very active. Choosing one method over another strongly depends on the targeted application, which can be as varied as real-time computation, sensitivity analysis (see e.g., section 8.4) or optimisation for parameter estimation (see below).
Our goals are multiple, but they share a common need for certified error bounds on the output. Our team has a 4-year history of working on certified reduction methods and has a unique positioning at the interface between deterministic and stochastic approaches. Thus, it seems interesting to conduct a thorough comparison of the two alternatives in the context of sensitivity analysis. Efforts will also be directed toward the development of efficient greedy algorithms for the reduction, and the derivation of goal-oriented sharp error bounds for nonlinear models and/or nonlinear outputs of interest. This will be complementary to our work on the deterministic reduction of parametrized viscous Burgers and Shallow Water equations, where the objective is to obtain sharp error bounds to provide confidence intervals for the estimation of sensitivity indices.
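On the deterministic side, a minimal proper orthogonal decomposition (POD) shows where certified bounds can come from: by the Eckart-Young theorem, the energy of the discarded singular values of the snapshot matrix equals the Frobenius projection error. The snapshot data below is synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
# Hypothetical snapshot matrix (200 grid points x 50 snapshots): two smooth
# modes plus small noise, standing in for states of a parametrized PDE.
snap = np.array([np.sin(np.pi * x) * np.cos(0.1 * t)
                 + 0.3 * np.sin(3 * np.pi * x) * np.sin(0.2 * t)
                 for t in range(50)]).T
snap += 0.01 * rng.standard_normal(snap.shape)

U, s, _ = np.linalg.svd(snap, full_matrices=False)
r = 2                                        # number of retained POD modes
basis = U[:, :r]
# Eckart-Young: the Frobenius projection error equals the energy of the
# discarded singular values, giving a certified a priori error bound.
bound = np.sqrt(np.sum(s[r:] ** 2))
recon = basis @ (basis.T @ snap)             # rank-r reconstruction
err = np.linalg.norm(snap - recon)
```

Here the bound is attained exactly because POD projection is the best rank-r approximation; for nonlinear models and outputs, obtaining comparably sharp bounds is precisely the open problem described above.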
Reduced models for coupling applications Global and regional high-resolution oceanic models are either coupled to an atmospheric model or forced at the air-sea interface by fluxes computed empirically, preventing proper physical feedback between the two media. Thanks to high-resolution observational studies, the existence of air-sea interactions at oceanic mesoscales (i.e., at $\mathcal{O}(1\,\mathrm{km})$ scales) has been unambiguously shown. Those interactions can be represented in coupled models only if the oceanic and atmospheric models are run on the same high-resolution computational grid, and are absent in a forced mode. Fully coupled models at high resolution are seldom used because of their prohibitive computational cost. The derivation of a reduced model as an alternative between a forced mode and the use of a full atmospheric model is an open problem.
Multiphysics coupling often requires iterative methods to obtain a mathematically correct numerical solution. To mitigate the cost of the iterations, we will investigate the possibility of using reduced-order models for the iterative process. We will consider different ways of deriving a reduced model: coarsening of the resolution, degradation of the physics and/or numerical schemes, or simplification of the governing equations. At the mathematical level, we will strive to study the well-posedness and the convergence properties when reduced models are used. Indeed, running an atmospheric model at the same resolution as the ocean model is generally too expensive to be manageable, even for moderate-resolution applications. To account for important fine-scale interactions in the computation of the air-sea boundary condition, the objective is to derive a simplified boundary layer model that is able to represent important 3D turbulent features in the marine atmospheric boundary layer.
Reduced models for multiscale optimization The field of multigrid methods for optimisation has seen tremendous development over the past few decades. However, it has not been applied to oceanic and atmospheric problems, apart from some crude (non-converging) approximations or applications to simplified and low-dimensional models. This is mainly due to the high complexity of such models and to the difficulty of handling several grids at the same time. Moreover, due to complex boundaries and physical phenomena, the grid interactions and transfer operators are not trivial to define.
Multigrid solvers (or multigrid preconditioners) are efficient methods for the solution of variational data assimilation problems. We would like to take advantage of these methods to tackle the optimization problem in high dimensional space. High dimensional control space is obtained when dealing with parameter fields estimation, or with control of the full 4D (space time) trajectory. It is important since it enables us to take into account model errors. In that case, multigrid methods can be used to solve the large scales of the problem at a lower cost, this being potentially coupled with a scale decomposition of the variables themselves.
3.3 Dealing with uncertainties
There are many sources of uncertainties in numerical models. They are due to imperfect external forcing, poorly known parameters, missing physics and discretization errors. Studying these uncertainties and their impact on the simulations is a challenge, mostly because of the high dimensionality and nonlinear nature of the systems. To deal with these uncertainties we work on three axes of research, which are linked: sensitivity analysis, parameter estimation and risk assessment. They are based on either stochastic or deterministic methods.
Sensitivity analysis Sensitivity analysis (SA), which links uncertainty in the model inputs to uncertainty in the model outputs, is a powerful tool for model design and validation. First, it can be a preliminary stage for parameter estimation (see 3.3), allowing for the selection of the more significant parameters. Second, SA permits understanding and quantifying (possibly nonlinear) interactions induced by the different processes defining e.g., realistic ocean-atmosphere models. Finally, SA allows for the validation of models, checking that the estimated sensitivities are consistent with what is expected by the theory. On ocean, atmosphere and coupled systems, only first-order deterministic SA is performed, neglecting the initialization process (data assimilation). AIRSEA members and collaborators proposed to use second-order information to provide consistent sensitivity measures, but so far this has only been applied to simple academic systems. Metamodels are now commonly used, due to the cost induced by each evaluation of complex numerical models: mostly Gaussian processes, whose probabilistic framework allows for the development of specific adaptive designs, and polynomial chaos, not only in the context of intrusive Galerkin approaches but also in a black-box approach. Until recently, global SA was based primarily on a set of engineering practices. New mathematical and methodological developments have led to the numerical computation of Sobol' indices, with confidence intervals accounting for both metamodel and estimation errors. Approaches have also been extended to the case of dependent inputs, functional inputs and/or outputs, and stochastic numerical codes. Other types of indices and generalizations of Sobol' indices have also been introduced.
Concerning the stochastic approach to SA, we plan to work with parameters that show spatiotemporal dependencies and to continue toward more realistic applications where the input space is of huge dimension with highly correlated components. Sensitivity analysis for dependent inputs also introduces new challenges. In our applicative context, it would seem prudent to carefully learn the spatiotemporal dependencies before running a global SA. In the deterministic framework we focus on second-order approaches where the sought sensitivities are related to the optimality system rather than to the model; i.e., we consider the whole forecasting system (model plus initialization through data assimilation).
All these methods allow for computing sensitivities and more importantly a posteriori error statistics.
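As an illustration of how Sobol' indices are computed in a black-box fashion, here is a minimal pick-freeze (Saltelli-type) Monte Carlo estimator of first-order indices; the toy model Y = X1 + 2 X2, with known indices S1 = 0.2 and S2 = 0.8, is a hypothetical stand-in for an expensive simulator:

```python
import numpy as np

def sobol_first_order(model, d, n=100_000, seed=0):
    """Pick-freeze (Saltelli-type) Monte Carlo estimate of first-order Sobol'
    indices for a vectorized model with d independent U(0,1) inputs."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA, yB = model(A), model(B)
    var = np.concatenate([yA, yB]).var()     # total output variance
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                  # "freeze" all inputs except the i-th
        S[i] = np.mean(yB * (model(ABi) - yA)) / var
    return S

# Hypothetical toy model Y = X1 + 2*X2: Var(Y) = 5/12, S1 = 0.2, S2 = 0.8.
S = sobol_first_order(lambda X: X[:, 0] + 2.0 * X[:, 1], d=2)
```

Each index costs one extra batch of model runs, which is exactly why the metamodels discussed above are used in place of the full simulator.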
Parameter estimation Advanced parameter estimation methods are barely used in ocean, atmosphere and coupled systems, mostly due to the difficulty of deriving adequate response functions, a lack of knowledge of these methods in the ocean-atmosphere community, and also to the huge associated computing costs. In the presence of strong uncertainties on the model but also on parameter values, simulation and inference are closely associated. Filtering for data assimilation and Approximate Bayesian Computation (ABC) are two examples of such an association.
The stochastic approach can be compared with the deterministic approach, which makes it possible to determine the sensitivity of the flow to parameters and to optimize their values using data assimilation. This approach has already been shown to be capable of selecting a reduced space of the most influential parameters in the local parameter space and of adapting their values so as to correct errors committed by the numerical approximation. It relies on automatic differentiation of the source code with respect to the model parameters, and optimization of the resulting raw code.
AIRSEA assembles all the required expertise to tackle these difficulties. As mentioned previously, the choice of parameterization schemes and their tuning has a significant impact on the result of model simulations. Our research will focus on parameter estimation for parameterized Partial Differential Equations (PDEs) and also for parameterized Stochastic Differential Equations (SDEs). Deterministic approaches are based on optimal control methods and are local in the parameter space (i.e., the result depends on the starting point of the estimation), but thanks to adjoint methods they can cope with a large number of unknowns that can also vary in space and time. Multiscale optimization techniques as described in 8.2 will be one of the tools used. This in turn can be used either to propose a better (and smaller) parameter set or as a criterion for discriminating between parameterization schemes. Statistical methods are global in the parameter space but may suffer from the curse of dimensionality. However, the notion of parameter can also be extended to functional parameters. We may consider as a parameter a functional entity such as a time-dependent boundary condition, or a probability density function in a stationary regime. For these purposes, nonparametric estimation will also be considered as an alternative.
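A minimal sketch of the deterministic route: recover a scalar decay-rate parameter of a toy ODE by gradient descent on the observation misfit. The gradient is written analytically here; in a realistic model it would come from automatic differentiation / adjoint code, and the control could be a full space-time field (all values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 20)
a_true, y0 = 1.3, 1.0
# Synthetic observations of y(t) = y0*exp(-a*t) with small noise.
obs = y0 * np.exp(-a_true * t) + 0.01 * rng.standard_normal(t.size)

def cost_and_grad(a):
    """Misfit J(a) = 0.5*sum (y(a)-obs)^2 and its analytic gradient dJ/da
    (the stand-in for what an adjoint code would provide)."""
    y = y0 * np.exp(-a * t)
    r = y - obs
    return 0.5 * np.sum(r**2), np.sum(r * (-t * y))

a = 0.5                                   # first guess in parameter space
for _ in range(2000):
    J, g = cost_and_grad(a)
    a -= 0.1 * g                          # fixed-step gradient descent
```

The estimate is local: a different first guess could land in a different minimum for a less well-behaved cost function, which is why multiscale optimization and global statistical methods are considered as complements.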
Risk assessment Risk assessment in the multivariate setting suffers from a lack of consensus on the choice of indicators. Moreover, once the indicators are designed, it still remains to develop estimation procedures that are efficient even for high risk levels. Recent developments for the assessment of financial risk have to be considered with caution, as methods suited to general financial decisions may differ from those suited to environmental risk assessment. Modeling and quantifying uncertainties related to extreme events is of central interest in environmental sciences. In relation to our scientific targets, risk assessment is very important in several areas: hydrological extreme events, cyclone intensity, storm surges... Environmental risks most of the time involve several aspects which are often correlated. Moreover, even in the ideal case where the focus is on a single risk source, we have to face the temporal and spatial nature of environmental extreme events. The study of extremes within a spatiotemporal framework remains an emerging field where the development of adapted statistical methods could lead to major progress in terms of geophysical understanding and risk assessment, coupling data and model information.
Based on the above considerations we aim to answer the following scientific questions: how to measure risk in a multivariate/spatial framework? How to estimate risk in a non-stationary context? How to reduce dimension (see 3.2) for a better estimation of spatial risk?
Extreme events are rare, which means there is little data available to infer risk measures. Risk assessment based on observations therefore relies on multivariate extreme value theory. Interacting particle systems for the analysis of rare events are commonly used in the community of computer experiments. An open question is the pertinence of such tools for the evaluation of environmental risk.
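A standard building block here is the peaks-over-threshold approach: exceedances over a high threshold are approximately generalized-Pareto distributed, and the fitted tail can be extrapolated to return levels beyond the observed range. The numpy-only sketch below uses a simple method-of-moments fit on synthetic unit-exponential data (illustrative only; a real analysis would use maximum likelihood and threshold-choice diagnostics):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.exponential(size=50_000)       # synthetic record of (say) surge intensities

u = 2.0                                    # high threshold, chosen by diagnostics in practice
exc = data[data > u] - u                   # peaks over threshold
m, v = exc.mean(), exc.var()
xi = 0.5 * (1.0 - m**2 / v)                # method-of-moments GPD shape estimate
sigma = m * (1.0 - xi)                     # ... and scale estimate

def gpd_quantile(p, xi, sigma):
    """Quantile of the generalized Pareto distribution (location 0)."""
    if abs(xi) < 1e-12:
        return -sigma * np.log1p(-p)       # exponential limit as the shape -> 0
    return sigma / xi * ((1.0 - p) ** (-xi) - 1.0)

T = 10_000                                 # return period, in number of observations
p_u = exc.size / data.size                 # empirical threshold-exceedance probability
ret_level = u + gpd_quantile(1.0 - 1.0 / (T * p_u), xi, sigma)
```

For unit-exponential data the true shape parameter is ξ = 0 and the 10 000-observation return level is ln(10 000) ≈ 9.2, which the extrapolation should approximately recover.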
Most numerical models are unable to accurately reproduce extreme events. There is therefore a real need to develop efficient assimilation methods for the coupling of numerical models and extreme data.
3.4 High performance computing
Methods for sensitivity analysis, parameter estimation and risk assessment are extremely costly due to the necessary number of model evaluations. This number of simulations, which depends on the complexity of the application, the number of input variables and the desired quality of the approximations, requires considerable computational resources. To this end, the AIRSEA team is an intensive user of HPC computing platforms, particularly grid computing platforms. The associated grid deployment has to handle the scheduling of a huge number of computational requests, and the data management between these requests, as automatically as possible. In addition, there is an increasing need for efficient numerical algorithms specifically designed for new (or future) computing architectures, and this is part of our scientific objectives. Given the computational cost of our applications, the evolution of high performance computing platforms has to be taken into account for several reasons. While our applications are able to exploit spatial parallelism to its full extent (oceanic and atmospheric models are traditionally based on a spatial domain decomposition method), the spatial discretization step size limits the efficiency of traditional parallel methods. Thus the inherent parallelism is modest, particularly for the case of relatively coarse resolutions with very long integration times (e.g., climate modeling). Paths toward new programming paradigms are thus needed. As a step in that direction, we plan to focus our research on parallel-in-time methods.
New numerical algorithms for high performance computing Parallel-in-time methods can be classified into three main groups. In the first group, we find methods using parallelism across the method, such as parallel integrators for ordinary differential equations. The second group considers parallelism across the problem. Falling into this category are methods such as waveform relaxation, where the space-time system is decomposed into a set of subsystems which can then be solved independently using some form of relaxation technique, or multigrid reduction in time. The third group of methods focuses on parallelism across the steps. One of the best known algorithms in this family is parareal. Other methods combining the strengths of those listed above (e.g., PFASST) are currently under investigation in the community.
Parallel-in-time methods are iterative methods that may require a large number of iterations before convergence. Our first focus will be on the convergence analysis of parallel-in-time (parareal / Schwarz) methods for the equation systems of oceanic and atmospheric models. Our second objective will be the construction of fast (approximate) integrators for these systems. This part is naturally linked to the model reduction methods of section 8.2.1. Fast approximate integrators are required both in the Schwarz algorithm (where a first guess of the boundary conditions is required) and in the parareal algorithm (where the fast integrator is used to connect the different time windows). Our main application of these methods will be climate (i.e., very long time) simulations. Our second application of parallel-in-time methods will be in the context of optimization methods. In fact, one of the major drawbacks of the optimal control techniques used in 3.3 is a lack of intrinsic parallelism in comparison with ensemble methods. Here, parallel-in-time methods also offer a route to better efficiency. The key mathematical point is how to efficiently couple two iterative methods (i.e., parallel-in-time and optimization methods).
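The predictor-corrector structure of parareal can be written in a few lines: a cheap coarse propagator G runs serially across the time windows, the expensive fine propagator F is applied to all windows in parallel, and the update G(new) + F(old) - G(old) is swept forward. The toy problem in the sketch (dy/dt = -y, with G a single Euler step and F the exact flow; illustrative only) recovers the fine solution exactly once the iteration count reaches the number of windows:

```python
import numpy as np

def parareal(f_coarse, f_fine, y0, n_windows, iters):
    """Parareal over n_windows time slices: cheap coarse propagator predicts,
    expensive fine propagator corrects (parallelizable across windows)."""
    y = [y0]
    for n in range(n_windows):                # serial coarse prediction
        y.append(f_coarse(y[n]))
    for _ in range(iters):
        fine = [f_fine(y[n]) for n in range(n_windows)]   # parallel in practice
        y_new = [y0]
        for n in range(n_windows):
            # predictor-corrector update: G(new) + F(old) - G(old)
            y_new.append(f_coarse(y_new[n]) + fine[n] - f_coarse(y[n]))
        y = y_new
    return y

# Toy: dy/dt = -y on [0,1], 10 windows; G = one Euler step, F = exact flow.
y = parareal(lambda y: (1 - 0.1) * y, lambda y: y * np.exp(-0.1),
             1.0, n_windows=10, iters=10)
```

In practice one stops after far fewer iterations than windows, once the correction stalls below the discretization error; the speed-up then hinges on how cheap the coarse (possibly reduced) propagator can be made.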
4 Application domains
The OceanAtmosphere System
The evolution of natural systems, in the short, mid, or long term, has extremely important consequences for both the global Earth system and humanity. Forecasting this evolution is thus a major challenge from the scientific, economic, and human viewpoints.
Humanity has to face the problem of global warming, brought on by the emission of greenhouse gases from human activities. This warming will probably cause huge changes at global and regional scales, in terms of climate, vegetation and biodiversity, with major consequences for local populations. Research has therefore been conducted over the past 15 to 20 years in an effort to model the Earth's climate and forecast its evolution in the 21st century in response to anthropogenic action.
With regard to short-term forecasts, the best and oldest example is of course weather forecasting. Meteorological services have been providing daily short-term forecasts for several decades, which are of crucial importance for numerous human activities.
Numerous other problems can also be mentioned, like seasonal weather forecasting (to enable powerful phenomena like an El Niño event or a drought period to be anticipated a few months in advance), operational oceanography (short-term forecasts of the evolution of the ocean system to provide services for the fishing industry, ship routing, defense, or the fight against marine pollution) or the prediction of floods.
As mentioned previously, mathematical and numerical tools are omnipresent and play a fundamental role in these areas of research. In this context, the vocation of AIRSEA is not to carry out numerical prediction, but to address mathematical issues raised by the development of prediction systems for these application fields, in close collaboration with geophysicists.
5 Social and environmental responsibility
Most of the research activities of the AIRSEA team are directed towards the improvement of numerical systems for the ocean and the atmosphere. This includes the development of appropriate numerical methods, model/parameter calibration using observational data, and uncertainty quantification for decision making. The AIRSEA team members work in close collaboration with researchers in the field of geophysical fluid dynamics and are partners in several interdisciplinary projects. They also strongly contribute to the development of state-of-the-art numerical systems, like NEMO and CROCO in the ocean community.
6 Highlights of the year
6.1 Outreach
 Clémentine Prieur was selected as one of the ambassadors for "La Science taille XX elles" in Grenoble: xxlgrenoble.sciencesconf.org
 The AIRSEA team organized the workshop "Partial Differential Equations on the sphere", a workshop dedicated to numerical solution techniques, on high-performance computer architectures, for the partial differential equations arising in weather, climate and ocean simulations.
 2023 was a year of evaluation for our project-team. This evaluation, coordinated by Inria's Evaluation Committee, is an important step in the team's life cycle. In particular, it enabled us to reflect collectively on the future of all teams involved in the environmental sciences theme and beyond. The subsequent foresight seminar was an opportunity to discuss all this among ourselves and with external experts. To present some of these reflections, we produced a summary, which raises more questions than it answers, but which reflects the topic's complexity and the debate that has opened up within the institute itself.
6.2 Awards
 A paper coauthored by F. Lemarié, entitled “Reconciling and Improving Formulations for Thermodynamics and Conservation Principles in Earth System Models (ESMs)”, has been awarded the 2023 UCAR Outstanding Accomplishment Award for Publication. The award was presented by the University Corporation for Atmospheric Research in November 2023.
 A poster coauthored by M. Schreiber on exponential time integration for the wave equation [36] received the Best Poster Award.
7 New software, platforms, open data
7.1 New software
7.1.1 AGRIF

Name:
Adaptive Grid Refinement In Fortran

Keyword:
Mesh refinement

Scientific Description:
AGRIF is a Fortran 90 package for the integration of full adaptive mesh refinement (AMR) features within a multidimensional finite difference model written in Fortran. Its main objective is to simplify the integration of AMR potentialities within an existing model with minimal changes. Capabilities of this package include the management of an arbitrary number of grids, horizontal and/or vertical refinements, dynamic regridding, parallelization of the grids interactions on distributed memory computers. AGRIF requires the model to be discretized on a structured grid, like it is typically done in ocean or atmosphere modelling.

Functional Description:
AGRIF is a Fortran 90 package for the integration of full adaptive mesh refinement (AMR) features within a multidimensional finite-difference model written in Fortran. Its main objective is to simplify the integration of AMR capabilities into an existing model with minimal changes. The package can manage an arbitrary number of grids, horizontal and/or vertical refinements, dynamic regridding, and the parallelization of grid interactions on distributed-memory computers. AGRIF requires the model to be discretized on a structured grid, as is typically the case in ocean or atmosphere modelling.

News of the Year:
Within the framework of a European Copernicus contract, improvements have been made to the management of parallelization (assignment of processors to computational grids).
 URL:
 Publications:

Contact:
Laurent Debreu

Participant:
Laurent Debreu
7.1.2 NEMOVAR

Name:
Variational data assimilation for NEMO

Keywords:
Oceanography, Data assimilation, Adjoint method, Optimal control

Functional Description:
NEMOVAR is a state-of-the-art multi-incremental variational data assimilation system with both 3D-Var and 4D-Var capabilities, designed to work with NEMO on its native ORCA grids. The background error covariance matrix is modelled using balance operators for the multivariate component and a diffusion operator for the univariate component. It can also be formulated as a linear combination of covariance models to take into account multiple correlation length scales associated with ocean variability on different scales. NEMOVAR has recently been enhanced with ensemble data assimilation and multigrid assimilation capabilities. It is used operationally at both ECMWF and the Met Office (UK).

Contact:
Arthur Vidard

Partners:
CERFACS, ECMWF, Met Office
7.1.3 SWEET

Name:
Shallow Water Equation Environment for Tests, Awesome!

Keywords:
High-Performance Computing, Time integration methods

Functional Description:
SWEET supports periodic boundary conditions for:
* the bi-periodic plane (2D torus)
* the sphere
Space discretization:
* PLANE: spectral methods based on Fourier space
* PLANE: finite differences
* SPHERE: spherical harmonics
Time discretization:
* explicit Runge-Kutta
* implicit Runge-Kutta
* leapfrog
* Crank-Nicolson
* semi-Lagrangian
* parallel-in-time: Parareal, PFASST, rational approximation of exponential integrators (REXI), and many more time steppers
Special features:
* graphical user interface
* fast Helmholtz solver in spectral space
* easy to code in C++
Supported applications:
* shallow-water equations on plane/sphere
* advection
* Burgers' equation
 URL:

Contact:
Martin Schreiber

Partners:
University of São Paulo, Technical University of Munich (TUM)
8 New results
8.1 Modeling for Oceanic and Atmospheric flows
8.1.1 Numerical Schemes for Ocean Modelling
Participants: Eric Blayo, Laurent Debreu, Florian Lemarié, Gurvan Madec, Antoine Nasser, Pierre Lozano.
Dealing with complex geometries
Accurate and stable implementation of bathymetry boundary conditions in ocean models remains a challenging problem. Generalized terrain-following coordinates are often used in ocean models, but they require smoothing the bathymetry to reduce pressure gradient errors. Geopotential z-coordinates are a common alternative that avoids pressure gradient and numerical diapycnal diffusion errors, but they generate spurious flow due to their “staircase” geometry. In 44, we introduce a new Brinkman volume penalization to approximate the no-slip boundary condition and the complex geometry of bathymetry in ocean models. This approach corrects the staircase effect of z-coordinates, does not introduce any new stability constraint on the geometry of the bathymetry, and is easy to implement in an existing ocean model. The porosity parameter allows the modelling of subgrid-scale details of the geometry. As an illustration, through the use of penalization methods, the Gulf Stream detachment is correctly represented in a 1/8° simulation (see Figure 1). These new results on realistic applications have been published in 45. This opens the door to a clear improvement of climate models, in which a good representation of this mechanism is essential. This work has been extended to z-coordinate ocean models through the PhD work of A. Nasser 21. We have also investigated the representation of coastlines and its sensitivity to the mesh orientation in 7.
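The principle of volume penalization can be sketched in one dimension (a deliberately minimal toy, not the CROCO implementation): a momentum-like equation is integrated on a grid that does not fit the "solid" region, and a relaxation term -(χ/ε)u drives the solution to zero inside the mask, mimicking a no-slip condition. All names and parameter values below are illustrative.

```python
import numpy as np

# Toy Brinkman volume penalization: du/dt = nu*d2u/dx2 + f - (chi/eps)*u,
# where chi = 1 inside the masked ("solid") region. The penalization term
# enforces u ~ 0 in the solid without fitting the grid to the geometry.

nx = 200
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
nu, eps = 1e-3, 1e-4            # diffusivity and penalization parameter
chi = (x > 0.7).astype(float)   # mask: solid region (e.g. bathymetry)
f = 1.0 - chi                   # constant forcing in the fluid region only

u = np.zeros(nx)
dt = 0.2 * min(dx**2 / nu, eps) # explicit stability for both terms
for _ in range(50000):
    lap = np.zeros(nx)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    u += dt * (nu * lap + f - (chi / eps) * u)
    u[0] = u[-1] = 0.0          # outer Dirichlet boundaries

# The penalized solution stays near zero in the solid, smooth in the fluid.
print(np.abs(u[chi == 1]).max(), np.abs(u).max())
```

Decreasing ε tightens the approximation of the no-slip condition, at the price of a stiffer (here explicitly time-stepped) relaxation term; treating the penalization implicitly removes that constraint.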
Beyond the hydrostatic assumption
With the increase of resolution, the hydrostatic assumption becomes less valid, and the AIRSEA group also works on the development of non-hydrostatic ocean models. The treatment of non-hydrostatic incompressible flows leads to a 3D elliptic system for pressure that can be ill-conditioned, in particular with non-geopotential vertical coordinates. That is why we favor the use of the non-hydrostatic compressible equations, which remove the need for a 3D elliptic solve at the price of reintroducing acoustic waves. For that purpose, a detailed analysis of acoustic-gravity waves in a free-surface compressible and stratified ocean was carried out 37, in part in the PhD of E. Duval. The proposed numerical approach has been implemented in the CROCO ocean model and tested in various flow configurations 56, 66.
Most large-scale ocean models are based on the so-called “primitive equations”, which use the hydrostatic and incompressibility assumptions. However, with the increase of resolution, a systematic use of the hydrostatic assumption becomes less valid. The French regional oceanic modeling system CROCO (Coastal and Regional Ocean COmmunity model), developed over recent years, allows for the use of either the hydrostatic incompressible (HI) equations or the non-hydrostatic compressible (NHC) equations, the latter being much more computationally expensive. A natural idea is thus to limit the use of the NHC version to particular regions of interest where the hydrostatic assumption is not relevant, and to nest such local NHC zooms within a larger model using the HI version. However, such a coupling is quite delicate from a mathematical point of view, due to the different nature of the hydrostatic and non-hydrostatic equations (in which the vertical velocity is respectively a diagnostic or a prognostic variable). Following E. Duval's first work during her PhD, P. Lozano is working on the design of methods to couple local non-hydrostatic models to larger-scale hydrostatic ones. An interesting lead consists in exploiting a decomposition of the solutions into vertical modes, possibly in association with Perfectly Matched Layer techniques. These first ideas are currently being tested in a prototype which allows for analytical solutions in simplified configurations.
Figure 1: Mean sea surface height in CROCO simulations at $1/{4}^{\circ}$, $1/{8}^{\circ}$ and $1/{12}^{\circ}$ resolutions, for the standard case with terrain-following ($\sigma $) coordinates (top) and with penalization (middle). The third row shows the AVISO product for comparison.
8.1.2 Coupling Methods for Oceanic and Atmospheric Models and Representation of the Air-Sea Interface
Participants: Eric Blayo, Simon Clément, Hugues Lascombes De Laroussilhe, Florian Lemarié, Valentina Schüller.
The Airsea team is involved in the modeling and algorithmic aspects of ocean-atmosphere (OA) coupling. We have been actively working on the analysis of such coupling in terms of both continuous and numerical formulations. Particular attention is paid to the inclusion of physical parameterizations in our theoretical framework. Our activities have recently led to practical implementations in state-of-the-art oceanic and Earth system models. Our focus during the last few years has been on the following topics:
 Continuous and discrete analysis of Schwarz algorithms for OA coupling. Members of the Airsea team have been developing coupling approaches for several years, based on so-called Schwarz algorithms. Schwarz-like domain decomposition methods are very popular in mathematics, computational sciences and engineering, notably for the implementation of coupling strategies. However, for complex applications (as in OA coupling) it is challenging to have an a priori knowledge of the convergence properties of such methods. Indeed, coupled problems arising in Earth system modeling often exhibit sharp turbulent boundary layers whose parameterizations lead to peculiar transmission conditions and diffusion coefficients 78. In the framework of S. Clément's PhD 43, the well-posedness of the nonlinear coupling problem including parameterizations has been addressed, and a detailed continuous and discrete analysis of the convergence properties of Schwarz methods has been pursued to disentangle the impact of the different parameters at play in such a coupling problem 42, 11. A general framework has been proposed to study the convergence properties at a (semi-)discrete level, allowing a systematic comparison with the results obtained from the continuous problem. Such a framework makes it possible to study more complex coupling problems whose formulation is representative of the discretization used in realistic coupled models.
 A simplified atmospheric boundary layer model for oceanic purposes. Part of our activities within the ongoing SHOM 19CP07 project is dedicated to the development of a simplified model of the marine atmospheric boundary layer (called ABL1d), of intermediate complexity between a bulk parameterization and a full three-dimensional atmospheric model, and to its integration into the NEMO general circulation model 64. A constraint in the conception of such a simplified model is to allow an apt representation of the downward momentum mixing mechanism and of the partial re-energization of the ocean by the atmosphere, while keeping the computational efficiency and flexibility inherent to ocean-only modeling. Realistic applications of the coupled NEMO-ABL1d modeling system have been carried out, and the methodology is being integrated into the operational forecasting system operated by Mercator Ocean. Over the last year the approach has also been implemented in the CROCO ocean model. A focus has been to find adequate ways to fill some gaps in the 1D approach, using multiple-scale asymptotic techniques to cast the equations in terms of perturbations around an ambient state given by large-scale datasets. This simplified model, called ABL3d, leads to clear improvements over ABL1d for academic semi-idealized cases. The objective is now to extend the analysis to realistic cases. In parallel, in the framework of the AIRSEA/ATOS collaboration, an objective is to design, via learning strategies, a surrogate of the response of the atmospheric boundary layer to anomalies in ocean surface temperatures and currents (work of H. Lascombes De Laroussilhe).
 Impact of the coupling formulation in a realistic context. A Schwarz-like iterative method has been applied in a state-of-the-art Earth system model (IPSL-CM6) to evaluate the consequences of inaccuracies in the usual ad hoc ocean-atmosphere coupling algorithms used in realistic models 67, 68. Numerical results obtained with an iterative process show large differences at sunrise and sunset compared to usual ad hoc algorithms, thus showing that the synchrony errors inherent to ad hoc coupling methods can be large. However, such an iterative coupling method is too costly to be implemented operationally in climate models. In order to keep the computational cost almost constant w.r.t. the usual non-iterative approach, the iterative algorithm would need to be provided with a first guess close to the optimum, so as to achieve quasi-convergence in a single iteration. We aim to obtain such an approximation of the optimal state by learning techniques (work of A. Monsimer as part of the Programme National de Recherche en Intelligence Artificielle, PNRIA). A learning network based on a test dataset restricted to a few fixed regions has been set up, and is indeed able to provide a first iteration of good quality. The remaining step is to scale up to the global scale, with a view to making the method operationally applicable to a climate model. Another point is that, for the simulations with IPSL-CM6, the iterative process was only applied over the ocean, as it converges very slowly over sea ice. We started to study the specific convergence problem occurring over sea ice during the internship of P. Lozano, which allowed us to explain the slow convergence and to propose ways around it. The next step is to use the single-column version of the EC-Earth climate model to make the link between our theoretical work and practical applications. On this last point, a collaboration has been set up with the University of Lund (Sweden) as part of V. Schüller's thesis.
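The Schwarz-type iterations discussed above can be illustrated on a deliberately simple stationary problem (a toy setting far simpler than the OA coupling with turbulent boundary layers: an overlapping alternating Schwarz method for a 1D Poisson problem with Dirichlet transmission conditions; all names and values are illustrative). Each iteration solves one subdomain using the latest trace of the other subdomain's solution, and the interface values converge geometrically at a rate set by the overlap.

```python
import numpy as np

def solve_poisson(xl, xr, ul, ur, n, f=2.0):
    """Direct finite-difference solve of -u'' = f on [xl, xr], Dirichlet data."""
    x = np.linspace(xl, xr, n)
    h = x[1] - x[0]
    A = (np.diag(2.0 * np.ones(n - 2))
         - np.diag(np.ones(n - 3), 1) - np.diag(np.ones(n - 3), -1)) / h**2
    rhs = f * np.ones(n - 2)
    rhs[0] += ul / h**2
    rhs[-1] += ur / h**2
    u = np.empty(n)
    u[0], u[-1] = ul, ur
    u[1:-1] = np.linalg.solve(A, rhs)
    return x, u

# Global problem: -u'' = 2 on [0,1], u(0)=u(1)=0; exact solution u = x(1-x).
a, b = 0.4, 0.6          # the two subdomains overlap on [0.4, 0.6]
g1, g2 = 0.0, 0.0        # initial interface guesses at x=b and x=a
for it in range(30):
    x1, u1 = solve_poisson(0.0, b, 0.0, g1, 61)   # left solve, trace g1 at x=b
    g2 = np.interp(a, x1, u1)                     # pass trace at x=a to the right
    x2, u2 = solve_poisson(a, 1.0, g2, 0.0, 61)   # right solve
    g1 = np.interp(b, x2, u2)                     # pass trace at x=b back

exact = lambda x: x * (1.0 - x)
err = max(np.abs(u1 - exact(x1)).max(), np.abs(u2 - exact(x2)).max())
print(err)  # interface error contracts geometrically with the overlap width
```

Shrinking the overlap (a closer to b) slows the contraction, which is one elementary reason transmission conditions, rather than plain Dirichlet exchanges, matter so much in the non-overlapping OA setting.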
These topics are addressed through strong collaborations between the applied mathematicians and the climate and operational community (Météo-France, Ifremer, SHOM, Mercator Ocean, LMD, and LOCEAN). Airsea team members play a major role in the structuring of a multidisciplinary scientific community working on ocean-atmosphere coupling, spanning a broad range from mathematical theory to practical implementations in climate and operational models.
8.1.3 Physics-Dynamics coupling: Consistent subgrid-scale modeling
Participants: Eric Blayo, Simon Clément, Florian Lemarié, Manolis Perrot.
The AIRSEA team works on topics around physics-dynamics coupling 52. Schematically, numerical models consist of two blocks, generally identified as “physics” and “dynamics”, which are often developed separately. The “physics” represents unresolved or under-resolved processes with typical scales below the model resolution, while the “dynamics” corresponds to a discrete representation in space and time of resolved processes. Unresolved processes cannot be ignored because they directly influence the resolved part of the flow, since energy is continuously transferred between scales. The interplay between resolved and unresolved scales is a vast, largely open and complex topic for which there is still much to do within the Earth system modeling community 63. During the last year we worked on the following topics:
 Representation of the air-sea interface in coupled models. In the PhD work of S. Clément 43, a finite-volume discretization consistent with our knowledge of the underlying physical principles (e.g. the Monin-Obukhov theory in the surface layer) has been derived 12. This work will be taken up again as part of a Master's internship, to solidify the theoretical basis and move on to practical assessments.
 Representation of penetrative convection in oceanic models. Accounting for the mean effect of subgrid-scale intermittent coherent structures such as convective plumes is very challenging. Currently this is done very crudely in ocean models (vertical diffusion is locally increased to “mix” unstable density profiles). A difficulty is that in convective conditions, turbulent fluxes are dominated by processes unrelated to local gradients, thus invalidating the usual downgradient (a.k.a. eddy-diffusion) approach. In the framework of the PhD of M. Perrot, a first step is to study the derivation of mass-flux convection schemes arising from a multi-fluid decomposition, in order to extend them specifically to the oceanic context. This extension is done under certain “consistency” constraints: energetic considerations and scale-awareness of the resulting model. Reference LES simulations have been developed to guide the formulation of unknown or uncertain free parameters (coefficients or functions) in the proposed extended mass-flux scheme. The Bayesian calibration of these free parameters will be undertaken.
 Partially Lagrangian implementation of location uncertainty. Recent oceanic parameterizations “under location uncertainty” are based on the hypothesis that the small-scale processes are uncorrelated in time. The implementation of such parameterizations can be done in a Lagrangian manner, with rapidly moving grid points. The possibility of keeping the grid close to its original disposition is studied by S. Clément, to understand how the time correlation induced by this constraint can be compensated by an Eulerian term.
Those topics are addressed through collaborations with the climate and operational community (Météo-France, SHOM, Mercator Ocean, and IGE). Two projects are currently funded: one on the energetically consistent discretization aspect (SHOM 19CP07, 2020-2024, PI: F. Lemarié) and one on the convection parameterization (Institut des Mathématiques pour la Planète Terre, 2021-2024, PIs: F. Lemarié and G. Madec). Furthermore, an ANR project entitled PLUME, in which the Airsea team is involved, was selected for funding in the latest ANR call. One of the objectives of this project is to use LES numerical simulations and laboratory experiments of deep convection to calibrate and evaluate physical parameterizations.
8.1.4 Machine learning for reconstruction of model parameters.
Participants: Laurent Debreu, Eugene Kazantsev, Arthur Vidard, Olivier Zahm.
Artificial intelligence and machine learning may be considered as a potential way to address unresolved model scales and to approximate poorly known processes, such as dissipation, that occur essentially at small scales. In order to understand how to combine a numerical model with a neural network trained on external data, we developed a network generation and learning algorithm and used it to approximate nonlinear model operators.
A potential way to reconstruct subgrid scales consists in applying image super-resolution methods, which refer to the process of recovering high-resolution images from low-resolution ones in computer vision and image processing. Recent years have shown remarkable progress in image super-resolution using machine learning techniques 79. We use this methodology to identify the fine structure of the chaotic turbulent solution of a simple barotropic ocean model. After learning the flow patterns produced by the high-resolution model, the neural network can identify fine structure in the low-resolution model solution with better precision than bicubic interpolation.
Different techniques of neural network construction have been analyzed. Fully connected networks, basic convolutional and encoder-decoder architectures 61, as well as mixed architectures, were compared with each other and with the classical interpolation of the model solution on a low-resolution grid.
8.2 Model reduction / multiscale algorithms
The high computational cost of complex numerical simulations is a common and major concern when deriving new methodological approaches. This cost increases dramatically with the use of sensitivity analysis or parameter estimation methods, and more generally with any method requiring numerous model integrations. Model reduction, using either stochastic or deterministic methods, is a way to significantly reduce the computing time of a numerical model. Over the past year, our team focused on the different reduction aspects described below.
8.2.1 Model order reduction
Participants: Clémentine Prieur, Olivier Zahm, Romain Verdière.
In 32 we propose a gradient-enhanced algorithm for high-dimensional function approximation. The algorithm proceeds in two steps: first, we reduce the input dimension by learning the relevant input features from gradient evaluations, and second, we regress the function output against the pre-learned features. To ensure theoretical guarantees, we construct the feature map as the first components of a diffeomorphism, which we learn by minimizing an error bound obtained using a Poincaré inequality applied either in the input space or in the feature space. This leads to two different strategies, which we compare both theoretically and numerically and relate to existing methods in the literature. In addition, we propose a dimension-augmentation trick to increase the approximation power of feature detection. A generalization to vector-valued functions demonstrates that our methodology directly applies to learning autoencoders: here, we approximate the identity function over a given dataset by a composition of the feature map (encoder) with the regression function (decoder). In practice, we construct the diffeomorphism using coupling flows, a particular class of invertible neural networks. Numerical experiments on various high-dimensional functions show that the proposed algorithm outperforms state-of-the-art competitors, especially with small datasets.
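The linear special case of this idea can be sketched in a few lines (an illustrative toy, not the coupling-flow algorithm of 32): for a function of the form f(x) = g(w·x), the matrix of averaged gradient outer products is rank one, and its leading eigenvector recovers the relevant feature direction. All names below are illustrative.

```python
import numpy as np

# Gradient-based detection of a linear feature: H = E[grad f grad f^T] is
# estimated by Monte Carlo and its leading eigenvector compared to the true
# direction w used to build the ridge function f(x) = sin(w @ x).

rng = np.random.default_rng(0)
d, n = 10, 2000
w = np.zeros(d)
w[0], w[3] = 3.0, 4.0
w /= np.linalg.norm(w)                    # true (unit) feature direction

X = rng.standard_normal((n, d))           # input samples
# f(x) = sin(w @ x)  =>  grad f(x) = cos(w @ x) * w  (exact gradients)
G = np.cos(X @ w)[:, None] * w[None, :]

H = G.T @ G / n                           # Monte Carlo estimate of E[g g^T]
eigval, eigvec = np.linalg.eigh(H)
w_hat = eigvec[:, -1]                     # leading eigenvector

alignment = abs(w_hat @ w)                # ~1 when the direction is recovered
print(alignment)
```

The nonlinear feature maps of the paper generalize this construction: the diffeomorphism plays the role that the orthogonal change of basis plays here.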
In a joint work 69 with Didier Georges (GIPSA-lab, Grenoble) and Mathieu Oliver (internship student), we proposed a spatialized extension of a SIR model that accounts for undetected infections and recoveries as well as the load on hospital services. The spatialized compartmental model we introduced is governed by a set of partial differential equations (PDEs) defined on a spatial domain with complex boundary. We proposed to solve this set of PDEs with a meshless numerical method based on a finite-difference scheme in which the spatial operators are approximated using radial basis functions. We then calibrated our model on the French department of Isère during the first lockdown period, using daily reports of hospital occupancy in France. Our methodology allowed us to simulate the spread of the Covid-19 pandemic at the departmental level, for each compartment. However, the simulation cost prevented online short-term forecasting. We therefore proposed to rely on reduced-order modeling tools to compute short-term forecasts of the number of infections. The strategy consisted in learning a time-dependent reduced-order model with few compartments from a collection of evaluations of our spatialized detailed model, varying initial conditions and parameter values. A set of reduced bases was learnt in an offline phase, while the projection on each reduced basis and the selection of the best projection were performed online, allowing short-term forecasts of the global number of infected individuals in the department. This work is continuing in the framework of Robin Vaudry's PhD (co-supervised with Didier Georges). We are investigating the more complex setting of spatialized models taking into account vaccination and the loss of immunity.
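The offline/online split underlying this strategy can be sketched with a generic snapshot-based reduced basis (a toy analogue, not the spatialized SIR model itself; the snapshot family exp(-μx) is a stand-in for expensive model runs):

```python
import numpy as np

# Offline: collect snapshots of a parametrized field and compress them by SVD.
# Online: approximate an unseen parameter by projection onto the leading modes,
# which is far cheaper than re-solving the full model.

x = np.linspace(0.0, 1.0, 200)
mus_train = np.linspace(0.5, 3.0, 40)

S = np.array([np.exp(-mu * x) for mu in mus_train]).T   # 200 x 40 snapshot matrix
U, s, _ = np.linalg.svd(S, full_matrices=False)
r = 5
V = U[:, :r]                    # reduced basis: r leading left singular vectors

u_new = np.exp(-1.7 * x)        # "truth" for a parameter outside the training set
u_rb = V @ (V.T @ u_new)        # online reduced-order reconstruction

rel_err = np.linalg.norm(u_new - u_rb) / np.linalg.norm(u_new)
print(rel_err)                  # a handful of modes capture the whole family
```

In the actual application the online step also involves selecting the best of several reduced bases; the projection itself remains an inexpensive inner product as above.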
In the framework of Arthur Macherey's PhD (defended in June 2021), in collaboration with Anthony Nouy and Marie Billaud-Friess (Ecole Centrale Nantes), we have proposed algorithms for solving high-dimensional partial differential equations (PDEs) that combine a probabilistic interpretation of PDEs, through the Feynman-Kac representation, with sparse interpolation 40. Monte Carlo methods and time-integration schemes are used to estimate pointwise evaluations of the solution of a PDE. We use a sequential control variates algorithm, where control variates are constructed from successive approximations of the solution of the PDE. We then turned to parametrized PDEs and proposed stochastic algorithms for potentially high-dimensional parameter spaces 23. A preliminary step was the development of a PAC algorithm in relative precision for bandit problems with costly sampling 41.
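The pointwise character of the Feynman-Kac representation is the key ingredient, and it can be shown on the simplest possible case (a toy sketch, not the sequential control variates algorithm of the papers): for the heat equation u_t = (σ²/2) u_xx with u(0, x) = f(x), the solution at a single point is u(t, x) = E[f(x + σW_t)], estimated from simulated Brownian increments with no spatial grid at all.

```python
import numpy as np

# Monte Carlo Feynman-Kac evaluation of the heat equation at one point.
# For f(x) = x^2 the expectation has the closed form x0^2 + sigma^2 * t,
# which lets us check the estimate.

rng = np.random.default_rng(1)
sigma, t, x0 = 0.8, 2.0, 1.5
f = lambda x: x**2                          # initial condition

n = 200_000
W_t = np.sqrt(t) * rng.standard_normal(n)   # Brownian motion at time t
u_mc = f(x0 + sigma * W_t).mean()           # Monte Carlo estimate of u(t, x0)

u_exact = x0**2 + sigma**2 * t              # E[(x0 + sigma*W_t)^2]
print(u_mc, u_exact)
```

The control variates of 40 reduce the variance of exactly this kind of estimator by subtracting successively better approximations of the solution.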
External collaborators: Arthur Macherey, Didier Georges and Matthieu Oliver
8.2.2 Design of experiments, climate scenarios and ocean simulation
Participants: Elise Arnaud, Eric Blayo, Angélique Saillet.
In the current context of rapid climate change, numerical models are important tools for predicting climate change and assisting decision making by policy makers (e.g. in terms of protection of marine areas, land use, or definition of fishing quotas). The huge complexity of the models and the generally very high cost of numerical simulations make an exhaustive exploration of the parameter space, corresponding to all possible scenarios and all internal model options, completely illusory. The idea is therefore to exploit statistical tools for the design of experiments. These tools make it possible to identify specific combinations that provide maximum information on a given quantity of interest (for example an indicator of ecosystem health) computed from the simulation performed. The design of experiments also has the advantage that it can be built adaptively, so as to take into account the results of pre-existing simulations performed with various models under various scenarios. In Angélique Saillet's PhD, we aim at developing methodologies to address this open research domain, exploiting theoretical tools such as sequential design of experiments, enrichment strategies, and multi-fidelity Gaussian process regression.
8.2.3 Multi-fidelity Variational Data Assimilation
Participants: Arthur Vidard, Hélène Hénon.
Incremental variational data assimilation addresses the nonlinear least-squares optimization challenge inherent in variational data assimilation by iteratively minimizing a sequence of linear least-squares cost functions. In the context of Hélène Hénon's PhD research, we explore the potential of a multi-fidelity approach for tackling these linear least-squares problems. This involves considering varying levels of fidelity or accuracy in the computational models, offering a nuanced strategy to enhance the efficiency and effectiveness of the iterative optimization process.
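The outer/inner structure of incremental variational assimilation can be sketched in a small Gauss-Newton example (an illustrative toy, not an operational system; the observation operator and error statistics below are invented for the sketch). The nonlinear cost J(x) = ½(x-xb)ᵀB⁻¹(x-xb) + ½(y-h(x))ᵀR⁻¹(y-h(x)) is minimized by outer loops that relinearize h, while each inner loop solves the resulting linear least-squares problem for an increment.

```python
import numpy as np

# Toy incremental (Gauss-Newton) variational analysis with a nonlinear
# observation operator h and its tangent-linear (Jacobian) jac.

rng = np.random.default_rng(2)
d, m = 3, 5
x_true = np.array([1.0, -0.5, 2.0])

h = lambda x: np.array([x[0]**2, x[1]*x[2], np.sin(x[0]), x[2], x[0] + x[1]])
def jac(x):
    return np.array([[2*x[0],       0.0,  0.0],
                     [0.0,          x[2], x[1]],
                     [np.cos(x[0]), 0.0,  0.0],
                     [0.0,          0.0,  1.0],
                     [1.0,          1.0,  0.0]])

Binv = np.eye(d)                   # inverse background error covariance
Rinv = np.eye(m) / 0.01**2         # inverse observation error covariance
xb = x_true + 0.3 * rng.standard_normal(d)     # background state
y = h(x_true) + 0.01 * rng.standard_normal(m)  # observations

x = xb.copy()
for _ in range(10):                # outer loop: relinearize h around x
    H = jac(x)
    A = Binv + H.T @ Rinv @ H      # Hessian of the quadratic inner cost
    rhs = Binv @ (xb - x) + H.T @ Rinv @ (y - h(x))
    dx = np.linalg.solve(A, rhs)   # inner loop: linear least-squares increment
    x += dx

print(np.linalg.norm(x - x_true))  # analysis close to the truth
```

In the multi-fidelity setting explored in the PhD, it is this inner linear solve that is replaced by cheaper, lower-fidelity approximations.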
8.3 Parameter estimation and robust inversion
8.3.1 Transportbased density estimation for inverse problems
Participants: Olivier Zahm.
Transport map methods offer a powerful statistical learning tool that can couple a target high-dimensional random variable with some reference random variable using invertible transformations. The paper 24 presents new computational techniques for building the Knothe–Rosenblatt (KR) rearrangement based on general separable functions. We first introduce a new construction of the KR rearrangement—with guaranteed invertibility in its numerical implementation—based on approximating the density of the target random variable using tensor-product spectral polynomials and downward closed sparse index sets. Compared to other constructions of KR rearrangements based on either multilinear approximations or nonlinear optimizations, our new construction only relies on a weighted least squares approximation procedure. Then, inspired by the recently developed deep tensor trains (Cui and Dolgov, Found. Comput. Math. 22:1863–1922, 2022), we enhance the approximation power of sparse polynomials by preconditioning the density approximation problem using compositions of maps. This is particularly suitable for the high-dimensional and concentrated probability densities commonly seen in many applications. We approximate the complicated target density by a composition of self-reinforced KR rearrangements, in which previously constructed KR rearrangements—based on the same approximation ansatz—are used to precondition the density approximation problem for building each new KR rearrangement. We demonstrate the efficiency of our proposed methods and the importance of using the composite map on several inverse problems governed by ordinary differential equations (ODEs) and partial differential equations (PDEs).
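The KR rearrangement itself is easy to demonstrate in two dimensions with a crude grid discretization (a toy, not the spectral-polynomial construction of 24): the triangular map T(u1, u2) = (F1⁻¹(u1), F_{2|1}⁻¹(u2 | x1)) pushes the uniform reference forward to the target by inverting the marginal CDF of x1 and the conditional CDF of x2 given x1.

```python
import numpy as np

# Grid-based Knothe-Rosenblatt sampler for a correlated Gaussian target
# (rho = 0.8), built from tabulated marginal and conditional CDFs.

rng = np.random.default_rng(3)
n_grid = 400
x = np.linspace(-5.0, 5.0, n_grid)

rho = 0.8
X1, X2 = np.meshgrid(x, x, indexing="ij")
pdf = np.exp(-(X1**2 - 2*rho*X1*X2 + X2**2) / (2*(1 - rho**2)))  # unnormalized

marg1 = pdf.sum(axis=1)                      # unnormalized marginal of x1
F1 = np.cumsum(marg1)
F1 /= F1[-1]                                 # marginal CDF of x1

def kr_sample(u1, u2):
    x1 = np.interp(u1, F1, x)                # invert the marginal CDF
    i = min(np.searchsorted(x, x1), n_grid - 1)
    F2 = np.cumsum(pdf[i, :])
    F2 /= F2[-1]                             # conditional CDF of x2 given x1
    x2 = np.interp(u2, F2, x)                # invert the conditional CDF
    return x1, x2

U = rng.uniform(size=(20000, 2))
samples = np.array([kr_sample(u1, u2) for u1, u2 in U])
print(np.corrcoef(samples.T)[0, 1])          # close to rho = 0.8
```

The triangular (lower-diagonal) structure is what makes both the map and its inverse cheap to evaluate; the paper replaces these tabulated CDFs by separable polynomial approximations that scale to high dimension.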
In the paper 3 we present a novel offline-online method to mitigate the computational burden of the characterization of posterior random variables in statistical learning. In the offline phase, the proposed method learns the joint law of the parameter random variables and the observable random variables in the tensor-train (TT) format. In the online phase, the resulting order-preserving conditional transport can characterize the posterior random variables given newly observed data in real time. Compared with the state-of-the-art normalizing flow techniques, the proposed method relies on function approximation and is equipped with a thorough performance analysis. The function approximation perspective also allows us to further extend the capability of transport maps in challenging problems with high-dimensional observations and high-dimensional parameters. On the one hand, we present novel heuristics to reorder and/or reparametrize the variables to enhance the approximation power of TT. On the other hand, we integrate the TT-based transport maps and the parameter reordering/reparametrization into layered compositions to further improve the performance of the resulting transport maps. We demonstrate the efficiency of the proposed method on various statistical learning tasks in ordinary differential equations (ODEs) and partial differential equations (PDEs).
In the paper 1, we address the problem of the parametrization and the learning of monotone triangular transport maps. Transportation of measure provides a versatile approach for modeling complex probability distributions, with applications in density estimation, Bayesian inference, generative modeling, and beyond. Monotone triangular transport maps, approximations of the Knothe–Rosenblatt (KR) rearrangement, are a canonical choice for these tasks. Yet the representation and parameterization of such maps have a significant impact on their generality and expressiveness, and on the properties of the optimization problem that arises in learning a map from data (e.g., via maximum likelihood estimation). We present a general framework for representing monotone triangular maps via invertible transformations of smooth functions. We establish conditions on the transformation such that the associated infinite-dimensional minimization problem has no spurious local minima, i.e., all local minima are global minima; and we show, for target distributions satisfying certain tail conditions, that the unique global minimizer corresponds to the KR map. Given a sample from the target, we then propose an adaptive algorithm that estimates a sparse semi-parametric approximation of the underlying KR map. We demonstrate how this framework can be applied to joint and conditional density estimation, likelihood-free inference, and structure learning of directed graphical models, with stable generalization performance across a range of sample sizes.
External collaborators: Sergey Dolgov, Tiangang Cui, Youssef Marzouk, Ricardo Baptista
8.3.2 Dimension reduction of high-dimensional inference problems
Participants: Olivier Zahm, Rafael Flock, Clément Duhamel, Clémentine Prieur, Qiao Chen, Elise Arnaud.
In the paper 28 we investigate the approximation of high-dimensional target measures as low-dimensional updates of a dominating reference measure. This approximation class replaces the associated density with the composition of: (i) a feature map that identifies the leading principal components or features of the target measure, relative to the reference, and (ii) a low-dimensional profile function. When the reference measure satisfies a subspace $\phi $-Sobolev inequality, we construct a computationally tractable approximation that yields certifiable error guarantees with respect to the Amari $\alpha $-divergences. Our construction proceeds in two stages. First, for any feature map and any $\alpha $-divergence, we obtain an analytical expression for the optimal profile function. Second, for linear feature maps, the principal features are obtained from eigenvectors of a matrix involving gradients of the log-density. Neither step requires explicit access to normalizing constants. Notably, by leveraging the $\phi $-Sobolev inequalities, we demonstrate that these features universally certify approximation errors across the range of $\alpha $-divergences $\alpha \in (0,1]$. We then propose an application to Bayesian inverse problems and provide an analogous construction with approximation guarantees that hold in expectation over the data. We conclude with an extension of the proposed dimension reduction strategy to nonlinear feature maps.
In the paper 26 we consider high-dimensional Bayesian inverse problems with arbitrary likelihood and a product-form Laplace prior, for which we provide a certified approximation of the posterior density in the Hellinger distance. The approximate posterior density differs from the prior density only in a small number of relevant coordinates, those that contribute most to the update from the prior to the posterior. We propose and analyze a gradient-based diagnostic to identify these relevant coordinates. Although this diagnostic requires computing an expectation with respect to the posterior, we propose tractable methods for the classical case of a linear forward model with Gaussian likelihood. Our methods can be employed to estimate the diagnostic before solving the Bayesian inverse problem via, e.g., Markov chain Monte Carlo (MCMC) methods. After selecting the coordinates, the approximate posterior density can be efficiently inferred, since most of its coordinates are only informed by the prior. Moreover, specialized MCMC methods, such as the pseudo-marginal MCMC algorithm, can be used to obtain less correlated samples when sampling the exact posterior density. We show the applicability of our method on a 1D signal deblurring problem and a high-dimensional 2D super-resolution problem. In Qiao Chen's PhD project, we propose a novel approach to jointly reducing parameter and data dimensions in Bayesian inverse problems, addressing a gap in existing research, which typically treats these reductions separately. The proposed method leverages gradient information to ensure controlled posterior approximation errors and enables goal-oriented dimension reduction by specifying a parameter subspace of interest. An efficient "Alternating Eigendecomposition" algorithm is presented for solving both the coupled and the goal-oriented problems.
Demonstrating applicability in Bayesian optimal experimental design, the method connects goal-oriented data dimension reduction to maximizing the expected information gain. This overcomes the limitations of existing approaches by accommodating nonlinear models and non-Gaussian posteriors. Additionally, the method offers a relaxed perspective on the NP-hard combinatorial optimization problem in optimal experimental design, paving the way for the identification of optimal subspaces and the use of advanced sampling techniques for sensor placement.
Reduced models are also developed in the framework of robust inversion. In 46, we combined a new greedy algorithm for functional quantization with a Stepwise Uncertainty Reduction (SUR) strategy to solve a robust inversion problem under functional uncertainties. In a more recent work 47, we further reduced the number of simulations required to solve the same robust inversion problem, based on Gaussian process metamodeling on the joint space of deterministic control parameters and functional uncertain variables. These results were applied to automotive depollution. This research axis was conducted in the framework of the Chair OQUAIDO. It is still active in the team through Clément Duhamel's PhD, in collaboration with Céline Helbert (Ecole Centrale Lyon) and Miguel Munoz Zuniga and Delphine Sinoquet (IFPEN, Rueil-Malmaison). In particular, in 4 we proposed a SUR version of the Bichon criterion for excursion set estimation.
External collaborators: Youssef Marzouk, Matthew Li, Yiqiu Dong, Felipe Uribe
8.3.3 Machine Learning-based preconditioners for variational data assimilation
Participants: Victor Trappler, Arthur Vidard.
In Variational Data Assimilation, the analysis step boils down to solving a high-dimensional nonlinear least-squares problem. In practical terms, this minimization process involves iterative inversions of large matrices, formed through linearizations of the forward model, which may be ill-conditioned. To enhance the convergence rate of these methods and thereby reduce computational demands, preconditioning techniques are commonly employed to obtain better-conditioned matrices. However, these techniques typically rely on either the sparsity pattern of the matrix to be inverted or some spectral information.
In our approach, we propose using Deep Neural Networks to construct a preconditioner. This preconditioner is trained using properties derived from the singular value decomposition, and the training dataset can be dynamically constructed online as needed. This approach aims to improve the efficiency of the inversion process and to address the challenges posed by the high dimensionality and potential ill-conditioning of the matrices involved. This work was presented in two conferences 35, 34.
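To see why conditioning matters for the inner loops, consider a toy experiment (illustrative only; the actual preconditioner in this work is a neural network, not the spectral construction below) in which deflating the dominant eigenpairs of an ill-conditioned SPD system drastically reduces conjugate-gradient iteration counts:

```python
import numpy as np

# Toy illustration: a limited-memory spectral preconditioner built from
# the leading eigenpairs speeds up CG on an ill-conditioned SPD system.
# All names and sizes here are arbitrary.
rng = np.random.default_rng(1)
n = 200
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = np.ones(n)
lam[-10:] = np.logspace(2, 4, 10)     # 10 large outlier eigenvalues
Amat = (Q * lam) @ Q.T                # SPD with known spectrum
b = rng.standard_normal(n)

def cg(A, b, M=None, tol=1e-8, maxiter=2000):
    """Plain preconditioned CG; returns (solution, iteration count)."""
    x = np.zeros_like(b)
    r = b.copy()
    z = M(r) if M else r
    p = z.copy()
    rz = r @ z
    for k in range(1, maxiter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, k
        z = M(r) if M else r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxiter

# Spectral preconditioner deflating the 10 dominant eigenpairs:
# M = I + U (1/lam_i - 1) U^T, so M A has a flat spectrum.
U = Q[:, -10:]
dvals = lam[-10:]
M = lambda r: r + U @ ((1.0 / dvals - 1.0) * (U.T @ r))

_, it_plain = cg(Amat, b)
_, it_prec = cg(Amat, b, M=M)
print(it_plain, it_prec)              # preconditioned CG converges faster
```

In this deliberately favorable setup the preconditioner captures all dominant modes, so preconditioned CG converges almost immediately; the learned preconditioner in the work above pursues the same goal without explicit spectral decompositions.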
8.4 Sensitivity analysis
Participants: Clémentine Prieur.
Scientific context
Forecasting geophysical systems requires complex models, which sometimes need to be coupled, and which make use of data assimilation. The objective of this project is, for a given output of such a system, to identify the most influential parameters and to evaluate the effect of uncertainty in input parameters on model output. Existing stochastic tools are not well suited to high-dimensional problems (in particular time-dependent problems), while deterministic tools are fully applicable but only provide limited information. The challenge is thus to gather expertise in numerical approximation and control of Partial Differential Equations on the one hand, and in stochastic methods for sensitivity analysis on the other hand, in order to design innovative stochastic solutions for studying high-dimensional models and to propose new hybrid approaches combining stochastic and deterministic methods. We took part in the writing of a position paper on the future of sensitivity analysis 75.
8.4.1 Global sensitivity analysis with dependent inputs
An important challenge for stochastic sensitivity analysis is to develop methodologies which work for dependent inputs. Recently, the Shapley value, from econometrics, was proposed as an alternative to quantify the importance of random input variables to a function. Owen 71 derived Shapley value importance for independent inputs and showed that it is bracketed between two different Sobol' indices. Song et al. 77 recently advocated the use of the Shapley value for the case of dependent inputs. In a recent work 70, in collaboration with Art Owen (Stanford University), we showed that the Shapley value removes the conceptual problems of functional ANOVA for dependent inputs. We also investigated further the properties of Shapley effects in 60. By the end of 2021, Clémentine Prieur started a collaboration with Elmar Plischke (TU Clausthal, Germany) and Emanuele Borgonovo (Bocconi University, Milan, Italy) to estimate total Sobol' indices as a measure for variable selection, even in the framework of dependent inputs. In particular, this allows estimating total Sobol' indices for inputs defined on a non-rectangular domain 2. This setting is of particular interest for applications where the input space is reduced due to physical constraints on the quantity of interest. It was encountered, e.g., in María Belén Heredia's PhD thesis (defended in December 2020), and analyzed by estimating Shapley effects with a nonparametric procedure based on nearest neighbors 38 (see Section 8.4.3 for more details). In October 2021, Ri Wang started a PhD, co-supervised by Clémentine Prieur and Véronique Maume-Deschamps (ICJ, Lyon 1), on the estimation of quantile-oriented sensitivity indices in the framework of dependent inputs, by means of random forests or other machine learning tools. Ri Wang has received funding from the China Scholarship Council.
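For intuition, Shapley effects can be computed exactly on a small linear model with correlated Gaussian inputs, where the explained variances v(J) = Var(E[Y | X_J]) are available in closed form; a minimal sketch with hypothetical coefficients and covariance (not taken from the cited works):

```python
import numpy as np
from itertools import combinations
from math import factorial

# Illustrative only: exact Shapley effects for Y = beta^T X with
# X ~ N(0, Sigma). Inputs 1 and 2 are correlated; input 3 is independent.
beta = np.array([1.0, 1.0, 0.5])
Sigma = np.array([[1.0, 0.5, 0.0],
                  [0.5, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
d = len(beta)
var_Y = beta @ Sigma @ beta

def v(J):
    """Variance explained by the subset J: Var(E[Y | X_J])."""
    J = list(J)
    if not J:
        return 0.0
    K = [i for i in range(d) if i not in J]
    S_JJ = Sigma[np.ix_(J, J)]
    # E[Y | X_J] is linear in x_J with coefficient vector a.
    a = beta[J] + np.linalg.solve(S_JJ, Sigma[np.ix_(J, K)]) @ beta[K] if K else beta[J]
    return a @ S_JJ @ a

# Shapley effect of input i: weighted average of marginal contributions.
shapley = np.zeros(d)
for i in range(d):
    others = [j for j in range(d) if j != i]
    for k in range(d):
        for J in combinations(others, k):
            w = factorial(k) * factorial(d - 1 - k) / factorial(d)
            shapley[i] += w * (v(set(J) | {i}) - v(J))

print(shapley, shapley.sum(), var_Y)   # effects sum exactly to Var(Y)
```

The efficiency property (effects summing to the total variance) holds even under dependence, which is precisely what makes the Shapley value attractive in this setting; the independent third input recovers its plain variance contribution.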
External collaborators: Maria Belén Heredia, Adrien Hirvoas, Alexandre Janon
8.4.2 Sensitivity analysis in pesticide transfer models
Participants: Arthur Vidard, Emilie Rouzies, Katarina Radisic.
Pesticide transfer models play a crucial role in predicting and preventing water body pollution. However, deploying these models in operational settings demands a thorough understanding of their structure, particularly of the influential parameters. The objective of E. Rouzies' PhD (defended in March 2023, 76) was to conduct a global sensitivity analysis (GSA) of the PESHMELBA model (pesticide and hydrology: modeling at the catchment scale). The challenge lies in the modular and intricate structure of the model, which couples various physical processes, resulting in a high-dimensional input space and significant computational costs that limit the number of feasible runs. Extensive use of different metamodelling techniques made GSA feasible for such an application 9.
The complexity is further compounded when considering the temporal dimension. We explore this aspect in K. Radisic's PhD, using polynomial chaos expansion to conduct a global sensitivity analysis of the small agricultural catchment of the Morcille during a winter rainstorm, as detailed in 8. This study was expanded to include a comparison of sensitivity analyses under different rainfall forcings and proposed a global index that accounts for rainfall stochasticity 17.
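The way Sobol' indices fall out of a polynomial chaos expansion can be sketched on a toy function (illustrative only, unrelated to PESHMELBA): with an orthonormal basis, each squared coefficient contributes to the variance share of the variables appearing in its multi-index.

```python
import numpy as np

# Toy sketch: first-order Sobol' indices read off a polynomial chaos
# expansion fitted by least squares. Inputs are uniform on [-1, 1];
# the basis is the orthonormal Legendre family up to degree 2.
rng = np.random.default_rng(2)

def legendre(x, k):
    """Orthonormal Legendre polynomials for U(-1, 1), degrees 0..2."""
    return [np.ones_like(x), np.sqrt(3) * x,
            np.sqrt(5) * (3 * x**2 - 1) / 2][k]

f = lambda x1, x2: x1 + x2**2 + 0.5 * x1 * x2   # hypothetical model
n = 4000
X = rng.uniform(-1, 1, size=(n, 2))
y = f(X[:, 0], X[:, 1])

# Tensorized basis (multi-indices up to total degree 2).
multi = [(0, 0), (1, 0), (0, 1), (2, 0), (0, 2), (1, 1)]
Phi = np.column_stack([legendre(X[:, 0], i) * legendre(X[:, 1], j)
                       for i, j in multi])
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Variance decomposition: each non-constant squared coefficient belongs
# to the Sobol' index of the variables active in its multi-index.
var_tot = np.sum(coef[1:] ** 2)
S1 = sum(c**2 for (i, j), c in zip(multi, coef) if i > 0 and j == 0) / var_tot
S2 = sum(c**2 for (i, j), c in zip(multi, coef) if j > 0 and i == 0) / var_tot
print(S1, S2, 1 - S1 - S2)   # the interaction index closes the budget
```

Since the toy model lies exactly in the span of the basis, the fitted coefficients, and hence the indices, are exact up to machine precision; for a real simulator the expansion is truncated and the indices become estimates.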
These initial steps paved the way for developing a method for robust calibration through stochastic metamodelling of PESHMELBA, using stochastic polynomial chaos expansion. Further improvements were obtained by refining the inference of the marginal distributions through a Gaussian mixture.
8.4.3 Green sensitivity for multivariate and functional outputs
Participants: Clémentine Prieur.
Another research direction for global SA algorithms starts from the observation that most algorithms to compute sensitivity measures require special sampling schemes or additional model evaluations, so that available data from previous model runs (e.g., from an uncertainty analysis based on Latin Hypercube Sampling) cannot be reused. One challenging task for estimating global sensitivity measures consists in recycling an available finite set of input/output data. Green sensitivity, by recycling, avoids wasting. Such given-data procedures have been discussed, e.g., in 72, 73. Most given-data procedures depend on parameters (number of bins, truncation argument…) that are not easy to calibrate from a bias-variance compromise perspective. Adaptive selection of these parameters remains a challenging issue for most of these given-data algorithms. In the context of María Belén Heredia's PhD thesis, we have proposed 39 a nonparametric given-data estimator for aggregated Sobol' indices, introduced in 62 and further developed in 50 for multivariate or functional outputs. We also introduced aggregated Shapley effects and extended a nearest-neighbor estimation procedure to estimate these indices 38. We also started a collaboration with Sébastien Da Veiga (Safran Tech) and Agnès Lagnoux, Thierry Klein and Fabrice Gamboa (Institut de Mathématiques de Toulouse) on a new nonparametric estimation procedure for closed Sobol' indices of any order based on U-statistics 25.
8.4.4 Global sensitivity analysis for parametrized stochastic differential equations
Participants: Clémentine Prieur.
Many models are stochastic in nature, and some of them may be driven by parametrized stochastic differential equations (SDE). It is important for applications to propose a strategy to perform global sensitivity analysis (GSA) for such models, in the presence of uncertainties on the parameters. In collaboration with Pierre Etoré (DATA department in Grenoble), Clémentine Prieur proposed an approach based on Feynman-Kac formulas 48. The research on GSA for stochastic simulators is still ongoing, first in the context of the MATH-AmSud project FANTASTIC (Statistical inFerence and sensitivity ANalysis for models described by sTochASTIC differential equations) with Chile and Uruguay, secondly through the PhD thesis of Henri Mermoz Kouye, co-supervised by Clémentine Prieur, in collaboration with Gildas Mazo and Elisabeta Vergu (INRAE, département MIA, Jouy). Note that our recent developments with P. Etoré on GSA for parametrized SDEs are strongly related to reduced order modeling (see Section 8.2), as GSA requires intensive computations of the quantity of interest. In collaboration with Pierre Etoré and Joël Andrepont (master internship started in spring 2021), Clémentine Prieur is working on GSA for parametrized SDEs based on the Fokker-Planck equation and kernel-based sensitivity indices. Note that a joint work between Pierre Etoré, Clémentine Prieur and Jose R. Leon has been published 6, related to exact or approximated computation of Kolmogorov hypoelliptic equations (KHE). Even if not dealing with GSA, it could be a starting point for analyzing sensitivity for models described by a parametrized version of KHE. Concerning Henri Mermoz Kouye's PhD thesis, the approach was different: we are interested in GSA for compartmental stochastic models, and our methodology relies on a deterministic representation of continuous-time Markov chain stochastic compartmental models 27.
8.5 Model calibration and statistical inference
8.5.1 Bayesian calibration
Participants: Adama Barry, Clémentine Prieur.
Physically-based avalanche propagation models must still be locally calibrated to provide robust predictions, e.g. in long-term forecasting and subsequent risk assessment. Friction parameters cannot be measured directly and need to be estimated from observations. Rich and diverse data are now increasingly available from test sites, but for measurements made along flow propagation, potential autocorrelation should be explicitly accounted for. In the context of María Belén Heredia's PhD, in collaboration with IRSTEA Grenoble, we have proposed in 55 a comprehensive Bayesian calibration and statistical model selection framework with application to an avalanche sliding block model with the standard Voellmy friction law and high-rate photogrammetric images. An avalanche released at the Lautaret test site and a synthetic data set based on this avalanche were used to test the approach. Results have demonstrated i) the efficiency of the proposed calibration scheme, and ii) that including autocorrelation in the statistical modelling definitely improves the accuracy of both parameter estimation and velocity predictions. In the context of the energy transition, wind power generation is developing rapidly in France and worldwide. Research and innovation on wind resource characterisation, turbine control, coupled mechanical modelling of wind systems and technological development of floaters for offshore wind turbines are current research topics. In particular, the monitoring and maintenance of wind turbines are becoming major issues. Current solutions do not take full advantage of the large amount of data provided by sensors placed on modern wind turbines in production. These data could be advantageously used to refine predictions of production, structure lifetime, control strategies and maintenance planning.
In this context, it is interesting to optimally combine production data and numerical models in order to obtain highly reliable models of wind turbines. This process is of interest to many industrial and academic groups and is known in many fields of industry, including the wind industry, as building a "digital twin". The objective of Adrien Hirvoas's PhD work was to develop a data assimilation methodology to build the digital twin of an onshore wind turbine. Based on measurements, data assimilation should reduce the uncertainties on the physical parameters of the numerical model developed during the design phase, in order to obtain a highly reliable model. Various ensemble data assimilation approaches are currently under consideration to address the problem. In the context of this work, it is necessary to develop identification algorithms quantifying and ranking all the uncertainty sources. This work was done in collaboration with IFPEN 58, 57. The work on model calibration is still ongoing in our team, through the PhD thesis of Adama Barry. We are finishing a work on designs for physical and numerical experiments in the framework of the calibration of costly-to-evaluate numerical models.
8.5.2 Simulation & Estimation of EPIdemics with Algorithms
Participants: Clémentine Prieur.
Due to the sanitary context, Clémentine Prieur decided to join a working group, SEEPIA (Simulation & Estimation of EPIdemics with Algorithms), animated by Didier Georges (Gipsa-lab). A first work has been published 53. An extension of the classical pandemic SIRD model was considered for the regional spread of COVID-19 in France under lockdown strategies. This compartment model divides the infected and the recovered individuals into undetected and detected compartments, respectively. By fitting the extended model to the real detected data during the lockdown, an optimization algorithm was used to derive the optimal parameters, the initial condition and the epidemic start date for each region in France. Considering all the age classes together, a network model of pandemic transport between French regions was built on the basis of the regional extended model and was simulated to reveal the transport effect of the COVID-19 pandemic after lockdown. Using measured values of the displacement of people between cities, the pandemic network of all cities in France was simulated with the same model and method as the regional pandemic network. Finally, a discussion on an integro-differential equation was given and a new model for the network pandemic model of each age class was provided. As already mentioned in Section 8.2, Clémentine Prieur went on working on the pandemic, in collaboration with Didier Georges (GIPSA-lab, Grenoble). Both of them supervised the internship of Matthieu Oliver, submitting a work proposing time-dependent reduced order modeling for short-term forecasts from a spatialized SIR model 69. Robin Vaudry started in October 2021 a PhD, funded by the CNRS research platform MODCOV19, and co-supervised by Clémentine Prieur and Didier Georges. The objective of this PhD is to solve inverse problems related to spatialized and age-structured compartmental models of the COVID-19 pandemic.
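For reference, the classical SIRD dynamics underlying the extended model read dS/dt = -βSI/N, dI/dt = βSI/N - (γ+μ)I, dR/dt = γI, dD/dt = μI; a minimal forward-Euler sketch with illustrative parameters (not the fitted French regional model):

```python
import numpy as np

# Minimal SIRD integrator: the classical model, not the extended
# detected/undetected variant described above. Parameters are made up.
def sird(beta=0.3, gamma=0.1, mu=0.01, N=1e6, I0=100, days=300, dt=0.1):
    S, I, R, D = N - I0, float(I0), 0.0, 0.0
    traj = []
    for _ in range(int(days / dt)):
        dS = -beta * S * I / N
        dI = beta * S * I / N - (gamma + mu) * I
        dR = gamma * I
        dD = mu * I
        S += dt * dS
        I += dt * dI
        R += dt * dR
        D += dt * dD
        traj.append((S, I, R, D))
    return np.array(traj)

traj = sird()
print(traj[-1])    # final state; S + I + R + D is conserved by construction
```

Fitting such a model, as done in the cited works, amounts to tuning (β, γ, μ) and the initial condition so that the simulated detected compartments match the observed time series.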
8.5.3 Parameter estimation under uncertainties
Participants: Exaucé Luweh Adjim Ngarti, Arthur Vidard, Elise Arnaud, Victor Trappler.
Estimating key parameters in numerical models is a crucial aspect of numerical simulation, particularly when some parameters are not directly observable. Traditional estimation methods infer these parameters indirectly from their effects on observable variables, thus introducing uncertainties. In addition to the parameters to be estimated, there are often uncertain and uncontrollable nuisance parameters in the numerical model. Although variational inference is effective in ideal scenarios, its application in the presence of nuisance parameters is less explored. Not accounting for the stochastic nature of these nuisance parameters leads to suboptimal estimation of the parameters of interest due to error compensation. These nuisance parameters can be modeled as a random variable, and the numerical model is therefore redefined as a random variable as well. We formulate the problem in a Bayesian framework, where we approximate the posterior distribution by minimizing the Kullback-Leibler divergence over a family of parametrized distributions. To do so, we integrate generative neural networks such as normalizing flows to enhance the expressiveness of this family. We apply these methods to the shallow water model, aiming at estimating the friction coefficient, a key parameter in coastal regions that defines the roughness of the seabed. The nuisance parameters represent the boundary forcing due to tidal frequencies and amplitudes.
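Stripped of the normalizing flow and the nuisance parameters, the underlying variational mechanics can be sketched in a few lines; a toy example (illustrative only) fitting a Gaussian family to a known 1D target by stochastic gradient descent on the Kullback-Leibler divergence, using the reparameterization trick:

```python
import numpy as np

# Minimal variational inference sketch: fit q = N(m, s^2) to a
# hypothetical target N(2, 0.5^2) by minimizing KL(q || p) with
# Monte Carlo reparameterization gradients (x = m + s * eps).
rng = np.random.default_rng(3)
mu_p, sig_p = 2.0, 0.5
dlogp = lambda x: -(x - mu_p) / sig_p**2    # score of the target

m, log_s = 0.0, 0.0
lr, n_mc = 0.05, 64
for _ in range(3000):
    s = np.exp(log_s)
    eps = rng.standard_normal(n_mc)
    x = m + s * eps                          # reparameterized samples
    g = dlogp(x)
    # KL(q||p) = E_eps[log q(x) - log p(x)]; E[log q] = -log s + const.
    grad_m = np.mean(-g)                     # d KL / d m
    grad_log_s = -1.0 + np.mean(-eps * g) * s  # chain rule through log s
    m -= lr * grad_m
    log_s -= lr * grad_log_s
print(m, np.exp(log_s))   # approaches the target mean and std
```

Replacing the Gaussian family with a normalizing flow, as in the work above, keeps the same objective but makes the variational family far more expressive.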
8.6 High performance computing
8.6.1 Dynamic computeresource utilization
Participants: Martin Schreiber.
The way applications are executed on supercomputers still follows a traditional static resource allocation pattern: computing resources are allocated at the start of a job, which executes the application, and are only released at the end of the job's runtime. This decades-old execution model persists even though dynamic resource allocation over the application's runtime would lead to several benefits: higher utilization of the computing resources, ad hoc allocation of AI accelerator cards, lower energy consumption, faster response for interactive jobs, improved data locality and I/O over the full runtime, support of urgent computing without necessarily killing running jobs, etc. Various attempts have been conducted under different terminologies, such as "evolving" jobs (application-driven dynamic resource changes) and "malleability" (system-driven dynamic resource changes), and we see a hybridization of both as required for reaching optimal results.
We currently investigate possibilities to extend the MPI parallel programming model (the de facto standard in HPC), and also models beyond MPI, with such interfaces. As part of that, we participate in regular MPI Sessions working group meetings, where significant effort has gone into investigating such interfaces. Based on previous work 49, 59 we worked on "A Case Study on PMIx-Usage for Dynamic Resource Management" and its interface to a resource manager 13.
External collaborators: Daniel Holmes, Dominik Huber, Howard Pritchard, Martin Schulz,
8.6.2 Hardwareaware numerics
Participants: Martin Schreiber.
We made further progress toward a Domain-Specific Language (DSL) for the CROCO ocean simulation model to close the increasing gap between the numerics and HPC through a separation of concerns. Such a DSL would allow applied mathematicians to express their model equations in a high-level language and HPC experts to develop tools to automatically transform this into highly performing code for the target programming model and corresponding architecture. For our DSL, we work with PSyclone, a code generation and transformation system for Fortran-based developments, and we would like to gratefully acknowledge various developments (partly unpublished) by the PSyclone developers that accelerated our progress. Progress has been made on three pillars:
The first pillar is a proof-of-concept on directly using PSyclone to support GPU-based architectures: using the PSyclone NEMO API and our extensions, we can parse, analyze, and automatically insert OpenACC statements in the dynamical core of the CROCO ocean model. With this, we could reproduce a formerly hand-ported and highly optimized OpenACC version with similar performance, showing the potential to develop a DSL around this.
As a second pillar, we also started a proof-of-concept to automatically insert code statements implementing MPI communication. This prototype analyzes deep data dependencies to determine communication dependencies and implements the required halo exchanges using asynchronous MPI operations; in-depth studies can be found in the Master's thesis of Anna Mittermair (2023_MA_anna_mittermair.pdf).
The third pillar concerns data assimilation, which requires the computation of adjoints of time steps. Realizing this by hand is a non-trivial process and requires maintenance by an expert every time the numerics are updated. We successfully investigated using PSyclone to automatically generate adjoint code, as a proof-of-concept on simple ODE examples: psycloneautodiff.readthedocs.io.
These three pillars will form the foundation for the next steps in developing a DSL for the CROCO ocean model, which will be embedded into Fortran.
External collaborators: Rupert Ford, Anna Mittermair, Andrew Porter, Sergi Siso
8.6.3 New timeintegration methods
Participants: Martin Schreiber, Rishabh Bhatt, Laurent Debreu, Arthur Vidard.
We investigate new time-integration methods by considering HPC requirements, targeting a better wall-clock time vs. error ratio. These numerical methods include exponential, semi-Lagrangian, and parallel-in-time integration methods. Our team and collaborators have made significant progress over the last years; however, some publishable results have not yet been generated, while others are currently in preparation.
As part of a collaboration with the University of São Paulo (USP), we investigated exponential integration methods based on Faber polynomials, with a recently accepted publication 74. Although this paper mainly focuses on seismic wave propagation, these results can also be applied to the linearized fast modes in atmospheric models, e.g., as part of time-splitting methods.
In previous work, we showed significant speedups using the Parallel Full Approximation Scheme in Space and Time (PFASST) parallel-in-time method for the shallow-water equations on the rotating sphere with an IMEX-SDC scheme, using benchmarks employed in the development of dynamical cores 54. Motivated by this work, we investigated this further in collaboration with USP using the MGRIT parallel-in-time method. We developed a novel filtering method specific to each level of the multilevel hierarchy; see 31. It allows running the shallow-water equations on the rotating sphere without artificial viscosity on the target level and with more than two levels, which was considered a significant challenge.
On a related topic, Rishabh Bhatt successfully defended his PhD in December 2023 20. His research focused on the application of time parallelization algorithms to enhance variational data assimilation techniques. In this framework, each iteration of the optimization algorithm involves the integration of the direct model through a time-parallel method, specifically the Parareal algorithm 65. The optimization process uses a modified version of the inexact conjugate gradient method 51, where matrix-vector multiplications are carried out by Parareal, introducing a level of inexactness. The convergence conditions of the inexact conjugate gradient method enable an adaptive use of Parareal by monitoring errors in matrix-vector products, maintaining accuracy levels comparable to those achieved with the conventional conjugate gradient method. To ensure practical implementation, computationally expensive norms are replaced with easily computable approximations. The efficacy of this approach was validated on both one- and two-dimensional shallow water models. For the more intricate two-dimensional model, a Krylov-subspace-enhanced version of Parareal was employed, demonstrating accelerated convergence and reduced iteration counts.
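The Parareal iteration at the core of this work is compact enough to sketch; a toy version on a small linear ODE (illustrative only, not the shallow-water or inexact-CG implementation), combining a cheap coarse propagator with accurate fine propagators that could run in parallel across time slices:

```python
import numpy as np

# Plain Parareal on du/dt = A u for a damped oscillator -- a sketch of
# the algorithm only. Coarse solver: one trapezoidal step per slice;
# fine solver: many RK4 substeps.
A = np.array([[0.0, 1.0], [-1.0, -0.1]])
T, N = 10.0, 20                             # horizon, number of slices
dT = T / N
u0 = np.array([1.0, 0.0])
I2 = np.eye(2)

def G(u):                                   # coarse: one trapezoidal step
    return np.linalg.solve(I2 - dT / 2 * A, (I2 + dT / 2 * A) @ u)

def F(u, substeps=50):                      # fine: RK4 substepping
    h = dT / substeps
    for _ in range(substeps):
        k1 = A @ u
        k2 = A @ (u + h / 2 * k1)
        k3 = A @ (u + h / 2 * k2)
        k4 = A @ (u + h * k3)
        u = u + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return u

# Initialization with the coarse solver, then Parareal corrections:
# U_{n+1}^{k+1} = G(U_n^{k+1}) + F(U_n^k) - G(U_n^k).
U = [u0]
for n in range(N):
    U.append(G(U[-1]))
for k in range(5):
    Fk = [F(U[n]) for n in range(N)]        # fine sweeps, parallel in time
    Unew = [u0]
    for n in range(N):
        Unew.append(G(Unew[-1]) + Fk[n] - G(U[n]))
    U = Unew

# Reference: sequential fine solve over the whole horizon.
ref = u0.copy()
for _ in range(N):
    ref = F(ref)
print(np.linalg.norm(U[-1] - ref))          # error shrinks with each sweep
```

The fine evaluations inside each sweep are independent across slices, which is exactly the parallelism exploited when Parareal replaces the matrix-vector products of the assimilation inner loop.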
External collaborators: Pedro S. Peixoto, João C. Steinstraesser
9 Bilateral contracts and grants with industry
9.1 Bilateral contracts with industry
 ATOS. In the context of the French recovery plan, a collaboration with the ATOS group was initiated in late 2021 and concluded in late 2023. This collaboration enabled a total of 5 people to work full-time on topics related to the application of machine learning methods to ocean-atmosphere problems.
 A 5-year contract (started in January 2020) 19CP07 with the Oceanographic and Hydrographic Service of the French Navy (SHOM) on the topic "Numerical analysis for the space-time reconciliation of discretizations of air-sea exchanges and their parameterization. Application to simplified and realistic coupled cases." (PI: F. Lemarié).
 Consortium CIROQUO – Consortium Industrie Recherche pour l'Optimisation et la QUantification d'incertitude pour les données Onéreuses – gathers academic and technological partners to work on problems related to the exploitation of numerical simulators. This Consortium, created in January 2021, is the continuation of the projects DICE, ReDICE and OQUAIDO, which respectively covered the periods 2006-2009, 2011-2015 and 2015-2020. It currently funds the postdoc of Valentin Breaz.
9.2 Bilateral grants with industry
 Funding of Clément Duhamel's PhD by IFP Energies Nouvelles (IFPEN) within the framework of the IFPEN-Inria strategic partnership. PhD subject: robust inversion of a computer code under functional uncertainties, with application to the design of floating wind turbines.
 Funding of Exaucé Luweh Adjim Ngarti's PhD through a CIFRE contract with Eviden. PhD subject: deep learning for inverse problems in geophysics.
10 Partnerships and cooperations
10.1 International initiatives
10.1.1 Visits of international scientists
Other international visits to the team
Dario Martinez Martinez

Status
PhD

Institution of origin:
Universidad de CastillaLa Mancha

Country:
Spain

Dates:
April 15 to July 15

Context of the visit:
collaboration on domain decomposition methods

Mobility program/type of mobility:
research stay
Matthew Li

Status
Postdoc

Institution of origin:
MIT

Country:
USA

Dates:
20 to 26 April 2023

Context of the visit:
Scientific collaboration

Mobility program/type of mobility:
research stay
Ricardo Baptista

Status
Postdoc

Institution of origin:
Caltech

Country:
USA

Dates:
9 to 11 June 2023

Context of the visit:
Scientific collaboration

Mobility program/type of mobility:
research stay
Tiangang Cui

Status
Associate professor

Institution of origin:
Monash

Country:
Australia

Dates:
19 June to 5 August 2023

Context of the visit:
Scientific collaboration

Mobility program/type of mobility:
research stay
Felipe Uribe

Status
Postdoc

Institution of origin:
LUT

Country:
Finland

Dates:
14 October to 2 December 2023

Context of the visit:
Scientific collaboration

Mobility program/type of mobility:
research stay
Felipe Uribe

Status
PhD student

Institution of origin:
DTU

Country:
Denmark

Dates:
6 to 12 November 2023

Context of the visit:
Scientific collaboration

Mobility program/type of mobility:
research stay
Arthur Campos

Status
PhD

Institution of origin:
University of São Paulo

Country:
Brazil

Dates:
2023-09-01 to 2024-07-31

Context of the visit:
Collaboration on “Multigrid algorithms for the solution of Geophysical Inverse Problems”.

Mobility program/type of mobility:
research stay
Sergi Siso

Status
Researcher

Institution of origin:
Science and Technology Facilities Council (STFC)

Country:
United Kingdom

Dates:
2023-01-06 to 2023-01-10

Context of the visit:
Collaboration with PSyclone and CROCO

Mobility program/type of mobility:
Invited research stay
Hanne Christiaensen

Status
Master student

Institution of origin:
KU Leuven

Country:
Belgium

Dates:
2023-01-06 to 2023-01-10

Context of the visit:
Collaboration on “Parareal with a neural network as coarse propagator for hyperbolic systems”

Mobility program/type of mobility:
Invited research stay
10.2 European initiatives
10.2.1 Other European programs/initiatives
EuroHPC
 Martin Schreiber is a member of the TIME-X (EuroHPC-01-2019) project, where he leads the research on runtime-varying computing resources to optimize the energy and resource efficiency of iterative parallel-in-time methods: timex.eu.

 Program: C3S2
 Project acronym: ERGO2
 Project title: Advancing ocean data assimilation methodology for climate applications
 Duration: August 2022 - July 2025
 Coordinator: Arthur Vidard
 Other partners: Cerfacs (France), CNR (Italy)
 Abstract: The scope of this contract is to improve ocean data assimilation capabilities at ECMWF, used both in the initialization of seasonal forecasts and in the generation of coupled Earth System reanalyses. In particular it shall focus on i) improving ensemble capabilities in NEMO and NEMOVAR and the use of their information to represent background error statistics; ii) extending NEMOVAR capabilities to allow for multiple resolutions in multi-incremental 3D-Var; iii) making better use of ocean surface observations. It shall also involve performing scout experiments and providing relevant diagnostics to evaluate the benefit of the proposed developments.
10.2.2 Collaborations with Major European Organizations
 Partner: SAMO board
 The SAMO board is in charge of the organization of the SAMO (sensitivity analysis of model outputs) conferences, held every three years. It is strongly supported by the Joint Research Centre of the European Commission. Clémentine Prieur is the current chair of the SAMO board Executive Committee and a member of its Scientific Committee.
10.3 National initiatives
10.3.1 ANR
 A 4-year contract: ANR MOTIONS (Multiscale Oceanic simulaTIONS based on mesh refinement strategies with local adaptation of dynamics and physics). PI: F. Lemarié (Jan. 2024 - Dec. 2027). Other partners: Laboratoire d'Aérologie, UMR 5560 (LAERO), Service Hydrographique et Océanographique de la Marine (SHOM), Institut Camille Jordan, UMR5208 (ICJ), and Laboratoire d'Etudes en Géophysique et Océanographie Spatiales, UMR5566 (LEGOS). The MOTIONS project aims at delivering robust and efficient numerical algorithms enabling an innovative multiscale modeling strategy based on block-structured mesh refinement with local adaptation of model equations, numerics and physics in selected areas of interest. The target application for evaluating the numerical developments is the simulation of important fine-scale non-hydrostatic processes and their feedback to larger scales within the Mediterranean / North-East Atlantic dynamical continuum.
 A 4-year contract: ANR PLUME (Observation and Parameterization of Oceanic Convection). PI: B. Deremble (CNRS), Inria PI: F. Lemarié. The objectives of the project are threefold: (1) build a consistent database of convective events (both in the lab and with a numerical model) in order to calibrate free parameters of parameterizations of deep convection; (2) characterize the structure of the thermal plumes in a well-defined parameter space characterizing the rotating/non-rotating state and forced vs. free convection; (3) use a data-driven approach to formulate a model of convection without any preconceived bias about the mathematical formulation.
 A 4-year contract: ANR ADOM (Asynchronous Domain decomposition methods) anr.fr/ProjetANR18CE460008
 A 5-year contract: ANR MELODY (Bridging geophysics and MachinE Learning for the modeling, simulation and reconstruction of Ocean DYnamics) anr.fr/ProjetANR19CE460011
 A 5-year contract with the French Navy (SHOM) on the improvement of the CROCO ocean model, www.croco-ocean.org.


Title:
MEDIATION. Methodological developments for a robust and efficient digital twin of the ocean.

Duration:
2022–2027

Funding:
French priority research program (PPR) "Ocean and Climate"

Partners:
 Inria (DATAMOVE and ODYSSEY teams)
 CNRS
 IFREMER
 IMT Atlantique
 IRD
 Météo-France
 SHOM
 Univ. Grenoble Alpes
 Univ. Aix-Marseille

Inria contact:
Laurent Debreu

Coordinator:
Laurent Debreu

Summary:
The MEDIATION project targets two questions: how will global change impact the functioning of regional marine ecosystems, and how can the effect of environmental-protection measures be evaluated? With two main demonstrators on the French coasts (Atlantic and Mediterranean), MEDIATION combines methodological developments in the numerical sciences (accounting for uncertainties, high-performance computing and artificial intelligence) with advances in the modeling of physical, biogeochemical and biological processes in the ocean. It aims to set up a modeling chain that integrates data and makes it possible to significantly increase the number of scenarios (climate change, human activities) that can be evaluated. The digital tools developed will also contribute to a better science-society-policy interaction.

10.3.2 Inria Challenge
 Sea Uncertainty Representation and Forecast (SURF)
 Coordinator: Airsea (A. Vidard)
 Inria partners: Ange, Cardamom, Fluminance, Lemon, Mingus, Defi
 External partners: BRGM, Ifremer, SHOM
10.3.3 Other Initiatives
 A 3-year contract (started in September 2021) funded by the Institut des Mathématiques pour la Planète Terre (IMPT) on the topic "Modélisation cohérente des échelles sous-maille pour les modèles océaniques de climat" (consistent subgrid-scale modeling for oceanic climate models). PI: F. Lemarié.
 E. Blayo is co-advising the PhD thesis of Valentin Bellemin-Laponnaz with the IGE Lab, in the framework of the NASA-CNES working group on the SWOT satellite.
 Clémentine Prieur is currently a member of the Executive Committee of GdR MASCOT NUM (GDR 3179, funded by INSMI at CNRS), which she chaired from 2010 to 2017.
11 Dissemination
11.1 Promoting scientific activities
11.1.1 Scientific events: organisation
 Martin Schreiber organized the "1st EuroHPC Workshop on Malleability in HPC".
 Martin Schreiber organized the "1st EuroHPC malleability hackathon" in Grenoble.
 The AIRSEA team organized the "PDEs on the sphere" workshop in Grenoble.
 Martin Schreiber co-organized the minisymposium on "Performance and Real-World Applications" at SIAM CSE23 in Amsterdam, NL.
 E. Blayo organized a one-week school "Traitement des Données Massives et Apprentissage" (massive data processing and machine learning) in Grenoble, June 5–9.
 Clémentine Prieur is the general chair of the next SAMO conference (to be held in Grenoble in 2025).
11.1.2 Scientific events: selection
Member of the conference program committees
 Martin Schreiber, International Conference on Computational Science (ICCS)
 Martin Schreiber, EuroPar Conference (EUROPAR)
 Martin Schreiber, Workshop on “Malleability Techniques Applications in HighPerformance Computing” (HPCMALL)
 Martin Schreiber, International Workshop on OpenCL (IWOCL)
 Clémentine Prieur is a member of the scientific committee of the next international MASCOT NUM conference (organized by Inria Sophia Antipolis).
 Clémentine Prieur is a member of the scientific committee of the next JDS (Journées de Statistique, organized by the SFdS).
Reviewer
 Martin Schreiber: International Conference on Computational Science 2023 (ICCS)
 Martin Schreiber: Workshop on “Malleability Techniques Applications in HighPerformance Computing” 2023 (HPCMALL)
 Martin Schreiber: International Workshop on OpenCL 2023 (IWOCL)
 Martin Schreiber: Supercomputing 2023 (SC23)
11.1.3 Journal
Member of the editorial boards
 F. Lemarié is associate editor of the Journal of Advances in Modeling Earth Systems (JAMES)
 Clémentine Prieur is an associate editor of the journal Computational and Applied Mathematics.
 Clémentine Prieur is an associate editor of the SIAM/ASA Journal on Uncertainty Quantification.
 Clémentine Prieur is a member of the reading committee of Annales Mathématiques Blaise Pascal.
Reviewer  reviewing activities
 Martin Schreiber: Journal of Computational Physics (JCP).
 Olivier Zahm: Statistics and Computing, Mathematics of Computation, Advances in Computational Mathematics, Bernoulli, Multiscale Modeling and Simulation.
 Other permanent members of the Airsea team are also regular reviewers for numerous international journals.
11.1.4 Invited talks
 E. Blayo was an invited plenary speaker at the "Journées Scientifiques Inria 2023", Bordeaux, August 30 – September 1.
 Clémentine Prieur was invited to give a talk at the workshop "Prévisibilité dans les sciences de l'atmosphère, les océans ou le climat" of the GdR "Défis théoriques pour les sciences du climat", October 23, 2023, IHP, Paris.
 Clémentine Prieur was invited to give a series of talks at the virtual seminar on sensitivity analysis (co-organized by H. Moore, Univ. of Florida, and R. Smith, North Carolina State Univ.). Her first talk was given in December 2023 and the next one is already planned for February 2024.
 Clémentine Prieur was invited to give a talk at Bocconi University (Milan, Italy) in November 2023.
 Martin Schreiber: "Time integration methods of climate and weather simulations: Of fairy tales and co-design", University of Colorado Boulder, Computer Science Seminar.
 Martin Schreiber: "Updates on MPI sessions & new dynamic processes", International Supercomputing (ISC) 2023, Hamburg, Germany (talk in a BoF session).
11.1.5 Leadership within the scientific community
 L. Debreu has chaired the CNRS-INSU research program LEFE-MANU on mathematical and numerical methods for the ocean and atmosphere (programmes.insu.cnrs.fr/lefe/cs_actions/manu/) since April 2018.
 F. Lemarié is the co-leader, with Mike Bell (UK Met Office) and Sybille Téchené (CNRS), of the NEMO (www.nemo-ocean.eu) Working Group on numerical kernel development.
 M. Schreiber is a co-organizer of the Birds of a Feather session "Enabling I/O and Computation Malleability in High-Performance Computing" at Supercomputing 2023, Denver, USA.
11.1.6 Scientific expertise
 L. Debreu is a member of the steering committee of the CROCO ocean model (www.croco-ocean.org).
 L. Debreu is a member of the evaluation committee of the axis "Interfaces: mathematics, numerical sciences - earth and environmental sciences" of the French National Research Agency.
 Martin Schreiber is a member of the MPI Forum (representing the Université Grenoble Alpes) and is in particular active in the MPI Sessions working group.
 Martin Schreiber is a member of the OpenMP ARB (representing Inria).
 As part of the TIME-X project, Martin Schreiber is responsible for organizing and identifying possible collaboration topics with other EuroHPC-2019 projects.
 Martin Schreiber was a reviewer for "SIGAP CNRS".
 E. Blayo is a member of the scientific committee of the IMPT (Institut des Mathématiques pour la Planète Terre).
 Clémentine Prieur is an advisor to the scientific council of IFPEN.
11.1.7 Research administration
 E. Blayo is a deputy director of the Jean Kuntzmann Lab.
 Clémentine Prieur is the local correspondent in Grenoble for the Société Mathématique de France (SMF).
 Clémentine Prieur is a member of the MSTIC pole council of UGA.
 Clémentine Prieur is responsible for the applied mathematics track of the doctoral school MSTII (ed-mstii.univ-grenoble-alpes.fr).
11.2 Teaching  Supervision  Juries
11.2.1 Teaching
 Licence: E. Blayo, analysis and algebra, 145h, L1, University Grenoble Alpes, France
 Licence: E. Arnaud, Mathematics for engineers, 50h, L2, UGA, France
 Licence: E. Arnaud, Statistics, 20h, L2, UGA, France
 Licence: Martin Schreiber, Advanced Analysis & Algebra, L1, 69h, UGA, France
 Licence: C. Kazantsev, Mathématiques approfondies pour l'ingénieur, 36h, L2, UGA, France
 Licence: C. Kazantsev, Mathématiques pour les sciences de l'ingénieur, 36h, L2, UGA, France
 Master: E. Blayo, Partial Differential Equations, 24h, M1, University Grenoble Alpes, France
 Master: E. Blayo, Optimal control of PDEs, 15h, M2, University Grenoble Alpes, France
 Master: E. Arnaud, Statistics, 30h, M1, UGA, France
 Master: E. Arnaud, supervision of students in apprenticeship, 30h, M2, UGA, France
 Master: Martin Schreiber, HighPerformance Computing, M1, 31.1h, UGA, France
 Master: Martin Schreiber, Parallel Algorithms and Programming, M1, 11.25h, UGA, France
 Master: Martin Schreiber, Object oriented programming with C++, M1, 18h, UGA, France
 Master: Martin Schreiber, Partial differential equations, M1, 34.5h, UGA, France
 E-learning: E. Arnaud is in charge of the pedagogical platform math@uga: implementation of a collaborative Moodle platform to share pedagogical resources among teachers and with students.
 E. Blayo is in charge of the École des Mathématiques Appliquées: organization and coordination of the pedagogical and administrative aspects of teaching for the applied mathematics department.
11.2.2 Supervision
 Master 1 internship, Julien Remy (“Automatic differentiation with PSyclone”, supervision by M. Schreiber, A. Vidard and M. Brémond)
 PhD in progress: Pierre Lozano, Coupling hydrostatic and nonhydrostatic ocean circulation models. October 2022, E. Blayo and L. Debreu
 PhD in progress: Benjamin Zanger, Compositional surrogates for reduced order modeling, September 2022, M. Schreiber, O. Zahm
 PhD in progress: Dominik Huber, “Dynamic Resource Management with MPI Sessions and PMIx” (preliminary title), PhD candidate at the Technical University of Munich (TUM), M. Schulz (TUM), M. Schreiber
 PhD in progress: Fernando V. Ravelo, PhD candidate at the University of Sao Paulo (USP), P. Peixoto (USP), M. Schreiber
 PhD in progress: Gabriel Derrida, Design of flexible and numerically-sound generalised vertical coordinates with a vertical ALE (VALE) algorithm for operational ocean forecasting. October 2023, L. Debreu and F. Lemarié.
 PhD in progress: Manolis Perrot, Consistent subgrid scale modeling for oceanic climate models. October 2021, E. Blayo and F. Lemarié.
 PhD in progress: Katarina Radisic, Prise en compte d'incertitudes externes dans l'estimation de paramètres d'un modèle de transfert d'eau et de pesticides à l'échelle du bassin versant (accounting for external uncertainties in the parameter estimation of a catchment-scale water and pesticide transfer model), Université Grenoble Alpes, December 2021, C. Lauvernet (INRAE) and A. Vidard.
 PhD in progress: Exaucé Luweh Adjim Ngarti, Deep learning for inverse problems in geophysics, Université Grenoble Alpes, April 2023, E. Arnaud, L. Nicoletti (Atos) and A. Vidard.
 PhD in progress: Hélène Hénon, Assimilation de données variationnelles multifidélité pour les prévisions océaniques (multifidelity variational data assimilation for ocean forecasting), Université Grenoble Alpes, October 2023, A. Vidard.
 PhD in progress: Adama Barry, Plans d'expériences pour la calibration et la validation d'un simulateur numérique (design of experiments for the calibration and validation of a numerical simulator), January 2022, F. Bachoc (Institut de Mathématiques de Toulouse), C. Prieur, promoted by M. Munoz Zuniga and S. Bouquet.
 PhD in progress: Ri Wang, Apprentissage statistique pour l'analyse de sensibilité globale avec entrées dépendantes (statistical learning for global sensitivity analysis with dependent inputs), October 2021, V. Maume-Deschamps (Université Lyon 1), C. Prieur.
 PhD in progress: Robin Vaudry, Analyse de Sensibilité, quantification d'incerTitudes et calibRAtion pour des modèles épidémioloGiques structurés (sensitivity analysis, uncertainty quantification and calibration for structured epidemiological models), October 2021, D. Georges (Grenoble INP), C. Prieur.
 PhD in progress: Clément Duhamel, Inversion robuste d'un code de calcul prenant en entrées des données de nature fonctionnelle. Application à la conception d'éoliennes (robust inversion of a numerical code with functional inputs, applied to wind turbine design), October 2020, C. Helbert (École Centrale de Lyon), C. Prieur, promoted by M. Munoz Zuniga, D. Sinoquet (IFPEN).
 PhD in progress: Romain Verdière, Nonlinear dimension reduction for uncertainty quantification problems, September 2022, O. Zahm.
 PhD in progress: Angélique Saillet, Design of experiments, climate scenarios and ocean simulation, October 2023, É. Blayo, É. Arnaud and C. Prieur.
 PhD in progress: Qiao Chen, Lowdimensional structure of ocean data assimilation problems via gradient information, October 2021, É. Arnaud and O. Zahm
 PhD in progress: Rafael Flock, Certified coordinate selection for high-dimensional imaging problems, October 2021, Y. Dong and O. Zahm
 PhD in progress: Keerthi Gaddameedi, “Dynamic Resource Management for ParallelInTime Simulations” (preliminary title), PhD candidate at the Technical University of Munich (TUM), H.J. Bungartz (TUM), T. Neckel (TUM), M. Schreiber
 Postdoctoral fellow: Alexis Anagnostakis, Kernel-based sensitivity analysis for hypoelliptic systems, P. Etoré (LJK, DATA) and C. Prieur.
 Postdoctoral fellow: Valentin Breaz, Semi-supervised dimension reduction, M. Munoz Zuniga (IFPEN) and O. Zahm.
 PhD: Emilie Rouziès, Quantification et réduction de l'incertitude dans un modèle de transfert de pesticides à l'échelle du bassin versant (uncertainty quantification and reduction in a catchment-scale pesticide transfer model), Université Grenoble Alpes, February 23, 2023, C. Lauvernet (INRAE) and A. Vidard.
 PhD: Rishabh Bhatt, Algorithmes parallèles en temps pour l'assimilation de données (parallel-in-time algorithms for data assimilation), Université Grenoble Alpes, November 23, 2023, L. Debreu and A. Vidard
11.2.3 Juries
 F. Lemarié: May 17, 2023, jury member for the recruitment of a lecturer (MCF, CNU 26) at the École Centrale de Lyon on the topic "Modeling for adaptation and/or transitions and/or the environment, numerical analysis, PDEs".
 Arthur Vidard: Dec. 11, 2023, PhD thesis of Jules Guillot, Univ. Bretagne Sud, "Quantification des incertitudes en assimilation de données" (uncertainty quantification in data assimilation) (reviewer).
 E. Blayo:
 Sept. 29, 2023: PhD thesis of Benjamin Dufée, Univ. Rennes 1 (reviewer)
 Dec. 7, 2023: PhD thesis of Eliot Jager, Univ. Grenoble Alpes (president)
 Dec. 8, 2023: PhD thesis of Youness ElOuartassy, Univ. Toulouse III (reviewer)
 Dec. 21, 2023: PhD thesis of Juliette Dubois, Sorbonne University (reviewer)
 Martin Schreiber:
 Jun. 12, 2023: PhD thesis of Nicolas Monnier "ExaSKA: Parallelization on a High Performance Computing server for the exascale radiotelescope SKA", Université Paris-Saclay (reviewer)
 Nov. 23, 2023: PhD thesis of Rishabh Bhatt “Parallel In Time Algorithms for Data Assimilation”, University Grenoble Alpes (president)
11.3 Popularization
 Clémentine Prieur: talk at the "Café sciences et citoyens" on "Que doit-on craindre des points de basculement du système climatique ?" (what should we fear from tipping points in the climate system?), echosciences-grenoble.fr
 Élise Arnaud: intervention for 9th-grade (3ème) classes at the Simone de Beauvoir middle school as part of the "semaine du numérique et des sciences informatiques" (digital and computer science week)
11.3.1 Internal or external Inria responsibilities
 E. Arnaud: member of Inria PhD recruitment committee CORDIS
 E. Arnaud: in charge of the parity commission at Jean Kuntzmann Lab
 Martin Schreiber represents Inria in the OpenMP ARB.
11.3.2 Articles and contents
 The article "Zeitintegration: Hochskalierbare rationale Approximation von exponentiellen Integratoren" (time integration: highly scalable rational approximation of exponential integrators) by M. Schreiber was published in the Quartl (issue 106), a journal for scientific outreach.
11.3.3 Education
 Martin Schreiber participated in an activity for students of the University of Colorado Boulder by providing answers to 35 questions related to novel time integration methods for weather and climate simulations.
 Ch. Kazantsev and E. Blayo are strongly involved in the creation and dissemination of pedagogical suitcases with mathematical activities designed for primary and secondary schools, as well as an escape game.
 C. Kazantsev is a member of an IREM group creating scientific activities for the professional development of secondary-school teachers.
 C. Kazantsev is a member of an international inter-IREM commission working on multilingualism issues in children's mathematical learning.
11.3.4 Creation of media or tools for science outreach
 C. Kazantsev participated in the edition of the teachers' notebooks that explain and advise how to use the "La Grange Suitcases" (sets of mathematical games, problems and animations), intended for primary- and secondary-school teachers as well as for the general public.
 C. Kazantsev participated in the creation of mathematical activities that can be used autonomously by primary- and secondary-school pupils and by the general public.
 C. Kazantsev participated in the creation of a mathematical serious game that can be played by secondary-school pupils and by the general public. See www.lagrangedesmaths.fr/missionexoplanetes/
11.3.5 Interventions
 E. Blayo gave several outreach talks, in particular for high school students, and for more general audiences.
12 Scientific production
12.1 Publications of the year
International journals
 1 article: On the representation and learning of monotone triangular transport maps. Foundations of Computational Mathematics, November 2023. HAL, DOI.
 2 article: Total Effects with Constrained Features. Statistics and Computing, 2024, 1–29. HAL.
 3 article: Scalable conditional deep inverse Rosenblatt transports using tensor trains and gradient-based dimension reduction. Journal of Computational Physics, July 2023, 112103. HAL, DOI.
 4 article: A SUR version of the Bichon criterion for excursion set estimation. Statistics and Computing, 33(2), April 2023, 41. HAL, DOI.
 5 article: Feasible Set Estimation under Functional Uncertainty by Gaussian Process Modelling. Physica D: Nonlinear Phenomena, 455, December 2023, 133893. HAL, DOI.
 6 article: A probabilistic point of view for the Kolmogorov hypoelliptic equations. ESAIM: Probability and Statistics, 27, July 2023, 668–693. HAL, DOI.
 7 article: Sliding or stumbling on the staircase: numerics of ocean circulation along piecewise-constant coastlines. Journal of Advances in Modeling Earth Systems, 15(5), May 2023, e2022MS003594. HAL, DOI.
 8 article: Global sensitivity analysis of the dynamics of a distributed hydrological model at the catchment scale. Socio-Environmental Systems Modelling, 2023, 1–17. HAL, DOI.
 9 article: How is a global sensitivity analysis of a catchment-scale, distributed pesticide transfer model performed? Application to the PESHMELBA model. Geoscientific Model Development, 16(11), June 2023, 3137–3163. HAL, DOI.
International peerreviewed conferences
 10 inproceedings: Discrete analysis of Schwarz Waveform Relaxation for a simplified air-sea coupling problem with nonlinear transmission conditions. Domain Decomposition Methods in Science and Engineering XXVI (26th International Domain Decomposition Conference), Lecture Notes in Computational Science and Engineering, 145, Hong Kong, China, Springer Cham, April 2023, 189–196. HAL, DOI.
 11 inproceedings: Semi-discrete analysis of a simplified air-sea coupling problem with nonlinear coupling conditions. 27th International Domain Decomposition Conference, Prague, Czech Republic, 2023. HAL.
 12 inproceedings: Towards a finite volume discretization of the atmospheric surface layer consistent with physical theory. FVCA 2023 - Finite Volumes for Complex Applications 10, Springer Proceedings in Mathematics & Statistics (PROMS), 432, Strasbourg, France, 2023, 1–8. HAL, DOI.
 13 inproceedings: A Case Study on PMIx-Usage for Dynamic Resource Management. ISC High Performance 2023 - 38th International Conference, Lecture Notes in Computer Science, 13999, Hamburg, Germany, Springer Nature Switzerland, August 2023, 42–55. HAL, DOI.
 14 inproceedings: Overcoming Weak Scaling Challenges in Tree-Based Nearest Neighbor Time Series Mining. 38th International Conference, ISC High Performance 2023, Lecture Notes in Computer Science, 13948, Hamburg, Germany, Springer Nature Switzerland, May 2023, 317–338. HAL, DOI.
Conferences without proceedings
 15 inproceedings: Goal oriented random forest (GORF). Séminaire Phimeca 2023, Paris, France, 2023. HAL.
 16 inproceedings: Data assimilation to quantify and reduce uncertainty in ecohydrology modelling. Séminaire ITES 2023, Strasbourg, France, 2023. HAL.
 17 inproceedings: Calibrating a hydrological model robustly to rain perturbations with stochastic surrogates. LEFE/MANU 2023 - Colloque sur les méthodes mathématiques et numériques, Toulouse, France, 2023. HAL.
 18 inproceedings: Robust calibration of a hydrological model with stochastic surrogates. MASCOT-NUM 2023 - Workshop on Méthodes d'Analyse Stochastique pour les Codes et Traitement NUMériques, Le Croisic, France, 2023, 1–2. HAL.
 19 inproceedings: Sensitivity analysis with external stochastic forcings: application to a water and pesticide transfer model. MEXICO 2023 - Rencontres annuelles du réseau Mexico, Palaiseau, France, 2023, 1–36. HAL.
Doctoral dissertations and habilitation theses
 20 thesis: Parallel In Time Algorithms for Data Assimilation. Université Grenoble Alpes, November 2023. HAL.
 21 thesis: Advancing the representation of flows along topography in z-coordinate ocean models. Sorbonne Université, September 2023. HAL.
Reports & preprints
 22 misc: Hommage à Elisabeta Vergu. November 2023. HAL.
 23 misc: A probabilistic reduced basis method for parameter-dependent problems. April 2023. HAL.
 24 misc: Self-reinforced polynomial approximation methods for concentrated probability densities. March 2023. HAL.
 25 misc: New estimation of Sobol' indices using kernels. March 2023. HAL.
 26 misc: Certified coordinate selection for high-dimensional Bayesian inversion with Laplace prior. October 2023. HAL, DOI.
 27 misc: Exploiting deterministic algorithms to perform global sensitivity analysis of continuous-time Markov chain compartmental models with application to epidemiology. January 2024. HAL.
 28 misc: Principal Feature Detection via φ-Sobolev Inequalities. May 2023. HAL.
 29 misc: Energetically consistent Eddy-Diffusivity Mass-Flux schemes for Atmospheric and Oceanic Convection. February 2024. HAL.
 30 misc: A generalized rational approximation of exponential integration (REXI) for massively parallel time integration. 2023. HAL.
 31 misc: Parallel-in-time integration of the shallow water equations on the rotating sphere using Parareal and MGRIT. June 2023. HAL, DOI.
 32 misc: Diffeomorphism-based feature learning using Poincaré inequalities on augmented input space. December 2023. HAL.
Other scientific publications
 33 misc: Consideration of external uncertainties in the estimation of parameters of a water and pesticide transfer model at the watershed scale. February 2023. HAL.
 34 inproceedings: State-dependent Preconditioning for Data Assimilation. 54th International Liège Colloquium on Ocean Dynamics: Machine Learning and Data Analysis in Oceanography, Liège, Belgium, May 2023. HAL.
 35 inproceedings: State-dependent Preconditioning for Data Assimilation: Application to a Shallow Water Assimilation system. ISDA 2023 - 9th International Symposium on Data Assimilation, Bologna, Italy, 2023. HAL.
 36 inproceedings: The application of a time exponential integrator to the wave equations, oriented to seismic imaging. LACIAM 2023 - First Latin American Congress of Industrial and Applied Mathematics, Valparaíso, Chile, January 2023. HAL.
12.2 Cited publications
 37 article: Theory and analysis of acoustic-gravity waves in a free-surface compressible and stratified ocean. Ocean Modelling, 168, December 2021, 1–20. HAL, DOI.
 38 article: Global sensitivity analysis with aggregated Shapley effects, application to avalanche hazard assessment. Reliability Engineering and System Safety, 222, 108420, June 2022, 1–11. HAL, DOI.
 39 article: Nonparametric estimation of aggregated Sobol' indices: application to a depth averaged snow avalanche model. International Journal of Reliability, Quality and Safety Engineering, 21(2), August 2021, 107422:114. HAL, DOI.
 40 inproceedings: Stochastic methods for solving high-dimensional partial differential equations. International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing (MCQMC 2018), Springer Proceedings in Mathematics & Statistics, 324, Rennes, France, Springer, July 2018, 125–141. HAL, DOI.
 41 article: A PAC algorithm in relative precision for bandit problem with costly sampling. Mathematical Methods of Operations Research, 96, 2022, 161–185. HAL, DOI.
 42 article: Discrete analysis of Schwarz waveform relaxation for a diffusion reaction problem with discontinuous coefficients. SMAI Journal of Computational Mathematics, 8, April 2022, 99–124. HAL, DOI.
 43 phdthesis: Numerical analysis for a combined space-time discretization of air-sea exchanges and their parameterizations. Université Grenoble Alpes, November 2022. HAL.
 44 article: Brinkman volume penalization for bathymetry in three-dimensional ocean models. Ocean Modelling, 145, January 2020, 1–13. HAL, DOI.
 45 article: Improved Gulf Stream separation through Brinkman penalization. Ocean Modelling, 179, November 2022, 102121. HAL, DOI.
 46 article: Data-driven Stochastic Inversion via Functional Quantization. Statistics and Computing, 30(3), May 2020, 525–541. HAL, DOI.
 47 unpublished: Set inversion under functional uncertainties with Gaussian Process Regression defined in the joint space of control and uncertain. 2021, working paper or preprint. HAL.
 48 article: Global sensitivity analysis for models described by stochastic differential equations. Methodology and Computing in Applied Probability, 22, June 2020, 803–831. HAL, DOI.
 49 inproceedings: An Emulation Layer for Dynamic Resources with MPI Sessions. HPCMALL 2022 - Malleability Techniques Applications in High-Performance Computing, Hamburg, Germany, June 2022. HAL.
 50 article: Sensitivity analysis for multidimensional and functional outputs. Electronic Journal of Statistics, 8(1), 2014, 575–603.
 51 article: Minimizing convex quadratics with variable precision conjugate gradients. Numerical Linear Algebra with Applications, 28(1), 2021, e2337. URL: https://onlinelibrary.wiley.com/doi/abs/10.1002/nla.2337. DOI.
 52 article: Physics-Dynamics Coupling in Weather, Climate, and Earth System Models: Challenges and Recent Progress. Monthly Weather Review, 146(11), November 2018, 3505–3544. HAL, DOI.
 53 article: Transport effect of COVID-19 pandemic in France. Annual Reviews in Control, 50, 2020, 394–408. HAL, DOI.
 54 article: Parallel-in-time multilevel integration of the shallow-water equations on the rotating sphere. Journal of Computational Physics, 407, 2020, 109210.
 55 inproceedings: Calibration of the Voellmy avalanche friction parameters using a Bayesian approach from an high rate positioning avalanche. EGU 2018 - European Geosciences Union General Assembly, Vienna, Austria, April 2018. HAL.
 56 article: Numerical modeling of hydraulic control, solitary waves and primary instabilities in the Strait of Gibraltar. Ocean Modelling, 155, July 2020, 101642. HAL, DOI.
 57 article: Quantification and reduction of uncertainties in a wind turbine numerical model based on a global sensitivity analysis and a recursive Bayesian inference approach. International Journal for Numerical Methods in Engineering, 122(10), May 2021, 2528–2544. HAL, DOI.
 58 article: Wind Turbine Quantification and Reduction of Uncertainties Based on a Data-Driven Data Assimilation Approach. Journal of Renewable and Sustainable Energy, 14(5), September 2022, 053303. HAL, DOI.
 59 inproceedings: Towards Dynamic Resource Management with MPI Sessions and PMIx. EuroMPI/USA'22: Proceedings of the 29th European MPI Users' Group Meeting, Chattanooga, United States, ACM, September 2022, 57–67. HAL, DOI.
 60 article: Shapley effects for sensitivity analysis with correlated inputs: comparisons with Sobol' indices, numerical estimation and applications. International Journal for Uncertainty Quantification, 9(5), 2019, 493–514. HAL, DOI.
 61 article: Machine learning–accelerated computational fluid dynamics. Proceedings of the National Academy of Sciences, 118(21), 2021. DOI.
 62 article: Multivariate sensitivity analysis to measure global contribution of input factors in dynamic models. Reliability Engineering & System Safety, 96(4), 2011, 450–459.
 63 article: Reconciling and improving formulations for thermodynamics and conservation principles in Earth System Models (ESMs). Journal of Advances in Modeling Earth Systems, August 2022, 1–99. HAL, DOI.
 64 article: A simplified atmospheric boundary layer model for an improved representation of air-sea interactions in eddying oceanic models: implementation and first evaluation in NEMO (4.0). Geoscientific Model Development, 14(1), January 2021, 543–572. HAL, DOI.
 65 article: Résolution d'EDP par un schéma en temps « pararéel ». Comptes Rendus de l'Académie des Sciences, Series I Mathematics, 332(7), 2001, 661–668.
 66 article: 3D wave-resolving simulation of sandbar migration. Ocean Modelling, 180, December 2022, 102127. HAL, DOI.
 67 article: A Schwarz iterative method to evaluate ocean-atmosphere coupling schemes: implementation and diagnostics in IPSL-CM6-SW-VLR. Geoscientific Model Development Discussions, 14(5), May 2021, 2959–2975. HAL, DOI.
 68 techreport: Diagnosing the ocean-atmosphere coupling schemes by using a mathematically consistent Schwarz iterative method. Research activities in Earth system modelling, Working Group on Numerical Experimentation, Report No. 51, WCRP Report No. 4/2021, WMO, Geneva, July 2021. HAL.
 69 article: Spatialized Epidemiological Forecasting applied to Covid-19 Pandemic at Departmental Scale in France. Systems and Control Letters, 164, June 2022, 105240. HAL, DOI.
 70 article: On Shapley value for measuring importance of dependent inputs. SIAM/ASA Journal on Uncertainty Quantification, 5(1), September 2017, 986–1002. HAL, DOI.
 71 article: Sobol' indices and Shapley value. SIAM/ASA Journal on Uncertainty Quantification, 2, 2014, 245–251.
 72 article: An effective algorithm for computing global sensitivity indices (EASI). Reliability Engineering & System Safety, 95(4), 2010, 354–360.
 73 article: Global sensitivity measures from given data. European Journal of Operational Research, 226(3), 2013, 536–550.
 74 misc: An explicit exponential time integrator based on Faber polynomials and its application to seismic wave modelling. December 2022. HAL.
 75 article: The Future of Sensitivity Analysis: An essential discipline for systems modeling and policy support. Environmental Modelling and Software, 137, March 2021, 104954. HAL, DOI.
 76 phdthesis: Quantification et réduction de l'incertitude dans un modèle de transfert de pesticides à l'échelle du bassin versant. Université Grenoble Alpes, February 2023. HAL.
 77 techreport: Shapley Effects for Global Sensitivity Analysis: Theory and Computation. Northwestern University, 2015.
 78 article: Analysis of Schwarz Waveform Relaxation for the Coupled Ekman Boundary Layer Problem with Continuously Variable Coefficients. Numerical Algorithms, 89, March 2022, 1145–1181. HAL, DOI.
 79 misc: Deep Learning for Image Super-resolution: A Survey. 2020.