MOISE is a research project-team in applied mathematics and scientific computing, focusing on the development of mathematical and numerical methods for direct and inverse modelling in environmental applications (mainly geophysical fluids). The scientific backdrop of this project-team is the design of complex forecasting systems, with the overall applied aim of contributing to the improvement of such systems, especially those related to natural hazards: climate change, regional forecasting systems for the ocean and atmosphere, decision tools for floods, etc.
A number of specific features are shared by these different applications: interaction of different scales, multi-component aspects, the necessity of combining heterogeneous sources of information (models, measurements, images), and the uniqueness of each event. The development of efficient methods therefore requires taking these features into account, a goal which covers several aspects, namely:
Mathematical and numerical modelling
Data assimilation (deterministic and stochastic approaches)
Quantification of forecast uncertainties
Pluridisciplinarity is a key aspect of the project-team. The more application-oriented part of our work is therefore conducted in close collaboration with specialists from the different fields involved (geophysicists, etc.).
Geophysical flows generally have a number of particularities that make it difficult to model them and that justify the development of specifically adapted mathematical and numerical methods:
Geophysical flows are non-linear. There is often a strong interaction between the different scales of the flows, and small-scale effects (smaller than mesh size) have to be modelled in the equations.
Every geophysical episode is unique: a field experiment cannot be reproduced. Therefore the validation of a model has to be carried out in several different situations, and the role of the data in this process is crucial.
Geophysical fluids are non-closed systems, i.e. there are always interactions between the different components of the environment (atmosphere, ocean, continental water, etc.). Boundary terms are thus of prime importance.
Geophysical flows are often modeled with the goal of providing forecasts. This has several consequences, like the usefulness of providing corresponding error bars or the importance of designing efficient numerical algorithms to perform computations in a limited time.
Given these particularities, the overall objectives of the MOISE project-team described earlier will be addressed mainly by using the mathematical tools presented in the following.
Models allow a global view of the dynamics, consistent in time and space on a wide spectrum of scales. They are based on fluid mechanics equations and are complex since they deal with the irregular shape of domains, and include a number of specific parameterizations (for example, to account for small-scale turbulence, boundary layers, or rheological effects). Another fundamental aspect of geophysical flows is the importance of non-linearities, i.e. the strong interactions between spatial and temporal scales, and the associated cascade of energy, which of course makes their modelling more complicated.
Since the behavior of a geophysical fluid generally depends on its interactions with others (e.g. interactions between ocean, continental water, atmosphere and ice for climate modelling), building a forecasting system often requires coupling different models. Several kinds of problems can be encountered, since the models to be coupled may differ in numerous respects: time and space resolution, physics, dimensions. Depending on the problem, different types of methods can be used, which are mainly based on open and absorbing boundary conditions, multi-grid theory, domain decomposition methods, and optimal control methods.
Despite their permanent improvement, models are always characterized by an imperfect physics and some poorly known parameters (e.g. initial and boundary conditions). This is why it is important to also have observations of natural systems. However, observations provide only a partial (and sometimes very indirect) view of reality, localized in time and space.
Since models and observations taken separately do not allow for a deterministic reconstruction of real geophysical flows, it is necessary to use these heterogeneous but complementary sources of information simultaneously, by using data assimilation methods. These tools for inverse modelling are based on the mathematical theories of optimal control and stochastic filtering. Their aim is to identify system parameters which are poorly known in order to correct, in an optimal manner, the model trajectory, bringing it closer to the available observations.
Variational methods are based on the minimization of a function measuring the discrepancy between a model solution and observations, using optimal control techniques for this purpose. The model inputs are then used as control variables. The Euler-Lagrange condition for optimality is satisfied by the solution of the "Optimality System" (OS), which contains the adjoint model obtained by derivation and transposition of the direct model. It is important to point out that this OS contains all the available information: model, data and statistics. The OS can therefore be considered as a generalized model. The adjoint model is a very powerful tool which can also be used for other applications, such as sensitivity studies.
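As a minimal sketch of this variational approach, the following toy example (all matrices and parameter values are hypothetical, for illustration only) minimizes a quadratic misfit between a linear model trajectory and synthetic observations, with the gradient of the cost function computed by a backward adjoint sweep, i.e. with the transposes of the model and observation operators:

```python
import numpy as np

rng = np.random.default_rng(0)
n, nobs, nsteps = 4, 2, 5

# Toy linear dynamics and observation operator (hypothetical, illustration only)
M = np.eye(n) + 0.01 * rng.standard_normal((n, n))   # model propagator
H = 0.3 * rng.standard_normal((nobs, n))             # observation operator
Binv = np.eye(n)                                     # inverse background covariance
Rinv = np.eye(nobs)                                  # inverse observation covariance

x_true = rng.standard_normal(n)
xb = x_true + 0.1 * rng.standard_normal(n)           # background (first guess)

# Synthetic observations along the true trajectory
obs, x = [], x_true.copy()
for _ in range(nsteps):
    x = M @ x
    obs.append(H @ x + 0.01 * rng.standard_normal(nobs))

def cost_and_grad(x0):
    """Cost J(x0) and its gradient, the latter via the adjoint (M.T, H.T)."""
    xs, x = [], x0.copy()
    J = 0.5 * (x0 - xb) @ Binv @ (x0 - xb)
    for y in obs:                       # forward sweep: trajectory and misfits
        x = M @ x
        xs.append(x)
        J += 0.5 * (H @ x - y) @ Rinv @ (H @ x - y)
    lam = np.zeros(n)                   # backward (adjoint) sweep
    for x, y in zip(reversed(xs), reversed(obs)):
        lam = M.T @ (lam + H.T @ Rinv @ (H @ x - y))
    return J, Binv @ (x0 - xb) + lam

# Steepest descent on the control variable (here, the initial condition)
x0 = xb.copy()
for _ in range(2000):
    J, g = cost_and_grad(x0)
    x0 -= 0.02 * g
```

A real system would replace these matrices by (possibly non-linear) model and observation operators, and the plain descent loop by a quasi-Newton minimizer; the structure (forward sweep for the cost, backward adjoint sweep for the gradient) is the same.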
Stochastic filtering is the basic tool in the sequential approach to the problem of data assimilation into numerical models, especially in meteorology and oceanography. The (unknown) initial state of the system can be conveniently modeled by a random vector, and the error of the dynamical model can be taken into account by introducing a random noise term. The goal of filtering is to obtain a good approximation of the conditional expectation of the system state (and of its error covariance matrix) given the observed data. These data appear as the realizations of a random process related to the system state and contaminated by an observation noise.
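A minimal sketch of one forecast/analysis cycle of the linear Kalman filter, the simplest instance of this sequential approach (all matrices are generic placeholders to be supplied by the application):

```python
import numpy as np

def kalman_step(x, P, y, M, H, Q, R):
    """One forecast/analysis cycle of the (linear) Kalman filter.

    x, P : current state estimate and its error covariance
    y    : new observation vector
    M, H : model propagator and observation operator (matrices)
    Q, R : model-error and observation-error covariances
    """
    # Forecast: propagate the estimate and its error covariance
    xf = M @ x
    Pf = M @ P @ M.T + Q
    # Analysis: correct the forecast with the innovation y - H xf
    S = H @ Pf @ H.T + R             # innovation covariance
    K = Pf @ H.T @ np.linalg.inv(S)  # Kalman gain
    xa = xf + K @ (y - H @ xf)
    Pa = (np.eye(len(x)) - K @ H) @ Pf
    return xa, Pa
```

In the non-linear geophysical setting this exact recursion is unavailable, which is precisely why approximate (e.g. ensemble-based) filters are needed.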
The development of data assimilation methods in the context of geophysical fluids, however, is difficult for several reasons:
the models are often strongly non-linear, whereas the theories result in optimal solutions only in the context of linear systems;
the model error statistics are generally poorly known;
the size of the model state variable is often quite large, which requires dealing with huge covariance matrices and working with very large control spaces;
data assimilation methods generally increase the computational costs of the models by one or two orders of magnitude.
Such methods are now used operationally (after 15 years of research) in the main meteorological and oceanographic centers, but tremendous development is still needed to improve the quality of the identification, to reduce their cost, and to make them available for other types of applications.
A challenge of particular interest consists in developing methods for assimilating image data. Indeed, images and sequences of images represent a large amount of data which are currently underused in numerical forecast systems. However, despite their huge informative potential, images are only used in a qualitative way by forecasters, mainly because of the lack of an appropriate methodological framework.
Due to the strong non-linearity of geophysical systems and to their chaotic behavior, the dependence of their solutions on external parameters is very complex. Understanding the relationship between model parameters and model solutions is a prerequisite to design better models as well as better parameter identification. Moreover, given the present strong development of forecast systems in geophysics, the ability to provide an estimate of the uncertainty of the forecast is of course a major issue. However, the systems under consideration are very complex, and providing such an estimation is very challenging. Several mathematical approaches are possible to address these issues, using either variational or stochastic tools.
Variational approach. In the variational framework, the sensitivity is the gradient of a response function with respect to the parameters or the inputs of the model. Adjoint techniques can therefore be used for this purpose. If sensitivity is sought in the context of a forecasting system assimilating observations, the optimality system must be derived. This leads to the study of second-order properties: the spectrum and eigenvectors of the Hessian provide important information on system behavior.
Global stochastic approach. Using the variational approach to sensitivity leads to efficient computations of complex code derivatives. However, this approach to sensitivity remains local, because derivatives are generally computed at specific points. The stochastic approach to uncertainty analysis aims at studying global criteria describing the global variability of the phenomena. For example, the Sobol sensitivity index is given by the ratio between the output variance conditional on one input and the total output variance. The computation of such quantities leads to statistical problems. For example, the sensitivity indices have to be efficiently estimated from a few runs, using semi-parametric or non-parametric estimation techniques. The stochastic modeling of the input/output relationship is another solution.
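A minimal Monte Carlo sketch of first-order Sobol index estimation, using the classical "pick-freeze" covariance estimator (the inputs are assumed independent and uniform on [0, 1]; the additive test model used below is purely illustrative):

```python
import numpy as np

def sobol_first_order(f, d, n=100_000, rng=None):
    """Monte Carlo 'pick-freeze' estimate of first-order Sobol indices.

    f : model, mapping an (n, d) array of inputs to n scalar outputs
    d : number of independent inputs, here taken uniform on [0, 1]
    """
    rng = rng or np.random.default_rng(0)
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA = f(A)
    var = yA.var()
    S = np.empty(d)
    for i in range(d):
        Ci = B.copy()
        Ci[:, i] = A[:, i]          # freeze input i, resample all the others
        yCi = f(Ci)
        # Cov(yA, yCi) / Var(yA): share of output variance due to input i alone
        S[i] = (np.mean(yA * yCi) - np.mean(yA) * np.mean(yCi)) / var
    return S
```

For the illustrative model y = 3x1 + x2, the indices tend to 0.9 and 0.1. In practice, more sophisticated estimators and sampling designs (such as those implemented in the R package sensitivity mentioned below) are used to reduce the number of model runs, which is the critical cost for expensive geophysical codes.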
The evolution of natural systems, in the short, mid, or long term, has extremely important consequences for both the global Earth system and humanity. Forecasting this evolution is thus a major challenge from the scientific, economic, and human viewpoints.
Humanity has to face the problem of global warming, brought on by the emission of greenhouse gases from human activities. This warming will probably cause huge changes at global and regional scales, in terms of climate, vegetation and biodiversity, with major consequences for local populations. Research has therefore been conducted over the past 15 to 20 years in an effort to model the Earth's climate and forecast its evolution in the 21st century in response to anthropogenic action.
With regard to short-term forecasts, the best and oldest example is of course weather forecasting. Meteorological services have been providing daily short-term forecasts for several decades which are of crucial importance for numerous human activities.
Numerous other problems can also be mentioned, such as seasonal weather forecasting (enabling powerful phenomena like an El Niño event to be anticipated).
As mentioned previously, mathematical and numerical tools are omnipresent and play a fundamental role in these areas of research. In this context, the vocation of MOISE is not to carry out numerical prediction, but to address mathematical issues raised by the development of prediction systems for these application fields, in close collaboration with geophysicists.
Keywords: Multi-resolution, Coupling Methods, Data Assimilation, Ocean, Atmosphere
Understanding and forecasting the ocean circulation is currently the subject of an intensive research effort by the international scientific community. This effort was primarily motivated by the crucial role of the ocean in determining the Earth's climate, particularly from the perspective of global change. In addition, important recent research programs are aimed at developing operational oceanography, i.e. near real-time forecasting of ocean circulation, with applications for ship routing, fisheries, weather forecasting, etc. Another related field is coastal oceanography, dealing for example with pollution, littoral planning, or ecosystem management. Local and regional agencies are currently very interested in numerical modelling systems for coastal areas.
Both ocean-alone models and coupled ocean-atmosphere models are being developed to address these issues. In this context, the MOISE project-team conducts efforts mainly on the following topics:
Multi-resolution approaches and coupling methods: Many applications in coastal and operational oceanography require high resolution local models. These models can either be forced at their boundaries by some known data, or be dynamically coupled with a large-scale coarser resolution model. Such model interactions require specific mathematical studies on open boundary conditions, refinement methods (like mesh refinement or stochastic downscaling), and coupling algorithms. The latter also have to be studied in the context of coupled ocean-atmosphere systems.
Advanced numerical schemes: Most ocean models use simple finite difference schemes on structured grids. We are seeking better schemes that combine accuracy with good conservation properties, while dealing with irregular boundaries and bottom topography.
Data assimilation methods for ocean modelling systems: the main difficulties encountered when assimilating data into ocean or atmosphere models are the huge dimension of the model state vector (typically
Most of these studies are carried out in close interaction with geophysicists, in particular from the Laboratoire des Ecoulements Géophysiques et Industriels (LEGI, Grenoble).
Keywords: Inverse Methods, Data Assimilation, Glaciology, Ice Core Dating
The study of past climate is a means of understanding climatic mechanisms. Drillings in polar ice sheets provide a huge amount of information on paleoclimates: correlation between greenhouse gases and climate, fast climatic variability during the last ice age, etc. However, in order to improve the quantitative use of the data from this archive, numerous questions remain to be answered because of phenomena occurring during and after the deposition of snow. An important research aim is therefore to optimally model ice sheets in the vicinity of drilling sites in order to improve their interpretation: age scale for the ice and for the gas bubbles, mechanical thinning, initial surface temperature and accumulation when snow is deposited, spatial origin of ice from the drilling.
In another respect, ice streams represent an important feature of ice flows, since they account for most of the ice leaving the ice sheet (in Antarctica, it is estimated that ice streams evacuate more than 70% of the ice mass across less than 10% of the coastline). Furthermore, recent observations showed that some important ice streams are presently accelerating. Thus, we seek to improve models of ice sheets by developing data assimilation approaches in order to calibrate them using the available observations.
Another objective is the evaluation of the state of the polar ice caps in the past, and of their interactions with the other components of the Earth's climate, in order to forecast their evolution over the forthcoming centuries. The joint use of models and data, through data assimilation techniques, to improve system description is relatively new in the glaciological community. Therefore, inverse methods have to be developed or adapted for this particular purpose.
By gaining and losing mass, glaciers and ice sheets play a key role in sea level evolution. This is obvious when looking at the past: for example, the collapse of the large Northern Hemisphere ice sheets after the Last Glacial Maximum contributed an increase of 120 m in sea level. It is particularly worrying when the future is considered. Indeed, recent observations clearly indicate that important changes in the velocity structure of both the Antarctic and Greenland ice sheets are occurring, suggesting that large and irreversible changes may have been initiated. This has been clearly emphasized in the last report published by the Intergovernmental Panel on Climate Change (IPCC). The IPCC has further insisted on the poor current knowledge of the key processes at the root of the observed accelerations, and finally concluded that reliable projections of sea-level rise are currently unavailable. In this context, our general aim is to develop data assimilation methods for ice flow modelling, in order to provide accurate and reliable estimations of the future contribution of ice sheets to sea-level rise.
Development of ice flow adjoint models is in itself a scientific challenge. This new step forward is clearly motivated by the amount of data now available at both the local and large scales.
Shallow Water (SW) models are widely used for the numerical modeling of river flows. Depending on the geometry of the domain, the flow regime, and the level of accuracy required, either 1D or 2D SW models are implemented. It is thus necessary to couple 1D models with 2D models when both are used to represent different portions of the same river. Moreover, when a river flows into the sea or ocean (e.g. the Rhône river into the Mediterranean), one may need to couple a 2D SW model with a full 3D model (such as the Navier-Stokes equations) of the estuary. These issues have been widely addressed by the river-engineering community, but often with somewhat crude approaches in terms of coupling algorithms. This may be improved thanks to more advanced boundary conditions, and with the use of Schwarz iterative methods for example. We tackled these issues in the past in the framework of a partnership with the French electricity company EDF, and now thanks to another contract with the ARTELIA Group.
AGRIF (Adaptive Grid Refinement In Fortran) is a Fortran 90 package for the integration of full adaptive mesh refinement (AMR) features within a multidimensional finite difference model written in Fortran. Its main objective is to simplify the integration of AMR capabilities within an existing model with minimal changes. The capabilities of this package include the management of an arbitrary number of grids, horizontal and/or vertical refinements, dynamic regridding, and the parallelization of grid interactions on distributed memory computers. AGRIF requires the model to be discretized on a structured grid, as is typically done in ocean or atmosphere modelling. As an example, AGRIF is currently used in the following ocean models: MARS (a coastal model developed at IFREMER, France), ROMS (a regional model developed jointly at Rutgers and UCLA), the NEMO ocean modelling system (a general circulation model used by the French and European scientific community) and HYCOM (a regional model developed jointly by the University of Miami and the French Navy).
Recent applications produced by the NEMO-AGRIF system have been described in recent publications.
AGRIF is licensed under the GNU General Public License (GPL) and can be downloaded from its web site (http://
NEMOVAR is a state-of-the-art multi-incremental variational data assimilation system dedicated to the European ocean modelling platform NEMO, for research and operational applications. It is co-developed by MOISE, CERFACS (FR), ECMWF (EU) and the MetOffice (UK) under the CeCILL license, and written in Fortran and Python. It is now in use at both ECMWF and the MetOffice for their operational oceanic forecasting systems. It has also been used for specific studies in collaboration with Mercator-Ocean, LPO, LOCEAN and LEGI in France, and the University of Namur in Belgium. It has been adopted as the ocean analysis component in the FP7 project ERA-Clim2 (01/2014-12/2016).
NEMO-TAM (Tangent and Adjoint Models for NEMO), previously part of NEMOVAR and developed by the MOISE team, will now be distributed directly by the NEMO consortium. The first official tagged release including NEMO-TAM was published in early 2013.
Laurent Gilquin is one of the authors of the R package sensitivity (see http://
Céline Helbert is now the maintainer of the packages DiceDesign (see http://
In his PhD, Jérémie Demange has worked on advection-diffusion schemes for ocean models (supervisors: L. Debreu, P. Marchesiello (IRD)). His work focuses on the link between tracer (temperature and salinity) and momentum advection and diffusion in the non-hyperbolic system of equations typically used in ocean models (the so-called primitive equations, with hydrostatic and Boussinesq assumptions). We also investigated the use of a depth-dependent barotropic mode in free surface ocean models. While most ocean models assume that this mode is vertically constant, we have shown that the use of the true barotropic mode, derived from a normal mode decomposition, allows more stability and accuracy in the representation of external gravity waves. A special focus has also been put on the numerical representation of internal gravity waves (IGW). The normal mode decomposition also allows the computation of IGW characteristic variables and speeds, and thus enables the derivation of monotonic advection schemes.
In 2014, we worked on the stability constraints for oceanic numerical models. The idea is to carry out a deep analysis of these constraints in order to propose new time-stepping algorithms for ocean models. Except for vertical diffusion (and possibly the external mode and bottom drag), oceanic models usually rely on explicit time-stepping algorithms subject to Courant-Friedrichs-Lewy (CFL) stability criteria. Implicit methods could be unconditionally stable, but an algebraic system must be solved at each time step, and other considerations such as accuracy and efficiency are less straightforward to achieve. Depending on the target application, the process limiting the maximum allowed time step is generally different. In this work, we introduce offline diagnostics to predict the stability limits associated with internal gravity waves, advection, diffusion, and rotation. This suite of diagnostics is applied to a set of global, regional and coastal numerical simulations with several horizontal/vertical resolutions and different numerical models. We show that, for resolutions finer than 1/2
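Such offline diagnostics amount to comparing simple per-process time-step bounds. A schematic version is sketched below; the safety coefficients are scheme-dependent, so these are illustrative order-of-magnitude diagnostics, not the exact criteria of any particular model:

```python
import numpy as np

def stability_limits(dx, u, c_igw, kappa, f):
    """Rough per-process estimates of the maximum stable time step (s)
    for a 1D explicit scheme; coefficients are scheme-dependent, so this
    is an order-of-magnitude diagnostic, not an exact bound.

    dx    : horizontal grid spacing (m)
    u     : advective velocity scale (m/s)
    c_igw : fastest internal gravity wave speed (m/s)
    kappa : diffusivity (m^2/s)
    f     : Coriolis parameter (1/s)
    """
    limits = {
        "advection": dx / u,
        "internal gravity waves": dx / c_igw,
        "diffusion": dx**2 / (2.0 * kappa),
        "rotation": 1.0 / f,
    }
    limiting = min(limits, key=limits.get)   # the most restrictive process
    return limits, limiting

# e.g. a 10 km grid: 1 m/s advection, 3 m/s internal waves, kappa = 100 m^2/s
lims, worst = stability_limits(1e4, 1.0, 3.0, 100.0, 1e-4)
```

Running such diagnostics across configurations is what reveals which process (here, internal gravity waves) sets the time step at a given resolution.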
The coupling of different types of models has been gaining more and more attention recently. This is due, in particular, to the need for more global models encompassing different disciplines (e.g. multi-physics) and different approaches (e.g. multi-scale, nesting). Also, the possibility of assembling different modeling units inside a friendly modelling software platform is an attractive alternative to developing ever more complex global models. More specifically, one may want to couple 1D to 2D or 3D models, such as Shallow Water and Navier-Stokes models: this was the framework of our partnership with EDF, now extended with the ARTELIA Group.
Following the work done by Manel Tayachi in her PhD, Medhi Pierre Daou has started implementing and analyzing a coupling between the 1D shallow water equations and the 3D Navier-Stokes equations. In the context of our partnership with ARTELIA, he uses industrial codes (Mascaret, Telemac and OpenFoam). A first implementation has been realized in an academic test case, and a second one is presently being implemented in a much more realistic context, in the framework of the European project CRISMA.
Coupling methods routinely used in regional and global climate models do not provide the exact solution to the ocean-atmosphere problem, but an approximate one. For the last few years we have been actively working on the analysis of Schwarz waveform relaxation, in order to apply this type of iterative coupling method to air-sea coupling. In the context of the simulation of a tropical cyclone, sensitivity tests to the coupling method have been carried out in an ensemble approach. We showed that with a mathematically consistent coupling, compared to the coupling methods in vogue in existing coupled models, the spread of the ensemble is reduced, thus indicating a much reduced uncertainty in the physical solution. In 2014, this work has been the subject of several invited conferences and collaborations with geophysicists.
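The idea behind such iterative (Schwarz-type) coupling can be illustrated on a deliberately simple stationary problem: an overlapping Schwarz iteration for a 1D diffusion equation, where each "component" repeatedly solves its own subdomain using the other's latest interface value. Schwarz waveform relaxation applies the same principle to time-dependent subproblems on space-time subdomains; the example below is only a stationary sketch of the mechanism.

```python
import numpy as np

# Overlapping Schwarz iteration for -u'' = 1 on [0, 1] with u(0) = u(1) = 0;
# the exact solution is u(x) = x(1 - x)/2, and the two subdomains
# [0, 0.6] and [0.4, 1] overlap on [0.4, 0.6].
h = 0.01
x = np.arange(0.0, 1.0 + h / 2, h)
i_l = int(round(0.6 / h))       # rightmost grid point of the left subdomain
i_r = int(round(0.4 / h))       # leftmost grid point of the right subdomain

def solve_dirichlet(n, h, ua, ub):
    """Solve -u'' = 1 on n+1 equally spaced points with u[0]=ua, u[n]=ub."""
    A = (2.0 * np.eye(n - 1) - np.eye(n - 1, k=1) - np.eye(n - 1, k=-1)) / h**2
    b = np.ones(n - 1)
    b[0] += ua / h**2
    b[-1] += ub / h**2
    return np.concatenate(([ua], np.linalg.solve(A, b), [ub]))

u = np.zeros_like(x)            # initial guess seeding the interface values
for _ in range(30):
    # The left solve uses the current value at its interface point i_l,
    # then the right solve immediately reuses the fresh left solution at i_r
    u_left = solve_dirichlet(i_l, h, 0.0, u[i_l])
    u_right = solve_dirichlet(len(x) - 1 - i_r, h, u_left[i_r], 0.0)
    u = np.concatenate((u_left[:i_r], u_right))
```

Each sweep contracts the interface error by a fixed factor that improves with the width of the overlap, which is why the iteration converges to the global solution; designing interface conditions that accelerate this convergence is the mathematical core of the air-sea coupling work.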
The past year has also been dedicated to the establishment of strong collaborations between the applied mathematics and climate communities, to assess the impact of our work on IPCC-like climate models and to go further in the theoretical work by including the formulation of physical parameterizations. As a result, a PhD thesis (C. Pelletier), funded by Inria, started in fall 2014 in collaboration with the LSCE (Laboratoire des Sciences du Climat et de l'Environnement). Moreover, a PPR (Projet à partenariat renforcé) called SIMBAD (SIMplified Boundary Atmospheric layer moDel for ocean modeling purposes) is funded by Mercator-Ocean for the next three years. The aim of this project, in collaboration with Meteo-France, Ifremer, LMD, and LOCEAN, is to derive a metamodel to force high-resolution oceanic operational models for which the use of a full atmospheric model is not possible due to a prohibitive computational cost.
In order to lower the computational cost of the variational data assimilation process, we investigate the use of multigrid methods to solve the associated optimal control system. On a linear advection equation, we study the impact of the regularization term on the optimal control, and the impact of discretization errors on the efficiency of the coarse-grid correction step. We show that even if the optimal control problem leads to the solution of an elliptic system, numerical errors introduced by the discretization can alter the success of the multigrid methods. Viewing the multigrid iteration as a preconditioner for a Krylov optimization method leads to a more robust algorithm. A scale-dependent weighting of the multigrid preconditioner and the usual preconditioner based on the background error covariance matrix is proposed and brings significant improvements. This work is presented in a paper submitted to QJRMS. A book chapter on multiresolution methods for data assimilation has also been published.
One of the main limitations of the current operational variational data assimilation techniques is that they assume the model to be perfect, mainly because of computing cost issues. Numerous studies have been carried out to reduce the cost of controlling model errors, by controlling the correction term only in certain privileged directions, or by controlling only the systematic and time-correlated part of the error.
Both of the above methods consider the model errors as a forcing term in the model equations. Trémolet (2006) describes another approach, where the full state vector (a 4D field: 3D in space plus time) is controlled. Because of the computing cost, one obviously cannot control the model state at each time step. Therefore, the assimilation window is split into sub-windows, and only the initial condition of each sub-window is controlled, the junctions between sub-windows being penalized. One interesting property is that, in this case, the gradient computations for the different sub-windows are independent and can therefore be done in parallel.
This method is now implemented in a realistic oceanic framework using OPAVAR/NEMOVAR. The plan is to extend this study, focusing on the parallel aspects of such an approach.
At the present time the observation of Earth from space is done by more than thirty satellites. These platforms provide two kinds of observational information:
Eulerian information, such as radiance measurements: the radiative properties of the earth and its fluid envelopes. These data can be plugged into numerical models by solving inverse problems.
Lagrangian information: the movement of fronts and vortices give information on the dynamics of the fluid. Presently this information is scarcely used in meteorology by following small cumulus clouds and using them as Lagrangian tracers, but the selection of these clouds must be done by hand and the altitude of the selected clouds must be known. This is done by using the temperature of the top of the cloud.
MOISE was the leader of the ANR ADDISA project dedicated to the assimilation of images, and is a member of its follow-up GeoFluids (along with EPI FLUMINANCE and CLIME, and LMD, IFREMER and Météo-France) that just ended in 2013.
During the ADDISA project we developed Direct Image Sequences Assimilation (DISA) and proposed a new scheme for the regularization of optical flow problems. Thanks to the nonlinear brightness assumption, we proposed an algorithm to estimate the motion between two images, based on the minimization of a nonlinear cost function. We demonstrated its efficiency and robustness on simulated and experimental geophysical flows. As part of the ANR project GeoFluids, we are investigating new ways to define distances between a pair of images. One idea is to compare the gradients of the images rather than the actual pixel values. This leads to promising results. Another idea, currently under investigation, consists in comparing the main structures within each image. This can be done using, for example, a wavelet representation of the images. Both approaches have been compared, in particular their relative merits in dealing with observation errors, in a paper accepted in late 2014 and presented in several national and international conferences.
Vincent Chabot also defended his PhD in July 2014.
In recent developments we have also used "Level Sets" methods to describe the evolution of the images. The advantage of this approach is that, thanks to the level-set function, it makes it possible to consider the images as a state variable of the problem. We have derived an Optimality System including the level sets of the images.
Within the optimal transport project TOMMI funded by the ANR white program (started mid-2011), a new optimization scheme based on a proximal splitting method has been proposed to solve the dynamic optimal transport problem. We investigate the use of optimal transport based distances for data assimilation. N. Feyeux started his PhD on this subject last year. The study is still under investigation, but preliminary encouraging results have already been presented twice, in France and Austria.
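In one dimension, the Wasserstein (optimal transport) distance between two densities reduces to an L2 distance between their inverse cumulative distribution functions, which gives a compact illustration of this family of distances (the discretization below is a rough sketch, not the dynamic formulation solved by proximal splitting in TOMMI):

```python
import numpy as np

def w2_1d(p, q, x):
    """Squared Wasserstein-2 distance between two 1D histograms p and q
    supported on the grid x, via inverse CDFs (exact in one dimension)."""
    p = p / p.sum()
    q = q / q.sum()
    # Quantile levels at which the two inverse CDFs are compared
    u = np.linspace(0.0, 1.0, 1000, endpoint=False) + 0.0005
    Fp = np.cumsum(p)
    Fq = np.cumsum(q)
    xp = x[np.searchsorted(Fp, u)]      # inverse CDF of p at levels u
    xq = x[np.searchsorted(Fq, u)]      # inverse CDF of q at levels u
    return np.mean((xp - xq) ** 2)
```

Unlike a pointwise (e.g. L2) misfit, this distance compares where the mass sits, so a small spatial shift of a feature yields a small distance, which is precisely the property that makes transport-based misfits attractive for assimilating displaced structures.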
The Back and Forth Nudging (BFN) algorithm was introduced recently; its appeal lies in its simplicity, since it requires neither linearization, nor an adjoint equation, nor a minimization process, in contrast with variational schemes. Nevertheless, it provides a new estimate of the initial condition at each iteration.
Previous theoretical results showed that BFN was often ill-posed for viscous partial differential equations. To overcome this problem, we proposed a new version of the algorithm, called the Diffusive BFN, which showed very promising results on one-dimensional viscous equations. Experiments on more sophisticated geophysical models, such as the Shallow-Water equations and the NEMO ocean model, are still in progress, in collaboration with the University of Nice, and have been presented at the ICIPE conference.
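The basic (non-diffusive) BFN mechanism can be sketched on a toy linear oscillator with the full state observed at every time step: the gain K damps the model toward the observations in the forward sweep, and the sign flip keeps the nudging term dissipative in the backward sweep (all parameter values here are illustrative):

```python
import numpy as np

# BFN sketch on a 2D rotation (a linear "wave" model); the full state is
# assumed observed at every step, and K is the nudging gain.
omega, K, dt, nsteps = 1.0, 2.0, 0.001, 1000
A = np.array([[0.0, omega], [-omega, 0.0]])

# Truth and observations generated with the same explicit Euler scheme
x = np.array([1.0, 0.0])
y_obs = [x.copy()]
for _ in range(nsteps):
    x = x + dt * (A @ x)
    y_obs.append(x.copy())

x0 = np.array([2.0, 1.0])          # deliberately wrong initial-condition guess
for _ in range(4):
    # Forward sweep: nudge the trajectory toward the observations
    x = x0.copy()
    for k in range(nsteps):
        x = x + dt * (A @ x + K * (y_obs[k] - x))
    # Backward sweep (reversed time): the sign of the dynamics is flipped,
    # and the nudging term keeps the backward integration damped
    for k in range(nsteps, 0, -1):
        x = x + dt * (-A @ x + K * (y_obs[k] - x))
    x0 = x                          # new estimate of the initial condition
```

Each forward-backward pair contracts the initial-condition error by roughly exp(-2KT), which is the convergence mechanism of BFN; for viscous equations the backward integration is ill-posed, which is what motivates the Diffusive BFN variant.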
A variational data assimilation technique is applied to the identification of the optimal boundary conditions for a simplified configuration of the NEMO model. A rectangular box model placed in mid-latitudes, and subject to the classical single- or double-gyre wind forcing, is studied. The model grid can be rotated by a desired angle around the center of the rectangle, in order to simulate a boundary approximated by a staircase-like coastline. The solution of the model on the grid aligned with the box borders is used as a reference solution and as artificial observational data. It is shown that the optimal boundary has a rather complicated geometry, which is neither a staircase nor a straight line. The boundary conditions found in the data assimilation procedure bring the solution toward the reference solution, making it possible to correct the influence of the rotated grid (see figure).
Adjoint models, necessary for variational data assimilation, have been produced by the TAPENADE software, developed by the SCIPORT team. This software has been shown to be able to produce the adjoint code, which can be used in data assimilation after a memory-usage optimization.
We are heavily involved in the development of NEMOVAR (variational assimilation for NEMO). Several years ago, we set up a working group (coordinated by A. Vidard) in order to bring together various NEMOVAR user groups with diverse scientific interests (ranging from singular vector and sensitivity studies to specific issues in variational assimilation). It led to the creation of the VODA (Variational Ocean Data Assimilation for multi-scale applications) ANR project (ended in 2012). A new project, part of a larger EU-FP7 project (ERA-CLIM2), started in January 2014.
The project aims at delivering a common NEMOVAR platform, based on the NEMO platform, for 3D and 4D variational assimilation. Following the 2009-11 VODA activities, a fully parallel version of NEMOTAM (Tangent and Adjoint Model for NEMO) is now available to the community in the standard NEMO distribution. This version is based on release 3.4.1 of NEMO.
We are also investigating variational data assimilation methods applied to high-resolution ocean numerical models (see figure ). This part of the project is now well advanced, and encouraging preliminary results are available on an idealised numerical configuration of an oceanic basin. Several novel diagnostics have also been developed in this framework as part of P.A. Bouttier's PhD, defended in early 2014 .
Lastly, multi-resolution algorithms have been developed to solve the variational problem. An EU-ITN (International Training Network) project will be submitted in early 2015 to continue work on this particular aspect.
In collaboration with C. Ritz (CNRS, Laboratoire de Glaciologie et Géophysique de l'Environnement (LGGE), Grenoble), we aim to develop inverse methods for ice cap models.
In the context of global warming, the evolution of sea level is a major but poorly understood phenomenon. It is difficult to validate the models used to predict sea-level rise, because observations are heterogeneous and sparse.
Data acquisition in polar glaciology is difficult and expensive. Satellite data have good spatial coverage, but they provide only indirect observations of the quantities of interest. Moreover, ice dynamics processes are highly nonlinear and involve many feedback loops, so that classical linear data assimilation gives poor results.
B. Bonan defended his PhD in November 2013 on this subject. We implemented the Ensemble Transform Kalman Filter (ETKF) algorithm for a flowline shallow-ice model, called Winnie, developed by C. Ritz at LGGE. Twin experiments gave very promising results, and we now want to implement this method in a full 3D model. A journal paper has been published on this subject , and the results have been presented at the conference .
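The analysis step of the ETKF can be sketched on a toy state vector. Everything below (dimensions, operators H and R, the ensemble) is invented for illustration; this is not the Winnie/LGGE implementation.

```python
import numpy as np

# One ETKF analysis step: update a forecast ensemble with observations,
# working in the low-dimensional ensemble space.
rng = np.random.default_rng(1)
n, Ne, m = 5, 20, 3                      # state size, ensemble size, obs size
X = rng.standard_normal((n, Ne))         # forecast ensemble
H = rng.standard_normal((m, n))          # linear observation operator
R_inv = np.eye(m)                        # obs-error precision (here R = I)
y = rng.standard_normal(m)               # observations

xm = X.mean(axis=1)
A = X - xm[:, None]                      # ensemble perturbations
S = H @ A                                # perturbations in observation space
d = y - H @ xm                           # innovation

# Analysis weights in ensemble space.
Pa = np.linalg.inv((Ne - 1) * np.eye(Ne) + S.T @ R_inv @ S)
wm = Pa @ S.T @ R_inv @ d                # mean-update weights
evals, evecs = np.linalg.eigh((Ne - 1) * Pa)
W = evecs @ np.diag(np.sqrt(evals)) @ evecs.T   # symmetric square root

Xa = xm[:, None] + A @ (wm[:, None] + W)        # analysis ensemble
xa = Xa.mean(axis=1)                            # analysis mean
```

In this linear setting the ETKF analysis mean and covariance coincide exactly with the Kalman update built from the ensemble covariance, which provides a simple correctness check.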
Forecasting geophysical systems requires complex models, which sometimes need to be coupled, and which make use of data assimilation. The objective of this project is, for a given output of such a system, to identify the most influential parameters and to evaluate the effect of input-parameter uncertainty on the model output. Existing stochastic tools are not well suited to high-dimensional problems (in particular time-dependent ones), while deterministic tools are fully applicable but provide only limited information. The challenge is therefore to gather expertise on numerical approximation and control of partial differential equations on the one hand, and on stochastic methods for sensitivity analysis on the other, in order to design innovative stochastic solutions for studying high-dimensional models and to propose new hybrid approaches combining stochastic and deterministic methods.
A first task is to develop tools for estimating sensitivity indices.
In variance-based sensitivity analysis, a classical tool is the method of Sobol', which computes Sobol' indices using Monte Carlo integration. One of the main drawbacks of this approach is that the estimation of Sobol' indices requires the use of several samples.
In a recent work we introduced a new approach estimating all first-order Sobol' indices using only two samples based on replicated Latin hypercubes, and all second-order Sobol' indices using only two samples based on replicated randomized orthogonal arrays. We established theoretical properties of this method for the first-order Sobol' indices and discussed its generalization to higher-order indices. As an illustration, we applied this new approach to a marine ecosystem model of the Ligurian Sea (northwestern Mediterranean) in order to study the relative importance of its parameters. The calibration process of this kind of chemical simulator is well known to be quite intricate, and a rigorous and robust sensitivity analysis (i.e. one valid without strong regularity assumptions), such as the method of Sobol' provides, can be of great help. The computations were performed using CIGRI, the middleware of the grid of the Grenoble University High Performance Computing (HPC) center. We are also applying these estimators to calibrate integrated land use and transport models. As some groups of inputs of these models are correlated, Laurent Gilquin extended the approach based on replicated designs to the estimation of grouped Sobol' indices .
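The replication idea for first-order indices can be sketched as follows. This is a toy implementation with a made-up additive test function, not the team's code or the CIGRI computations: two designs share the same one-dimensional projections, and aligning them on one column isolates that input's contribution to the variance.

```python
import numpy as np

# First-order Sobol' indices from two replicated Latin hypercube designs.
# Test function: f(x) = x1 + 2*x2 + 3*x3 with independent U(0,1) inputs,
# whose exact first-order indices are (1, 4, 9) / 14.
rng = np.random.default_rng(2)
N, d = 100_000, 3

# Shared stratified column values, paired by two independent sets of
# random permutations -> two replicated LHS designs.
values = (np.arange(N)[:, None] + rng.random((N, d))) / N
X1 = np.column_stack([values[rng.permutation(N), j] for j in range(d)])
X2 = np.column_stack([values[rng.permutation(N), j] for j in range(d)])

f = lambda x: x[:, 0] + 2 * x[:, 1] + 3 * x[:, 2]
y1, y2 = f(X1), f(X2)
var = y1.var()

S = np.empty(d)
for j in range(d):
    # After sorting on column j, row r of each design carries the same
    # value of x_j, while the other inputs are redrawn independently,
    # so the covariance of the paired outputs estimates Var(E[f | x_j]).
    o1, o2 = np.argsort(X1[:, j]), np.argsort(X2[:, j])
    S[j] = (np.mean(y1[o1] * y2[o2]) - y1.mean() * y2.mean()) / var
```

Only two model samples of size N are needed, whatever the input dimension, which is the main appeal of the replication approach.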
One can then ask about the asymptotic properties of these new estimators, as well as of more classical ones. In , the authors deal with asymptotic properties of the estimators. In , the authors also establish a multivariate central limit theorem and non-asymptotic properties.
Another topic developed in the team for sensitivity analysis is model reduction. More precisely, the aim is to reduce the number of unknown variables (to be computed by the model) using a well-chosen basis. Instead of discretizing the model over a huge grid (with millions of points), the state vector of the model is projected onto the subspace spanned by this basis (of far smaller dimension). The choice of the basis is of course crucial and determines the success or failure of the reduced model. Various model reduction methods offer various choices of basis functions. A well-known method is proper orthogonal decomposition, also called principal component analysis. More recent and sophisticated methods also exist and may be studied, depending on the needs raised by the theoretical study. Model reduction is a natural way to overcome the difficulties due to the huge computation times induced by discretization on fine grids. In , the authors present a reduced-basis offline/online procedure for the viscous Burgers initial-boundary value problem, enabling efficient approximate computation of the solutions of this equation for parametrized viscosity and initial and boundary data. This procedure comes with a fast-evaluated rigorous error bound certifying the approximation. The numerical experiments in the paper show significant computational savings, as well as the efficiency of the error bound.
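The basic proper-orthogonal-decomposition mechanism can be illustrated on synthetic snapshots. This is a generic SVD-based sketch with an invented field, not the certified reduced-basis procedure of the cited work: the snapshot matrix is built from three spatial modes, and the POD basis recovers them.

```python
import numpy as np

# POD of a snapshot matrix via the SVD: find the low-dimensional basis
# that best captures the "energy" of the snapshots.
x = np.linspace(0.0, 1.0, 200)           # spatial grid
t = np.linspace(0.0, 1.0, 50)            # snapshot times

# Snapshots of a field driven by exactly three smooth spatial modes.
snapshots = np.column_stack([
    np.sin(np.pi * x) * np.cos(2 * np.pi * tk)
    + 0.5 * np.sin(2 * np.pi * x) * np.sin(2 * np.pi * tk)
    + 0.1 * np.sin(3 * np.pi * x) * tk
    for tk in t
])                                       # shape (200, 50)

U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.9999)) + 1   # modes for 99.99% energy

# Project each snapshot on the r-dimensional POD basis and reconstruct.
basis = U[:, :r]                         # reduced basis (orthonormal)
reduced = basis.T @ snapshots            # r coefficients per snapshot
reconstruction = basis @ reduced
err = np.linalg.norm(snapshots - reconstruction) / np.linalg.norm(snapshots)
```

A reduced model would then evolve only the r coefficients instead of the 200 grid values; the certified procedures add a rigorous bound on the resulting error.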
When a metamodel is used (for example a reduced-basis metamodel, but also kriging, regression, ...), the error introduced by the approximation must also be quantified. Let us come back to the output of interest: is it possible to get better error certification when the output is specified? A work in this direction, dealing with goal-oriented uncertainty assessment, has been submitted .
An important challenge for stochastic sensitivity analysis is to develop methodologies that work for dependent inputs. For the moment, no conclusive results exist in that direction. Our aim is to define an analogue of the Hoeffding decomposition in the case where input parameters are correlated. Clémentine Prieur supervised Gaëlle Chastaing's PhD thesis on this topic (defended in September 2013) . We obtained first results , deriving a general functional ANOVA for dependent inputs, which allows the definition of new variance-based sensitivity indices for correlated inputs. We then adapted various algorithms for the estimation of these new indices. These algorithms assume that, among the potential interactions, only a few are significant. Two papers have recently been accepted , . We also considered (see the paragraph ) the estimation of grouped Sobol' indices, with a procedure based on replicated designs. These indices provide information at the level of groups, not at a finer level, but their interpretation remains rigorous.
Céline Helbert and Clémentine Prieur supervise the PhD thesis of Simon Nanty (funded by CEA Cadarache). The subject of the thesis is the analysis of uncertainties for numerical codes with temporal and spatio-temporal input variables, with application to safety and impact studies. This study involves functional, dependent inputs. A first step is the modeling of these inputs, and a paper has been submitted .
Federico Zertuche's PhD concerns the modeling and prediction of a numerical output of a computer code when multiple levels of fidelity of the code are available. A low-fidelity output can be obtained, for example, on a coarse mesh; it is cheaper, but also much less accurate, than a high-fidelity output obtained on a fine mesh. In this context, we propose new approaches that relax some restrictive assumptions of existing methods ( , ): a new estimation method for the classical cokriging model when designs are not nested, and a nonparametric modeling of the relationship between the low-fidelity and high-fidelity levels. The PhD takes place within the ReDICE consortium, in close collaboration with industry. The first part of the thesis was also dedicated to the development of a case study in fluid mechanics with CEA, in the context of the study of a nuclear reactor.
The second part of the thesis was dedicated to the development of a new sequential approach based on a coarse-to-fine wavelet algorithm. Federico Zertuche presented his work at the annual meeting of the GDR MASCOT NUM in 2014 .
A main advantage of variational methods in data assimilation is that they exhibit a so-called optimality system (OS) that contains all the available information: model, data, statistics. A sensitivity analysis (i.e. the evaluation of the gradient) with respect to the inputs of the model therefore has to be carried out on the OS. With iMech and INM, we have applied sensitivity analysis in the framework of a pollution problem in a lake. Second-order sensitivity analysis makes it possible to evaluate the sensitivity with respect to the observations and, furthermore, to determine the optimal location of new sensors at the points of highest sensitivity , .
This methodology has been applied to:
Oil spills. Recent years have seen several disasters caused by the wrecking of ships and drifting platforms, with severe consequences for the physical and biological environment. In order to minimize the impact of these oil spills, it is necessary to predict the evolution of the oil spot. Some basic models are available, and some satellites provide images of the evolution of oil spots. Clearly this topic is a combination of the two previous ones: data assimilation for pollution and assimilation of images. A theoretical framework has been developed with Dr. Tran Thu Ha (iMech).
Data assimilation in supercavitation (with iMech). Some self-propelled submarine devices can reach a high speed thanks to the phenomenon of supercavitation: an air bubble is created at the nose of the device and reduces drag forces. Some models of supercavitation already exist, and we are working on two applications of variational methods to supercavitation:
Parameter identification: the models have some parameters that cannot be measured directly. From observations, we retrieve the unknown parameters using the classical formalism of inverse problems.
Shape optimization: the question is to determine an optimal design of the shape of the engine in order to reach a maximum speed.
We are interested in the tracking of mesoscale convective systems. A particular region of interest is West Africa. Data and hydrological expertise are provided by T. Vischel and T. Lebel (LTHE, Grenoble).
A first approach involves adapting the multiple hypothesis tracking (MHT) model, originally designed by NCAR (the National Center for Atmospheric Research) for tracking storms, to the data for West Africa. With A. Makris (on a post-doctoral position), we proposed a Bayesian approach , which considers that the state at time t is composed of, on the one hand, the events (birth, death, splitting, merging) and, on the other hand, the targets' attributes (positions, velocities, sizes, ...). The model decomposes the state into two sub-states: the events and the target positions/attributes. The events are updated first, conditioned on the previous target sub-state; then, given the new events, the target sub-state is updated. A simulation study verified that this approach improves upon the frequentist approach of Storlie et al. (2009). It has been tested on simulations and investigated in the specific context of real data on West Africa (submitted paper). Using PHD (probability hypothesis density) filters adapted to our problem, generalising recent developments in particle filtering for spatio-temporal branching processes (e.g. ), could be an interesting alternative to explore. The idea of a dynamic, stochastic tracking model should then provide the basis for generating rainfall scenarios over a relatively vast area of West Africa, in order to identify the main sources of variability in the monsoon phenomenon.
Studying risks in a spatio-temporal context is a very broad field of research, and one that lies at the heart of current concerns at a number of levels (hydrological risk, nuclear risk, financial risk, etc.). Stochastic tools for risk analysis must be able to determine both the intensity and the probability of occurrence of damaging events such as extreme floods, earthquakes or avalanches. It is important to develop effective methodologies to prevent natural hazards, including e.g. the construction of dams.
Different risk measures have been proposed in the one-dimensional framework . The most classical ones are the return level (equivalent to the Value at Risk in finance) and the mean excess function (equivalent to the Conditional Tail Expectation, CTE). However, most of the time there are multiple risk factors, whose dependence structure has to be taken into account when designing suitable risk estimators. Relatively recent regulation (such as Basel II for banks or Solvency II for insurance) has been a strong driver for the development of realistic spatio-temporal dependence models, as well as of multivariate risk measures that effectively account for these dependencies.
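The one-dimensional notion of return level can be illustrated on a synthetic example. All names and values below are invented: annual maxima are drawn from a Gumbel law, whose exact T-year return level is known in closed form, and the empirical quantile estimator is compared against it. This is purely an illustration of the definition, not the team's estimators.

```python
import numpy as np

# T-year return level: the level exceeded on average once every T years,
# i.e. the (1 - 1/T)-quantile of the annual-maximum distribution.
rng = np.random.default_rng(4)
mu, beta, n = 10.0, 2.0, 200_000
# Gumbel samples by inverse-CDF: F(z) = exp(-exp(-(z - mu)/beta)).
maxima = mu - beta * np.log(-np.log(rng.random(n)))

def return_level(sample, T):
    """Empirical (1 - 1/T)-quantile of the annual-maximum sample."""
    return np.quantile(sample, 1.0 - 1.0 / T)

T = 100
exact = mu - beta * np.log(-np.log(1.0 - 1.0 / T))   # closed-form level
estimate = return_level(maxima, T)
```

The multivariate extensions discussed below replace this single quantile by level sets of a joint distribution, which is where the dependence structure enters.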
We refer to for a review of recent extensions of the notion of return level to the multivariate framework. In the context of environmental risk, a generalization of the concept of return period to dimension greater than or equal to two was proposed in . De Michele et al. proposed in a recent study to take into account the duration, and not only the intensity, of an event when designing what they call the dynamic return period. However, few studies address the issues of statistical inference in the multivariate context. In , , we proposed nonparametric estimators of a multivariate extension of the CTE. As might be expected, the properties of these estimators deteriorate when considering extreme risk levels. In collaboration with Elena Di Bernardino (CNAM, Paris), Clémentine Prieur is working on the extrapolation of the above results to extreme risk levels.
Elena Di Bernardino, Véronique Maume-Deschamps (Univ. Lyon 1) and Clémentine Prieur also derived an estimator of the bivariate tail . The study of tail behavior is of great importance for risk assessment.
With Anne-Catherine Favre (LTHE, Grenoble), Clémentine Prieur supervises the PhD thesis of Patricia Tencaliec. We are working on risk assessment for flood data from the Durance drainage basin (France). The PhD thesis started in October 2013. A first paper, on data reconstruction, has been submitted; this was a necessary step, as the initial series contained many missing data.
This research is the subject of a collaboration with Venezuela (Professor Jose R. Leon, Caracas Central University) and is partly funded by an ECOS Nord project.
We are focusing our attention on models derived from the linear Fokker-Planck equation. From a probabilistic viewpoint, these models have received particular attention in recent years, since they are a basic example of hypocoercivity. In fact, even though completely degenerate, these models are hypoelliptic and still verify some coercivity properties, in a broad sense. Such models often appear in mechanics, finance and even biology. For such models we believe it appropriate to build nonparametric statistical estimation tools. Initial results have been obtained for the estimation of the invariant density, under conditions guaranteeing its existence and uniqueness, and when only partial observational data are available. A paper on the nonparametric estimation of the drift has recently been accepted (see Samson et al., 2012, for results on parametric models). As far as the estimation of the diffusion term is concerned, a paper has been submitted , in collaboration with J.R. León (Caracas, Venezuela) and P. Cattiaux (Toulouse). Recursive estimators have also been proposed by the same authors in a recently submitted paper.
Note that Professor Jose R. León (Caracas, Venezuela) is now funded by an Inria International Chair and will spend one year in our team, allowing further collaboration on parameter estimation.
Given the complexity of modern urban areas, designing sustainable policies calls for more than sheer expert knowledge. This is especially true of transport and land use policies, because of the strong interplay between the land use and transportation systems. Land use and transport integrated (LUTI) modelling offers invaluable analysis tools for planners working on transportation and urban projects. Yet very few local authorities in charge of planning make use of these strategic models. The explanation lies first in the difficulty of calibrating these models, and second in the lack of confidence in their results, which itself stems from the absence of any well-defined validation procedure. Our expertise in such matters will probably be valuable for improving the reliability of these models. To that end, we participated in setting up the ANR project CITiES, led by the STEEP EPI. This project started in early 2013, and two PhD theses on sensitivity analysis and calibration were launched in late 2013. This work has led to conference papers , and a submitted journal paper .
A 1-year contract with NOVELTIS on the theme "development of demonstrators with AGRIF": see
A 4-year contract named ReDICE (Re Deep Inside Computer Experiments) with EDF, CEA, IRSN, RENAULT and IFP on the theme of computer experiments.
A 3-year contract (2015-2018), Projet à Partenariat Renforcé SIMBAD (SIMplified Boundary Atmospheric layer moDel for ocean modeling purposes), with the private company Mercator-Ocean (coordinator: F. Lemarié).
A 3-year contract with ARTELIA Group: funding for the PhD thesis of M.P. Daou (CIFRE): see
Clémentine Prieur is a member of the project "Soutien à l'Excellence et à l'Innovation Grenoble INP" MEPIERA (MEthodologies innovantes Pour l'Ingénierie de l'Eau et des Risques Associés), led by A.-C. Favre (LTHE).
N. Feyeux's PhD is sponsored by the ARC3 Environment action of the Rhône-Alpes Region.
LGGE Grenoble, Edge team (C. Ritz, O. Gagliardini, F. Gillet-Chaulet, G. Durand), see paragraphs .
LTHE, A.C. Favre: hydrological risk assessment.
LTHE, Thierry Lebel, Théo Vischel: tracking of mesoscale convective systems,
LTHE, MISTIS, LJK: AGIR project. Clémentine Prieur obtained the funding for a thesis on risk assessment.
Univ. Lyon 1 collaboration with V. Maume-Deschamps.
| Participants | Inria Project-Team | Research topic | Link |
| M. Nodet | LEMON | Life-Fluid coupling | https:// |
| C. Prieur, P. Tencaliec | MISTIS | Hydrological risk assessment | |
| L. Gilquin, C. Helbert, C. Prieur, A. Vidard | STEEP | Calibration and sensitivity analysis for LUTI models | |
| C. Prieur, L. Viry | GRAAL | Grid deployment for the study of the West African monsoon | |
| A. Vidard, M. Nodet, F.-X. Le Dimet | CLIME, FLUMINANCE | Image assimilation | |
| A. Vidard, M. Nodet, E. Kazantsev | SCIPORT | Ocean adjoint modelling | , |
C. Prieur chairs the GdR MASCOT NUM, in which M. Nodet, E. Blayo, C. Helbert, L. Viry, S. Nanty and L. Gilquin are also involved.
C. Prieur leads the LEFE/MANU project MULTIRISK (2014-2016) on multivariate risk analysis, which gathers experts mainly from Lyon 1 University, CNAM, LSCE and Grenoble University.
M. Nodet is involved in GDR Calcul and GDR Ondes.
A. Vidard leads a group of projects gathering multiple partners in France and the UK on the topic "Variational Data Assimilation for the NEMO/OPA9 Ocean Model"; see .
E. Blayo chairs the CNRS-INSU research program LEFE-MANU on mathematical and numerical methods for the ocean and atmosphere. http://
L. Debreu is the coordinator of the national group COMODO (Numerical Models in Oceanography).
E. Kazantsev, E. Blayo and F. Lemarié participate in the project "PACO - Vers une meilleure paramétrisation de la côte et des conditions limites dans les modèles d'océan", supported by LEFE-GMMC and LEFE-MANU .
A 4-year ANR contract: ANR TOMMI (Transport Optimal et Modèles Multiphysiques de l'Image), see paragraphs ,.
A 4-year ANR contract (2011-2015): ANR COMODO (Communauté de Modélisation Océanographique) on the theme "numerical methods in ocean modelling" (coordinator: L. Debreu).
A 4-year ANR contract (2014-2018) : ANR HEAT (Highly Efficient Atmospheric modelling) on the development of numerical schemes for atmospheric models (coordinator: T. Dubos, LMD)
A 3.5 year ANR contract: ANR CITiES (numerical models project selected in 2012). http://
Type: COOPERATION
Instrument: Specific Targeted Research Project
Program: Collaborative project FP7-SPACE-2013-1
Project acronym: ERA-CLIM2
Project title: European Reanalysis of the Global Climate System
Duration: 01/2014 - 12/2016
Coordinator: Dick Dee (ECMWF, Europe)
Other partners: Met Office (UK), EUMETSAT (Europe), Univ. Bern (CH), Univ. Vienna (AT), FFCUL (PT), RIHMI-WDC (RU), Mercator-Océan (FR), Météo-France (FR), DWD (DE), CERFACS (FR), CMCC (IT), FMI (FI), Univ. Pacifico (CL), Univ. Reading (UK), Univ. Versailles St Quentin en Yvelines (FR)
Inria contact: Arthur Vidard
Partner: GDR-E CONEDP
Subject: Control of Partial Differential Equations.
Partner: University of Reading, Department of Meteorology, Department of Mathematics
Subject: Data assimilation for geophysical systems.
Partner: European Centre for Medium Range Weather Forecast. Reading (UK)
A world-leading numerical weather prediction centre, which includes an ocean analysis section providing ocean initial conditions for the coupled ocean-atmosphere forecasts. They play a significant role in the NEMOVAR project, in which we are also a partner.
Partner: Met Office, the British national weather and oceanographic service, Exeter (UK).
We have a strong collaboration with their ocean initialization team through our NEMO, NEMO-ASSIM and NEMOVAR activities. They are also our partner in the NEMOVAR consortium.
Jose R. León (UCV, Caracas) was funded for a 1.5-month invited stay.
C. Prieur collaborates with Jose R. León (UCV, Central University of Caracas).
C. Prieur is leader of a project ECOS Nord with Venezuela (2012-2015).
Jose-Raphael Leon-Ramos, Caracas University, 3 months
Victor Shutyaev, Russian Academy of Sciences, 2 weeks
M. Nodet visited the University of Reading Data Assimilation group and gave a seminar.
F.-X. Le Dimet visited the department of meteorology and oceanography of Florida State University for three weeks in June 2014 (invitation of Prof. Xiaolei Zou), and gave one seminar on the assimilation of images .
F.-X. Le Dimet visited the department of mathematics of the Harbin Institute of Technology for one month in October 2014 (invitation of Prof. Jianwei Ma). A series of four one-hour seminars was delivered on variational methods in data assimilation.
C. Prieur and L. Viry co-organized a research school on uncertainty quantification at the Ecole de Physique des Houches (ASPEN 2013, 2014).
M. Nodet and E. Blayo (with S. Ricci, G. Desroziers and M. Bocquet) co-organised the Data Assimilation National Conference in Toulouse.
C. Prieur was a member of the jury for the PhD prize Jacques Neveu.
M. Nodet and E. Blayo (with S. Ricci, G. Desroziers and M. Bocquet) co-organised the scientific program of the Data Assimilation National Conference in Toulouse.
M. Nodet was a reviewer for the international conference "Emerging Trends in Applied Mathematics: Dedicated to the Memory of Sir Asutosh Mookerjee" (Calcutta, February 12-14, 2014), with proceedings published by Springer.
E. Blayo: reviewer for Ocean Modelling, Monthly Weather Review, Nonlinear Processes in Geophysics.
L. Debreu: reviewer for Ocean Modelling, Applied Mathematical Modelling.
A. Vidard: reviewer for Ocean Modelling, Tellus A, Inverse problems, Monthly Weather Review
M. Nodet: reviewer for the journals ESAIM: COCV and Nonlinear Processes in Geophysics.
E. Kazantsev: reviewer for the Journal of Comp. Phys., World Journal of Modelling and Simulation.
Licence: C. Prieur, Statistics for biologists, 113.5h, level L2, Grenoble, France
Licence: C. Prieur, Statistics, 52h, level L1, Ensimag (Grenoble), France
Licence: M. Nodet, Mathematics for engineers, 80h, L1, UJF Grenoble
Licence: M. Nodet, Statistics for biology, 80h, L2, UJF Valence
Master: M. Nodet, Inverse methods and data assimilation, 30h, M2, UJF Grenoble
Master: C. Prieur, Stochastic approaches for uncertainty quantification, 27h, level M2, Grenoble, France
Master: Laurent Debreu, Numerical ocean modelling, 8h eq. TP, M2, Université de Brest, France
Doctorat: Laurent Debreu, national doctoral training course "Numerical modelling of the ocean and atmosphere", 24-28 November 2014, Grenoble, France. With T. Dubos (LMD/Ecole Polytechnique, Paris), G. Roullet (Brest University) and F. Hourdin (LMD/CNRS, Paris)
Doctorat: Eric Blayo, Arthur Vidard, Introduction to Data Assimilation, 20h, University of Grenoble, France
PhD : Pierre-Antoine Bouttier, Assimilation variationnelle de données altimétriques dans le modèle océanique NEMO : Exploration de l'effet des non-linéarités dans une configuration simplifiée à haute résolution , University of Grenoble, 2014, .
PhD: Jérémie Demange, Schémas numériques d'advection et de propagation d'ondes de gravité dans les modèles de circulation océaniques, University of Grenoble, 21 October 2014, L. Debreu, P. Marchesiello.
PhD: V. Chabot, Étude de représentations parcimonieuses des statistiques d'erreur d'observation pour différentes métriques. Application à l'assimilation d'images, UJF Grenoble, A. Vidard and M. Nodet.
PhD in progress: Nelson Feyeux, Application du transport optimal pour l'assimilation de données images, November 2013, Arthur Vidard, Maëlle Nodet
PhD in progress: Thomas Capelle, Calibration of LUTI models, October 2013, Peter Sturm, Arthur Vidard
PhD in progress : Mehdi-Pierre Daou, Développement d’une méthodologie de couplage multi-modèles avec changements de dimension. Validation sur un cas-test réaliste en dynamique littorale, May 2013, E. Blayo and A. Rousseau
PhD in progress: Rémy Pellerej, Assimilation de données pour les modèles couplés, October 2014, Arthur Vidard, Florian Lemarié
PhD in progress : L. Gilquin, Uncertainty quantification for LUTI models, Oct. 2013, C. Prieur and E. Arnaud (STEEP)
PhD in progress : P. Tencaliec, Oct. 2013, multivariate risk for Durance streamflow data, C. Prieur and A.-C. Favre (LTHE, hydrology lab in Grenoble)
PhD in progress : S. Nanty, Uncertainty quantification for functional and dependent inputs, Oct. 2012, C. Prieur and C. Helbert (Centrale Lyon).
PhD in progress: C. Pelletier, Etude mathématique et numérique de la formulation des modèles de climat global, December 2014, E. Blayo, F. Lemarié, P. Braconnot
E.Blayo
21 January 2014 - PhD thesis of Mélanie Rochoux, Ecole Centrale de Paris (reporter)
23 January 2014 - PhD thesis of Natacha Djath, University of Grenoble (president)
2 June 2014 - HDR thesis of Jean-Michel Brankart, University of Grenoble (president)
2 October 2014 - PhD thesis of Pierre Jolivet, University of Grenoble (president)
13 October 2014 - PhD thesis of Laurent Berenguer, University of Lyon 1 (examiner)
28 November 2014 - PhD thesis of Gildas Mainsant, University of Grenoble (president)
8 December 2014 - PhD thesis of Abdoulaye Samake, University of Grenoble (president)
C. Prieur
11 December 2014 - PhD thesis of Jeremy Chardon, Université Grenoble Alpes (president)
26 November 2014 - PhD thesis of Stéphane Veys, Université Grenoble 1 (examiner)
13 November 2014 - PhD thesis of Prashant Rai, Ecole Centrale de Nantes (president)
28 November 2014 - PhD thesis of Henri Sohier, ONERA Toulouse (external reporter)
13 October 2014 - PhD thesis of Philomène Favier, Irstea Grenoble (examiner)
16 April 2014 - PhD thesis of Julie Oger, University François Rabelais, Tours (reporter)
A. Vidard
4 February 2014 - PhD thesis of Pierre-Antoine Bouttier, University of Grenoble (examiner)
16 December 2014 - PhD thesis of Yin Yang, University of Rennes (reporter)
Since 2010, Ch. Kazantsev has been the Director of the IREM of Grenoble http:// , which took part in:
the Week of Mathematics, 17 and 21 March 2014,
the festival "Remue-Méninges", Echirolles, 22-25 April 2014,
the internship MATHC2+ in June and in October 2014,
the "Journées Nationales de l'APMEP": presentation of two workshops with M. Gandit "Sky: between historical and simulated data",
the "Fête de la science" with workshops "28 nuances de sciences" and "Récréation mathématique pour tous".
M. Nodet gave talks and presented posters explaining how the "problem-based learning" approach (a special case of "active learning") was introduced in mathematics courses for Grenoble university undergraduates; see e.g. and .
M. Nodet (in collaboration with A. Rousseau and S. Minjeaud) wrote a chapter of an outreach book "Brèves de maths" .
M. Nodet gave outreach talks about "mathematics for environmental modelling" on various occasions: for the Grenoble maths Olympiades awards, twice for the "Science Fair 2014" (once for undergraduate students, once for secondary school pupils), and for a one-day visit of high-school students at Valence university.
M. Nodet supervises a maths club, "Math en Jeans", for secondary school pupils at two schools around Grenoble, which consists of proposing open research subjects to the pupils and supervising them over the year, in collaboration with their teachers.
E. Blayo gave several outreach talks, in particular for the inauguration of the Fédération Rhône-Alpes-Auvergne de Mathématiques (Lyon, February 28), for the ceremony of the Christian Le Provost prize of the French Academy of Science (Saint Brieuc, April 18), and for MathC2+ internships.