STEEP started in January 2010, initially as an Inria “Action Exploratoire” (2010-2011).
It is now an “Équipe Projet Inria” of Inria Grenoble - Rhône-Alpes and is also
affiliated with the Jean Kuntzmann laboratory (LJK).
STEEP is an interdisciplinary research team devoted to systemic modelling and simulation of the interactions between environmental, economic and social factors in the context of a transition to sustainability at local (sub-national) scales. Our goal is to support decision makers in the implementation of this transition by developing simulation and optimization programs. In other words, our objective is to set up mathematical and computational tools that provide parts of the answer to two challenges: how to implement sustainable development at local scales, and which local governance is suited to environmental public policies.
The work of STEEP follows several research directions, covering different application domains; these are described in “Scientific Foundations” and “Application Domains” respectively.
Environmental issues now pose a threat to human civilization worldwide. They range from falling water tables and eroding soils to expanding deserts, biodiversity loss and rising temperatures. For example, half the world's population lives in countries where water tables are falling as aquifers are depleted. Roughly a third of the world's cropland is losing topsoil at an excessive rate. Glaciers are melting in all of the world's major mountain ranges. The consequences for present human societies are critical; they include decreasing food security, large population movements (such as climate refugees) and explosive geopolitical tensions.
Sustainable development is often formulated in terms of a required balance between its environmental, economic and social dimensions, but in practice public policies addressing sustainability issues are dominantly oriented towards environment management in Western countries. This approach is problematic to some extent, as environmental problems and sustainability issues result from socio-economic phenomena (for example, the economic growth model, which is strengthened by powerful and polluting technologies). Environmental problems have only recently been the object of media attention and public awareness. Most efforts bear on developing technological solutions. However, it is now clear that this will not be sufficient. We need to rethink our socio-economic and institutional models in order to leave room for a possible paradigm shift. In this perspective, we believe that crucial steps should be taken in research to help elaborate and implement socio-economic alternatives.
The risks associated with delayed reaction and adaptation times make the situation urgent. Delayed reactions significantly increase the probability of overshooting the planet's carrying capacity, followed by uncontrolled and irreversible evolution on a number of fronts. This systemic problem is amplified by two facts: the environment is degrading on all fronts at the same time, and at the global planetary scale, a first in human history.
Although environmental challenges are monitored worldwide, the search for appropriate lines of action must nevertheless take place at all institutional levels, and in particular at local scales. At such scales, the proximity and smaller number of stakeholders allow decision makers to reach a consensus much more easily than at national or international scales. The failure of the Copenhagen summit (and, for that matter, of all climate summits since the adoption of the Kyoto protocol in 1997) is a good illustration of the difficulties encountered in international negotiations. There are significant possibilities for action at local scales, and the urgency of the situation gives the “think locally to act globally” logic an essential opportunity.
As of now, local decision levels have real political and economic leverage, and are increasingly proactive on sustainability issues, either independently or in coordination through nationwide or European networks (we can refer, for example, to the European GMO-free Regions Network).
Urbanization is a global and ever-increasing trend, with more than half the human population now living in cities.
Although urbanized areas still represent a very small fraction of the total terrestrial surface, urban resource consumption amounts to three-fourths of the annual total in energy, water, building materials, agricultural products, etc., and pollution and waste management is a growing concern for urban planners worldwide. In France, for example, even if resource intensity (materials use divided by GDP)
Furthermore, urban sprawl is a ubiquitous phenomenon showing no sign of slackening yet, even in countries where rural depopulation has long been stabilized. Urban sprawl in industrialized countries is largely driven by residential suburban growth. This phenomenon has both social and environmental consequences. First it implies an increase of daily mobility. In a context of high dependency on private cars and uncertainty on energy prices, this translates into an increased vulnerability of some population categories. It also induces an increase in greenhouse gas emissions, as well as an irreversible loss of cropland and a fragmentation of ecological habitat, with negative effects on biodiversity. The increasing concerns about climate change and upheaval in the market price of fossil fuels raise many questions about urban energy consumption while reviving the debate on the desirable urban structures and their determinants. Controlling urban sprawl is therefore a key sustainability issue.
Let us mention here that cities cannot be sustainable by themselves and that from this point of view, it does not make sense to focus on the municipality scale (“communes”). We think that it is very important to work at larger scales, typically, at employment catchment areas complemented by the adjacent agricultural and natural zones they are dependent on (that would correspond to the smallest scale for which a systemic analysis could make sense). Nevertheless, let us emphasize that because of resource imports and waste exports (e.g. GHG emissions), for any limited territory, the considered area will always depend on and impact other more or less distant territories. This is one of the key issues when trying to assess local sustainability.
Finally, let us note that the numerous and interrelated pressures exerted by human activities on the environment make the identification of sustainable development pathways arduous in a context of complex and sometimes conflicting stakeholders and socio-ecological interactions. This is why we also think that it is crucial to develop interdisciplinary and integrated approaches; consequently, our proposal tries to address the entire spectrum from scientific expertise to stakeholder decision-help.
STEEP, with its strong background in various areas of applied mathematics and modeling, can be a game changer in three connected key domains: urban economy, and related transportation and land use issues; material flow analysis and ecological accounting; and ecosystem services modeling. The group potential on these fronts relies on its capabilities to strongly improve existing integrated activity / land use / transportation models at the urban level on the one hand, and on the other, to build new and comprehensive decision-help tools for sustainability policies at the local and regional levels, in particular through the analysis of strategic social–environmental trade-offs between various policy options.
The problem we consider is intrinsically interdisciplinary: it draws on social sciences, ecology and earth sciences. The modeling of the considered phenomena must take into account many factors of different natures which interact through varied functional relationships. These heterogeneous dynamics are a priori nonlinear and complex: they may involve saturation mechanisms, threshold effects, and density dependence. The difficulties are compounded by the strong interconnections of the system (presence of important feedback loops) and by multi-scale spatial interactions. Environmental and social phenomena are indeed constrained by the geometry of the area in which they occur; climate and urbanization are typical examples. These spatial processes involve proximity relationships and neighborhoods, for example between two adjacent parcels of land, or between several macroscopic levels of a social organization. The multi-scale issues are due to the simultaneous consideration, in the modeling, of actors of different types that operate at specific spatial and temporal scales. For example, to properly address biodiversity issues, the scale at which we must consider the evolution of rurality is probably very different from the one at which we model the biological phenomena.
In this context, developing flexible integrated systemic models (upgradable, modular, ...) that are efficient, realistic and easy to use (for developers, modelers and end users) is a challenge in itself. Which mathematical representations and which computational tools should be used? Many tools are currently in use: for example, cellular automata (e.g. in the LEAM model), agent-based models (e.g. UrbanSim), system dynamics (e.g. World3), large systems of nonlinear equations (e.g. equilibrium models such as TRANUS), and so on. Each of these tools has strengths and weaknesses. Is it necessary to invent other representations? What is the relevant level of modularity? How can we obtain very modular models while keeping them coherent and easy to calibrate? Is it preferable to use the same modeling tools for the whole system, or can we freely change the representation for each considered subsystem? How can different scales be managed easily and effectively (a difficulty appearing in particular during the calibration process)? How can we obtain models that automatically adapt to the granularity of the data and that are always numerically stable (this also has a direct link with the calibration process and the propagation of uncertainties)? How can we develop models that can be calibrated with reasonable effort, consistent with the (human and material) resources of the agencies and consulting firms that use them?
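To fix ideas, here is a minimal stock-and-flow sketch in the spirit of system dynamics models such as World3. All stocks, flows and coefficients are illustrative assumptions, not taken from any of the cited models; the point is only to show the kind of feedback and saturation mechanisms mentioned above.

```python
# Minimal stock-and-flow sketch in the spirit of system dynamics.
# All names and coefficients are invented for illustration.

def simulate(pop0=100.0, res0=1000.0, years=50, dt=1.0,
             birth_rate=0.03, death_rate=0.01, use_per_capita=0.05):
    """Two coupled stocks: a population and a non-renewable resource.
    Resource scarcity feeds back on the death rate (a saturation /
    threshold mechanism of the kind discussed in the text)."""
    pop, res = pop0, res0
    trajectory = []
    for _ in range(int(years / dt)):
        scarcity = 1.0 - res / res0            # 0 (abundant) .. 1 (depleted)
        births = birth_rate * pop
        deaths = (death_rate + 0.05 * scarcity) * pop
        use = min(use_per_capita * pop, res / dt)  # cannot use more than remains
        pop += dt * (births - deaths)
        res -= dt * use
        trajectory.append((pop, res))
    return trajectory

traj = simulate()
```

Even this toy example exhibits the overshoot-and-decline behavior characteristic of such models: the population first grows, resource use accelerates, and the resulting scarcity eventually reverses the population trend.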
Before describing our research axes, we provide a brief overview of the types of models
that we are or will be working with.
As for LUTI (Land Use and Transportation Integrated) modeling, we have been using the TRANUS model since the start of our group.
It is the most widely used LUTI model; it has been developed since 1982 by the company
Modelistica, and is distributed as open-source software.
TRANUS proceeds by solving a system of deterministic nonlinear equations and inequalities
containing a number of economic parameters (e.g. demand elasticity parameters, location dispersion parameters, etc.).
The solution of such a system represents an economic equilibrium between supply and demand.
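As a toy illustration of the kind of equilibrium computation involved (not the actual TRANUS equations), the following sketch balances a single market where demand has a constant price elasticity and supply is fixed, solving for the equilibrium price by damped fixed-point iteration; all names and coefficients are invented.

```python
# Toy supply-demand equilibrium, solved by damped fixed-point iteration.
# This is an illustrative stand-in, not the TRANUS equation system.

def demand(price, base_demand=120.0, elasticity=-0.8):
    """Constant-elasticity demand curve (elasticity is an assumption)."""
    return base_demand * price ** elasticity

def find_equilibrium(supply=100.0, p0=1.0, damping=0.5, tol=1e-10, max_iter=1000):
    """Raise the price when demand exceeds supply, lower it otherwise,
    with damping for stability, until the price no longer moves."""
    p = p0
    for _ in range(max_iter):
        excess = demand(p) - supply          # positive -> price should rise
        p_new = p * (1.0 + damping * excess / supply)
        if abs(p_new - p) < tol:
            return p_new
        p = p_new
    return p

p_star = find_equilibrium()
```

In a full equilibrium model this scalar balance becomes a large coupled system (one balance per sector and zone), which is why robust numerical solution and calibration are central concerns.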
A second LUTI model that will be considered in the near future, within the CITiES project,
is UrbanSim.
On the other hand, from a modelling point of view, the scientific domains related to ecosystem services and ecological accounting are much less mature than that of urban economy (as a consequence of our more limited knowledge of the relevant complex processes and/or of more limited available data). Nowadays, the community working on ecological accounting develops statistical models based on the enforcement of the mass conservation constraint to account for material fluxes through a territorial unit or a supply chain, relying on more or less simple data correlations when the relevant data are missing; the overall modelling makes heavy use of more or less sophisticated linear algebra and constrained optimization techniques. The ecosystem service community has been using statistical models too, but is also developing more sophisticated models based, for example, on system dynamics, multi-agent simulations or cellular models. In the ESNET project, STEEP will work in particular on land use / land cover change (LUCC) modelling environments (Dinamica).
In the following, our two main research axes are described from the point of view of applied mathematical development. The domains of application of this research effort are described in the application section, where some details about the context of each field are given.
The overall calibration of the parameters that drive the equations implemented in the above models is a vital step. Since the implemented equations describe, e.g., socio-economic phenomena, some of these parameters should in principle be accurately estimated from past data using econometric and statistical methods such as regressions or maximum likelihood estimation, e.g. for the parameters of the logit models describing the residential choices of households. However, this approach is often not workable in practice, for at least two main reasons. First, the above models consist of several interacting modules. Currently, these modules are typically calibrated independently; this is clearly sub-optimal, as results will differ from those obtained after a global calibration of the interacting system, which is the actual objective of a calibration procedure. Second, the lack of data is an inherent problem.
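For instance, the calibration of a logit parameter from choice data can in principle be done by maximum likelihood. The following self-contained sketch illustrates the idea on synthetic data for a binary choice (e.g. between two residential zones, driven by a single utility difference x); the coefficient and the data-generating process are invented for the example.

```python
# Hedged sketch: maximum-likelihood calibration of a binary logit model
# on synthetic data. In a real LUTI calibration, x would be an observed
# attribute (accessibility, housing cost, ...), not a simulated draw.
import math
import random

random.seed(0)
beta_true = 1.5                      # assumed "true" parameter for the demo
data = []
for _ in range(2000):
    x = random.uniform(-2, 2)
    p = 1.0 / (1.0 + math.exp(-beta_true * x))
    data.append((x, 1 if random.random() < p else 0))

def log_likelihood_grad(beta):
    """Gradient of the logit log-likelihood: sum of (y - p(x)) * x."""
    g = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-beta * x))
        g += (y - p) * x
    return g

beta = 0.0
for _ in range(200):                 # plain gradient ascent on the likelihood
    beta += 0.001 * log_likelihood_grad(beta)
```

The point of the sketch is also its limitation: each module's parameters can be fitted this way in isolation, whereas the text argues that a global calibration of the interacting system is what is really needed.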
As a consequence, models are usually calibrated by hand. The calibration can typically take up to 6 months for a medium size LUTI model (about 100 geographic zones, about 10 sectors including economic sectors, population and employment categories). This clearly emphasizes the need to further investigate and at least semi-automate the calibration process. Yet, in all domains STEEP considers, very few studies have addressed this central issue, not to mention calibration under uncertainty which has largely been ignored (with the exception of a few uncertainty propagation analyses reported in the literature).
Besides uncertainty analysis, another main aspect of calibration is numerical optimization. The general state of the art on optimization procedures is extremely large and mature, covering many different types of optimization problems in terms of size (number of parameters and data) and type of cost function(s) and constraints. Depending on the characteristics of the considered models in terms of dimension, data availability and quality, deterministic or stochastic methods will be implemented. For the former, due to the presence of non-differentiabilities, it is likely, depending on their severity, that derivative-free optimization methods will have to be preferred. For the latter, particle-based filtering techniques and/or metamodel-based optimization techniques (metamodels are also called response surfaces or surrogate models) are good candidates.
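As a minimal illustration of the derivative-free strategies evoked above, here is a simple coordinate pattern search applied to an invented non-differentiable (L1-type) misfit; this is a generic sketch, not one of STEEP's calibration problems.

```python
# Hedged sketch of a derivative-free optimizer: coordinate pattern search.
# Probe +/- step along each axis, keep improving moves, shrink when stuck.

def pattern_search(cost, x0, step=1.0, shrink=0.5, tol=1e-6):
    """Minimize `cost` without derivatives, suitable when the cost
    function is non-differentiable (as discussed in the text)."""
    x = list(x0)
    fx = cost(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                ft = cost(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= shrink               # no move helped: refine the mesh
    return x, fx

# Non-differentiable stand-in cost (an L1 misfit with minimum at (3, -1)).
cost = lambda v: abs(v[0] - 3.0) + abs(v[1] + 1.0)
x_opt, f_opt = pattern_search(cost, [0.0, 0.0])
```

Pattern search is among the simplest members of this family; more elaborate derivative-free and surrogate-based methods follow the same logic of replacing gradients with structured function evaluations.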
These methods will be validated by performing a series of tests to verify that the optimization algorithms are efficient, in the sense that 1) they converge after an acceptable computing time, 2) they are robust, and 3) they do what they are actually meant to do. For the latter point, the procedure for this algorithmic validation phase will be to measure the quality of the results obtained after the calibration, i.e. to analyze whether the calibrated model fits the data sufficiently well according to predetermined criteria.
To summarize, the overall goal of this research axis is to address two major issues related to calibration and validation of models: (a) defining a calibration methodology and developing relevant and efficient algorithms to facilitate the parameter estimation of considered models; (b) defining a validation methodology and developing the related algorithms (this is complemented by sensitivity analysis, see the following section). In both cases, analyzing the uncertainty that may arise either from the data or the underlying equations, and quantifying how these uncertainties propagate in the model, are of major importance. We will work on all those issues for the models of all the applied domains covered by STEEP.
A sensitivity analysis (SA) consists, in a nutshell, in studying how the uncertainty in the output of a model can be apportioned to different sources of uncertainty in the model inputs. It is complementary to an uncertainty analysis, which focuses on quantifying uncertainty in model output. SA can be useful for several purposes, such as guiding model development and identifying the most influential model parameters and critical data items. Identifying influential model parameters may help in devising metamodels (or surrogate models) that approximate an original model and may be simulated, calibrated, or analyzed more efficiently. As for detecting critical data items, this may indicate for which type of data more effort must be spent in the data collection process in order to eventually improve the model's reliability. Finally, SA can be used as one means for validating models, together with validation based on historical data (or, put simply, using training and test data) and validation of model parameters and outputs by experts in the respective application area. All these uses of SA will be considered in our research.
The first two applications of SA are linked to model calibration, discussed in the previous section. Indeed, prior to the development of the calibration tools, one important step is to select the significant or sensitive parameters and to evaluate the robustness of the calibration results with respect to data noise (stability studies). This may be performed through a global sensitivity analysis, e.g. by computation of Sobol indices. Many problems will have to be circumvented, e.g. difficulties arising from dependencies between input variables, from variables that obey a spatial organization, or from switch inputs. We will build on current work in the statistics community on SA for these difficult cases.
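As an illustration, first-order Sobol indices can be estimated by the classical "pick-freeze" Monte Carlo scheme. The toy additive model below is an invented example chosen so that the analytic answer is known (for y = a*x1 + b*x2 with independent uniform inputs, the first-order index of x1 is a²/(a²+b²)):

```python
# Hedged sketch: pick-freeze Monte Carlo estimation of a first-order
# Sobol index, on a toy additive model with known analytic indices.
import random

random.seed(1)
a, b = 3.0, 1.0
model = lambda x1, x2: a * x1 + b * x2   # invented test model

def sobol_first_order(n=200000):
    """Estimate S1 = Cov(Y, Y') / Var(Y), where Y' reuses ("freezes")
    x1 but redraws x2, so the covariance isolates the effect of x1."""
    ys, ys_frozen = [], []
    for _ in range(n):
        x1, x2, x2b = random.random(), random.random(), random.random()
        ys.append(model(x1, x2))
        ys_frozen.append(model(x1, x2b))  # same x1, fresh x2
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / n
    cov = sum(y * yf for y, yf in zip(ys, ys_frozen)) / n - mean ** 2
    return cov / var

s1 = sobol_first_order()   # analytic value here: a**2 / (a**2 + b**2) = 0.9
```

Real applications replace the toy model with a (usually expensive) simulation, which is precisely why the metamodels discussed above become necessary to keep such Monte Carlo schemes affordable.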
As for the third application of SA, model validation, a preliminary task bears on the propagation of uncertainties. Identifying the sources of uncertainties and their nature is crucial to propagate them via Monte Carlo techniques. To make a Monte Carlo approach computationally feasible, it is necessary to develop specific metamodels. Both the identification of the uncertainties and their propagation require a detailed knowledge of the data collection process; these are mandatory steps before a validation procedure based on SA can be implemented. First, we will focus on validating LUTI models, starting with the CITiES ANR project: here, an SA consists in defining various land use policies and transportation scenarios and in using these scenarios to test the integrated land use and transportation model. Current approaches for validation by SA consider several scenarios and propose various indicators to measure the simulated changes. We will work towards using sensitivity indices based on functional analysis of variance, which will allow us to compare the influence of various inputs on the indicators. For example it will allow the comparison of the influences of transportation and land use policies on several indicators.
In the context described in the previous sections, we can distinguish two connected and complementary strategies for analyzing environmental pressures: a sectorial approach and a spatial one. The first one is more directly connected to ecological accounting, the second one has more direct relations to urban economy and land cover modelling. Let us start by describing the former.
One of the major issues in the assessment of the long-term sustainability of urban areas is related to the concept of “imported sustainability”. Cities bring in from the outside most of their material and energy resources, and reject to the outside the waste produced by their activity. The modern era has seen a dramatic increase in both volume and variety of these material flows and consumption as well as in distance of origin and destination of these flows, usually accompanied by a spectacular increase in the associated environmental impacts. A realistic assessment of the sustainability of urban areas requires to quantify both local and distant environmental impacts; greenhouse gas emissions are only one aspect of this question. Such an assessment brings to light the most relevant direct and indirect lines of action on these issues. In this respect, it is useful to introduce the alternative concepts of consumer versus producer responsibility (or point of view).
The producer point of view is the most useful to pinpoint relevant direct lines of actions on environmental pressures due to production. In other respects, any territory imports and exports goods and services from and to the rest of the world. The consumer point of view provides information on the indirect pressures associated with these exchanges, as production responds to a final demand. Tracking the various supply chains through the analysis of the structure of the local economy and its relations and dependencies to the external world allows us to identify critically important contributions to environmental pressures; this also enables us to define fair environmental indicators in order not to attribute environmental pressures to producers only (whose responsibility is the easier to quantify of the two). In this approach, the producer responsibility follows directly from the measurement of its energy and material uses, while the consumer responsibility is established indirectly through an allocation of the impacts of production to the final consumers, but this second mode of allocation is to some extent virtual and partly subjective. Four methods stand out:
Material Flow Analysis (MFA)
Input-Output Analysis (IOA)
Life-Cycle Analysis (LCA)
Ecological Footprint (EF)
Each of these is based on a well-defined structuring element: mass conservation for MFA, measure of industrial inter-dependencies for IOA, identification of all the steps from cradle to grave for LCA, measure of biocapacity demand for EF. The different methods have preferred areas of application. For example, EF is more relevant for analyzing primary production such as agricultural staples, wood, etc. IOA is more focused on whole industrial sectors, while LCA is geared towards end-user products, taken as functional units; finally, primary materials (such as metals), waste and emissions are more easily characterized through MFA. Methodological choices are driven by the type of question one needs to address, data availability and collection method and the spatial scales under consideration. Indeed, data can be used in two different ways: bottom-up or top-down. The bottom-up data is more precise, but in general precludes comprehensiveness; on the contrary, the top-down data is by nature more comprehensive, but is not suited for a detailed, fine-scale analysis of the results.
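To make the producer/consumer duality concrete, here is a hedged two-sector illustration of the IOA logic: total output solves the Leontief balance x = Ax + d, production-based emissions follow directly from output, and the same emissions can be reallocated to final consumers through emission intensities m = e(I-A)⁻¹. All coefficients are invented; real applications use national input-output tables with many sectors.

```python
# Hedged 2-sector input-output (IOA) illustration with invented numbers.

A = [[0.1, 0.2],     # technical coefficients: input from sector i
     [0.3, 0.1]]     # required per unit output of sector j
d = [50.0, 30.0]     # final demand per sector
e = [0.5, 2.0]       # direct emissions per unit output (producer view)

def leontief_output(A, d):
    """Solve the 2x2 Leontief system (I - A) x = d by Cramer's rule."""
    a, b = 1 - A[0][0], -A[0][1]
    c, f = -A[1][0], 1 - A[1][1]
    det = a * f - b * c
    return [(f * d[0] - b * d[1]) / det, (a * d[1] - c * d[0]) / det]

x = leontief_output(A, d)                      # total output per sector
producer_emissions = [e[i] * x[i] for i in range(2)]

# Consumer view: intensities m solve (I - A^T) m = e, i.e. emissions
# embodied per unit of final demand; totals must match the producer view.
A_T = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]
m = leontief_output(A_T, e)
consumer_emissions = [m[i] * d[i] for i in range(2)]
```

The two allocations always sum to the same total; what changes is which sector (and hence which line of action) the pressure is attributed to, which is exactly the point made above about fair environmental indicators.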
STEEP is pursuing its research program on this theme with three major goals: 1) Creating a comprehensive database enabling pressure analyses; 2) Developing methodologies and models resolving scaling issues, and developing algorithms allowing us to rigorously and automatically obtain adequate assessments; 3) Providing a synthetic analysis of environmental pressures associated to the major material flows, at various geographic levels (employment catchment area, département and région, for France), with the explicit aim of incorporating this type of information in the public decision process on environmental issues, via specifically designed decision-help procedures.
The preceding section was focused on territorial metabolism, in particular on the analysis of supply chains. Here territories are examined with a more prominent emphasis on their spatial dimension, with attention to: the spatial distribution of local pressures previously identified (from a land use point of view), and the modeling of future land use and activity location (from an economic point of view). These two questions correspond to very different modeling strategies: the first one is more statistical in nature, extrapolating future land use from past evolution combined with global territory scenarios; the other one has a more fundamental flavor and focuses on an understanding of the processes driving urbanization. For this, we focus more precisely on the question of household and businesses choices of localization, as well as on spatial fluxes within the territory (transportation of goods and persons). The critical point here is to understand and manage urban sprawl and its environmental effects (GHG emission, loss of arable land, ecosystem fragmentation, and so on).
LUCC models are mostly used in environmental sciences, e.g. to evaluate the impact of climate change on agriculture, but they can also be used to analyze urban sprawl. There is a variety of models, static or dynamic, grid- or agent-based, local or global, etc., with varying degrees of sophistication concerning the spatio-temporal analysis or the decision structures incorporated in the model.
The models of interest here are statistical in nature but spatially explicit. Following decades of development, they are robust, versatile and mature. In principle, agent-based models have a larger potential for representing decision processes, but in practice this advantage comes at the cost of a loss of universality. Among the best-known and most mature models, one can mention the CLUE family of models, Dinamica, or LCM (Land Change Modeler). These models are well described in the literature and will only be briefly presented here.
These models analyze change in land use in a statistical way; they are structured around three different modules:
The first module determines the probability of change of pixels of the territory (pixels are typically tens to hundreds of meters in size).
The second module defines the global changes between the various land uses of interest per time step (usually, a few years), based on global scenarios of evolution of the territory under study. These first two modules are independent of one another.
The last module distributes changes of land use in an explicit manner, pixel per pixel, at each time step, on the basis of the information provided by the first two modules.
Probabilities of change are calibrated on past evolution, from the differences between two past maps of land use in the more favorable cases, or from a single map otherwise (under the assumption that the logic of occupation changes is the same as the logic of land use at this single date). Such changes are then characterized in a statistical way with the help of modeling variables identified by the modeler as having potential explanatory or structuring power (typically, a few to a dozen variables are used for one type of land use change). For example, in the case of urban sprawl, typical explanatory factors are the distance to existing urbanized zones or to roads and other means of transportation, elements of real estate costs, etc. Global scenarios are quantified in terms of global changes in land use over the whole studied area (e.g., how many hectares are transformed from agricultural to urban use in a given number of years, and how this evolves over time); this is done either from academic expert knowledge, or from information provided by local planning agencies. Whenever feasible, models are validated by comparing the model predictions with the actual evolution at a later date. Such models therefore need from one to three land use maps at different dates for calibration and validation purposes (the larger the number of maps, the more robust and accurate the model). A large array of statistical tools is available in the literature to perform the calibration and validation of the model.
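The three-module logic described above can be sketched as follows; the probability scores and the scenario demand are invented placeholders (in a real model the scores would come from a calibrated statistical relation, e.g. on distance to roads, and the demand from a territory scenario).

```python
# Hedged sketch of the three-module structure of a statistical LUCC model.
import random

random.seed(2)
n_pixels = 100

# Module 1: probability of change per pixel. In practice this comes from
# a calibrated statistical model; here we use random placeholder scores.
prob_change = [random.random() for _ in range(n_pixels)]
land_use = ["agri"] * n_pixels

# Module 2: the global scenario fixes how much change happens at this
# time step (invented figure: 10 pixels become urban), independently
# of where it happens.
demand = 10

# Module 3: allocate the demanded change explicitly, pixel by pixel,
# here to the pixels with the highest probability of change.
ranked = sorted(range(n_pixels), key=lambda i: prob_change[i], reverse=True)
for i in ranked[:demand]:
    land_use[i] = "urban"
```

Iterating this allocation step over successive time steps, with the demand updated by the scenario, produces the projected land use maps used in the ESNET-type analyses described below.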
The horizon of projections of such models is limited in time, typically 20-30 years, due to the inherent uncertainty in such models, although they are occasionally used on longer time-scales. Climate change constraints are included, when needed, through scenarios, as it is not in the scope of such models to incorporate ecological processes that may translate climate change constraints into land cover change dynamics. Note that on such short time-scales, climate change is not dominated by the mean climate evolution but by decade variations which average out on longer time-scales and are not modeled in the global climate models used e.g. for IPCC projections for the end of the century; as a consequence, the various IPCC climate scenarios cannot be distinguished on such a short time horizon.
With regard to LUCC, the STEEP team has been involved for four years in the ESNET project, whose funding came to a close in July 2016, although the scientific production of the project is still underway. This project bears on the characterization of local ecosystem services networks; the project has been coordinated by LECA (Laboratoire d'Ecologie Alpine), in collaboration with a number of other research laboratories (most notably, IRSTEA Grenoble, besides our team), and in close interaction with a panel of local stakeholders; the scale of interest is typically a landscape (in the ecological/geographical sense, i.e., a zone a few kilometers to a few tens of kilometers wide). The project aims at developing a generic modelling framework of ecosystem services, and studying their behavior under various scenarios of coupled urban/environment evolution, at the 2030/2040 horizon, under constraints of climate change. The contribution of the STEEP team is centered on the Land Use/Land Cover Change (LUCC) model that will be one of the major building blocks of the whole project modelling effort, with the help of an ESNET-funded post-doctoral researcher. In the process, areas of conceptual and methodological improvements of statistical LUCC models have been identified; implementing these improvements will be useful for the LUCC community at large, independently of the ESNET project needs.
Urban transport systems are intricately linked to urban structure and activities, i.e., to land use. Urbanization generally implies an increased travel demand. Cities have traditionally met this additional demand by extending transportation supply, through new highways and transit lines. In turn, an improvement of the accessibility of ever-farther land leads to an expansion of urban development, resulting in a significant feedback loop between transportation infrastructure and land use, one of the main causes of urban sprawl.
Transportation models allow us to address questions generally limited to the impacts of new infrastructures, tolls and other legislation on traffic regulation
State of the art and operability of LUTI models. The first model that proved able to analyze the interactions between transport and urbanization was developed by Lowry. Since then, theories and models have become increasingly complex. They can be classified according to different criteria. A first classification retraces the historical path of these theories and models: they can be associated with one or several of the approaches underlying all present theories, namely economic base theory and gravity models, input/output models and the theory of urban rent, and micro-simulations. A second possibility consists in classifying the models according to their aims and means.
Significant scientific progress has been made over the last thirty years. Nevertheless, modelling tools remain largely restricted to the academic world. Today, only seven models have at least had one recent application outside academia or are commercialized or potentially marketable, in spite of the important needs expressed by the urban planning agencies: Cube Land, DELTA, MARS, OPUS/UrbanSim, PECAS, TRANUS and Pirandello.
To guide their choice of a modelling framework, users can rely on various criteria, such as the strength of the theoretical framework, the quality and diversity of the available documentation, the accessibility of the models (is the model freely available? is the code open source? is the software regularly updated and compatible with recent operating systems?), the functionality and user-friendliness of the interfaces (existence of a graphical user interface, possibility of interfacing with Geographic Information Systems), the existence of technical assistance, the volume and availability of the data required to implement the model, etc. For example, among the seven models mentioned above, only two are open source and mature enough to meet professional standards: TRANUS and UrbanSim.
STEEP involvement in LUTI modelling.
As yet, very few local planning authorities make use of these strategic models, mostly because they are difficult to calibrate and validate; these limitations hinder their dissemination in local agencies. Systematic improvement of these two critical steps would clearly increase the level of confidence in their results. One of the major goals of STEEP is therefore to meet the need for better calibration and validation strategies and algorithms. This research agenda lies at the core of our project CITiES (ANR Modèles Numériques). As for LUTI modelling, we have been using the TRANUS model since the creation of our team, and we have been working on UrbanSim since the beginning of the CITiES project. In this framework we work in close collaboration with AURG.
This year represents an important landmark in the life of the team, which witnessed its first PhD defense since its formation.
The thesis of Jean-Yves Courtonne bore on ecological accounting, with the design and implementation of a new downscaling method that allows material flows to be tracked through supply chains at various nested geographical scales; the method also provides an assessment of the associated environmental pressures and an analysis of the errors of the process. The thesis was recognized by both referees as a major step forward in this field in France. Four articles have come out of this work; they are published in, or under review by, the leading journals of the field.
A second PhD defense took place this year, that of Laurent Gilquin, who did most of his PhD studies in STEEP before following his supervisor (E. Arnaud) to the AIRSEA project-team.
Functional Description
Databases, database-handling tools and data visualization tools (on the website). The databases include socio-economic and environmental datasets. The visualization tools include interactive pie charts, maps and Sankey diagrams.
Participants: Jean-Yves Courtonne and Pierre-Yves Longaretti
Contact: Jean-Yves Courtonne
Functional Description
This software contains two interfaces dedicated to facilitating the use of the TRANUS integrated land use and transport model and software. The first interface enables the execution of the TRANUS binary programs without the need to use the console or the TRANUS GUI. The second interface provides an aid for calibrating a TRANUS model, by interactively exploring ranges of different model parameters and visualising model outputs across these ranges.
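The exploration of parameter ranges described above can be illustrated by a minimal sketch. Everything here is hypothetical: `run_model` stands in for a call to the TRANUS binaries (which in reality write output files that are then parsed), and the parameter names and grid values are purely illustrative.

```python
import itertools

def run_model(alpha, beta):
    """Hypothetical stand-in for a TRANUS model run: returns a
    synthetic goodness-of-fit value (lower is better) for a given
    combination of two illustrative model parameters."""
    return (alpha - 0.3) ** 2 + (beta - 1.5) ** 2

# Explore a grid over the two parameters and keep the best-fitting
# combination, much as the calibration interface lets the user do
# interactively over ranges of their choosing.
alphas = [0.1, 0.2, 0.3, 0.4]
betas = [1.0, 1.5, 2.0]
best = min(itertools.product(alphas, betas), key=lambda p: run_model(*p))
print(best)  # (0.3, 1.5)
```

A real calibration would of course replace the synthetic fit function with a comparison between model outputs and observed data.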
Participants: Peter Sturm, Julien Armand and Thomas Capelle
Contact: Peter Sturm
Land Use Mix calculation from OpenStreetMap data
Functional Description
The software uses Mapzen Metro Extracts to retrieve the OpenStreetMap data of a given region in PostgreSQL format. A continuous representation of residential and activity land uses is then created. Finally, a GIS output containing the degree of land use mixture is computed from the land use maps. The implemented approach is documented in .
Participants: Martís Bosch Padros, Luciano Gervasoni, Serge Fenet and Peter Sturm
Partners: EPFL - Ecole Polytechnique Fédérale de Lausanne - LIRIS
Contact: Peter Sturm
Functional Description
This software graphically visualises data output by the TRANUS LUTI model (and potentially any other data of the same structure). In particular, this concerns any data items defined per zone of a modelled territory (productions, indicators, etc.). The software is designed as a plugin for the geographical information system platform QGIS and can be run interactively as well as from the command line or by a call from within another software package. The interactive mode (within QGIS) allows the user to define graphical outputs to be generated from TRANUS output files (type of graphs to be generated – 2D or 3D – color coding to be used, choice of data to be displayed, etc.). Visualisation of data is done in the form of 2D graphs or 3D models defined using JavaScript.
Participants: Patricio Inzaghi, Peter Sturm, Huu Phuoc Nguyen, Fausto Lo Feudo and Thomas Capelle
Contact: Peter Sturm
REDuction Of EMission
Functional Description
REDEM soft is a tool designed for the benchmarking of national GHG emission reduction trajectories. The current version of the software is implemented in Visual Basic under Microsoft Excel, in order to facilitate handling and dissemination among climate/energy economists.
Participants: Emmanuel Prados, Patrick Criqui, Constantin Ilasca, Olivier Boucher and Hélène Benveniste
Partners: EDDEN - IPSL
Contact: Emmanuel Prados
REDEM Web is a reimplementation of REDEM soft as a web application. The main library, which contains the code of the REDEM model, is written in Python; the web front end uses JavaScript.
Keywords: Benchmarking - Climate change - Global warming - Greenhouse gas emissions
Participants: Emmanuel Prados, Patrick Criqui, Constantin Ilasca, Olivier Boucher, Hélène Benveniste and Nicolas Assouad
Partners: EDDEN - UPMC
Contact: Emmanuel Prados
Scientific Description
The software is structured in three different modules:
the database module stores all the input-output data coming from Eurostat, OCDE, Insee or other sources;
the computation module performs the input-output calculations;
the visualization module displays the results in a synthetic manner.
The database module is based on the SQLite format and makes use of SQL to manipulate the various tables involved in the process. The goal of this module is to provide a normalized data interface for the computation module, from the various types of input-output data, which are often stored as Excel sheets on websites.
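The normalization step performed by the database module can be sketched as follows. This is an illustrative mock-up only, assuming hypothetical table and column names (`io_flows`, `region`, etc.) that are not those of the actual software: raw input-output rows, as they might be read from an Excel sheet, are loaded into a single normalized SQLite table so the computation module can query them uniformly with SQL.

```python
import sqlite3

# Hypothetical raw input-output rows (region, source sector,
# target sector, monetary or physical flow value).
raw_rows = [
    ("FR", "Agriculture", "Food industry", 120.5),
    ("FR", "Energy", "Food industry", 30.2),
]

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE io_flows (
           region TEXT, source_sector TEXT,
           target_sector TEXT, value REAL)"""
)
conn.executemany("INSERT INTO io_flows VALUES (?, ?, ?, ?)", raw_rows)

# The computation module can then aggregate flows with plain SQL,
# e.g. total inputs of a given sector.
total = conn.execute(
    "SELECT SUM(value) FROM io_flows WHERE target_sector = ?",
    ("Food industry",),
).fetchone()[0]
print(total)  # 150.7 (= 120.5 + 30.2)
```

Keeping a single normalized schema is what allows heterogeneous Eurostat, OCDE or Insee sources to feed the same computation code.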
Functional Description
The purpose of this software and website is to automate most of the work of standard input-output analysis and to visualize the results in a user-friendly way, in order to efficiently address related key environmental questions.
Participants: Julien Alapetite and Jean-Yves Courtonne
Contact: Jean-Yves Courtonne
Besides the publication of the article on environmental pressures in supply chains in the leading journal in the field (Journal of Industrial Ecology), the most important result obtained on this front this year bears on the quantification of the errors associated with the national road freight transport database (SITRAM). This database is populated year by year through a dedicated sampling campaign, but the errors associated with the various types of material goods transported had never been quantified. This was achieved by our team through the use of appropriate error estimators. This result is eagerly awaited by a number of scientific teams and public territorial agencies. Furthermore, the methodology that we have developed can easily be transposed to other countries. This result constitutes an important piece in the overall effort that the team has devoted to the quantification of uncertainties in material flow analyses.
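To give a flavour of the kind of error estimator involved, here is a minimal sketch of a textbook sampling estimator: a population total and its standard error derived from a simple random sample of shipments. This is a generic illustration under simplifying assumptions (simple random sampling, known population size); the actual SITRAM methodology and its estimators are more involved, and the numbers are invented.

```python
import math

def total_with_standard_error(sample, population_size):
    """Estimate a population total and its standard error from a
    simple random sample (generic textbook estimator, with
    finite-population correction)."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    total = population_size * mean
    se = population_size * math.sqrt(var / n) * math.sqrt(1 - n / population_size)
    return total, se

# Invented tonnages (in tonnes) for a sampled set of shipments,
# expanded to a hypothetical population of 1000 shipments.
tonnages = [12.0, 8.5, 15.2, 9.8, 11.1]
total, se = total_with_standard_error(tonnages, population_size=1000)
print(total)  # 11320.0 tonnes, plus or minus roughly 2 * se
```

Reporting the standard error alongside each estimated flow is what makes the resulting accounts usable for uncertainty-aware decision support.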
In the case of ecosystem invasions, human-mediated dispersal often acts as a vector for many exotic species, both at the introduction and secondary spread stages. The introduction stage is mainly a consequence of human-mediated long-distance dispersal and is known to happen at continental or global scales. Secondary spread, however, occurs at smaller spatial and temporal scales (e.g. the landscape) and can result from natural or human-mediated dispersal. Although local transportation of goods and materials (e.g. for landscaping, construction, or road building) can potentially promote the spread of invasive species, few studies have investigated short-distance human-mediated dispersal. This lack of consideration seems to be the consequence of multiple factors:
human-mediated dispersal is generally considered as a long distance dispersal process, more important for invasive species introduction than for secondary spread;
it is difficult to qualify and quantify this mode of dispersal because of the multiplicity of potentially involved human activities;
for organisms that can disperse naturally, it is complicated to distinguish between natural and human-mediated dispersal, as they may occur at similar scales.
Even though a range of methodologies is available for describing population spread by natural dispersal, only a few models have been developed to describe and predict the consequences of human-mediated dispersal at small scales, and none of them take into account the topology of the transport infrastructure (roads, waterways). In this work, in order to fill this gap and provide new insights into how invasion dynamics impact ecosystem services, we combined ecological data (invasive species occurrences) and geographical data (transportation network topology) in a computer model to estimate the frequencies and distances of material transport through the landscape. In this study (cf. ), we investigated the spreading pattern of Lasius neglectus, an invasive ant species originating from Turkey that has spread into Europe over the last decades. This species performs no mating or dispersal flights, so its spread is solely ensured by the transport of soil materials in which individuals are present. We built a numerical model that estimates multiple human-mediated dispersal parameters from ground-truth sampling while minimizing a priori assumptions. After building a model of the landscape-level spreading process that explicitly takes into account the topology of the road network, we localized the most probable sites of introduction and estimated the number of jump events, as well as the parameters of jump distances linked to the road network. Our model is also able to compute presence probability maps, and can be used to calibrate sampling campaigns, explore invasion scenarios, and more generally perform invasion spread predictions. It could be applied to any species that can be disseminated at local to regional scales by human activities through transportation networks.
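The core idea of jump dispersal along a transport network and of presence probability maps can be sketched with a deliberately simplified toy simulation. Everything here is an assumption for illustration: a five-node toy road network, a fixed per-step jump probability, and repeated stochastic runs aggregated into per-site presence frequencies; the actual model estimates its parameters from field data and operates on a real road network.

```python
import random

# Toy road network: site -> list of neighbouring sites.
roads = {
    "A": ["B"], "B": ["A", "C", "D"], "C": ["B"], "D": ["B", "E"], "E": ["D"],
}

def simulate(introduction, n_steps, p_jump, rng):
    """One stochastic run: starting from an introduction site, each
    occupied site may seed each road neighbour with probability
    p_jump at every step (human-mediated jump dispersal)."""
    occupied = {introduction}
    for _ in range(n_steps):
        for site in list(occupied):  # snapshot: new sites jump next step
            for neigh in roads[site]:
                if rng.random() < p_jump:
                    occupied.add(neigh)
    return occupied

# Repeating the simulation yields a presence probability per site,
# analogous to the presence probability maps produced by the model.
rng = random.Random(42)
counts = {s: 0 for s in roads}
for _ in range(1000):
    for site in simulate("A", n_steps=3, p_jump=0.5, rng=rng):
        counts[site] += 1
probs = {s: c / 1000 for s, c in counts.items()}
```

Sites close (in network distance) to the introduction point end up with high presence probability, distant ones with low probability, which is what makes such maps useful for targeting sampling campaigns.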
The number of people living in cities has increased considerably since 1950, from 746 million to 3.9 billion in 2014, and more than 66% of the world's population is projected to live in urban areas by 2050. As this continuing population growth and urbanization are projected to add 2.5 billion people to the world's urban population in 30 years, new challenges arise on how to design cities that host such numbers of inhabitants in a sustainable way. Sustainability has several aspects, ranging from economic to social and environmental matters, among others. In this work, we focus on formalizing a measure of mixed-use development, or land use mix, in a city, i.e. of how the structure of the city can help to provide car-free sustainable living. This type of land use mix has been shown to yield benefits in terms of sustainability and to contribute positively to societal outcomes, health, and public transportation, among others. We developed a framework to compute a mixed-use development index. A main characteristic of our approach is to use only crowd-sourced data (from OpenStreetMap) to extract geo-localized land uses. Owing to the universality of this data source, we are able to process any geographical area in the world, as long as sufficient data are available in OSM. A kernel density estimation is performed for each land use, yielding the spatial distribution of the different land uses. Based on this representation, a measure of land use mix is then calculated using the entropy index. The resulting GIS output provides enriched information for urban planners, supporting and aiding the decision-making process.
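The two steps above (kernel density estimation per land use, then an entropy index over the normalized densities) can be sketched as follows. This is a minimal illustration under stated assumptions: an isotropic Gaussian kernel with a fixed bandwidth instead of the estimator actually used, and invented point coordinates standing in for geo-localized OSM land uses (the activity cluster is the mirror image of the residential one, so the mix is exactly balanced at the point of symmetry).

```python
import numpy as np

def kde_density(points, query, bandwidth=0.1):
    """Isotropic Gaussian kernel density estimate at a query location."""
    d2 = ((points - query) ** 2).sum(axis=1)
    return np.exp(-0.5 * d2 / bandwidth**2).sum() / (
        len(points) * 2 * np.pi * bandwidth**2
    )

def land_use_mix(uses, query):
    """Entropy index of the land use mix at a point: 0 for a single
    dominant use, 1 for a perfectly balanced mix of all uses."""
    dens = np.array([kde_density(pts, query) for pts in uses])
    p = dens / dens.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum() / np.log(len(uses)))

# Invented geo-localized points for two land uses.
residential = np.array([[0.2, 0.2], [0.3, 0.25], [0.25, 0.35], [0.15, 0.3]])
activities = 1.0 - residential  # mirror-image cluster, for illustration
uses = [residential, activities]

# Perfectly balanced at the point of symmetry:
print(round(land_use_mix(uses, np.array([0.5, 0.5])), 3))  # 1.0
```

Evaluating this index on a grid over the study area yields the GIS raster of land use mixture mentioned above.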
The framework, still in the validation phase, was applied to the cities of London and Grenoble . Future work includes integrating the LUM output into measures of the urban sprawl phenomenon and deriving numerical interpretations of desirable mixed-use values. We will also study a potential integration with transportation models, where the correlation of land use mix with activity and residential uses can help to improve demand estimation. In addition, further investigation can be done by analyzing the different types of activities in detail. Finally, the estimation of LUM can be refined by taking into account, besides their location, the accessibility between different land uses, which is partly conditioned by the transportation infrastructure.
This year, we have consolidated our previous work on the calibration of LUTI models, in particular of the Tranus model . The developed approaches are currently being applied to instantiate a complete Tranus model of the Grenoble catchment area, in collaboration with AURG (Urban Planning Agency of the Grenoble area) and Brian Morton (U North Carolina).
We have also collaborated with the AIRSEA project-team to apply novel sensitivity analysis tools to the study of the influence of the different parameter sets of a Tranus model . The rationale is to then apply optimization methods to the most influential parameters only. As a result, we were able to calibrate a real-life Tranus model such that the results were of higher quality than with the baseline ad hoc approach, while significantly reducing calibration time.
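The sensitivity analysis referred to here relies on Sobol' indices (cf. the thesis of L. Gilquin below). As a hedged illustration of the underlying idea, the following sketch estimates first-order Sobol' indices with a generic pick-and-freeze Monte Carlo scheme; the toy linear function stands in for a LUTI model output and is chosen because its indices are known analytically (S1 = 0.2, S2 = 0.8). The actual work uses more sophisticated (quasi-)Monte Carlo sampling schemes.

```python
import numpy as np

def first_order_sobol(model, n_inputs, n_samples=100_000, seed=0):
    """First-order Sobol' indices via a pick-and-freeze Monte Carlo
    scheme (Saltelli-type estimator); inputs are assumed independent
    and uniform on [0, 1]."""
    rng = np.random.default_rng(seed)
    A = rng.random((n_samples, n_inputs))
    B = rng.random((n_samples, n_inputs))
    yA, yB = model(A), model(B)
    var = np.concatenate([yA, yB]).var()
    indices = []
    for i in range(n_inputs):
        ABi = A.copy()
        ABi[:, i] = B[:, i]  # freeze all inputs except the i-th
        indices.append(float(np.mean(yB * (model(ABi) - yA)) / var))
    return indices

# Toy stand-in for a model output: Y = X1 + 2 X2, whose analytic
# first-order indices are S1 = 0.2 and S2 = 0.8.
s1, s2 = first_order_sobol(lambda x: x[:, 0] + 2 * x[:, 1], n_inputs=2)
```

Ranking parameters by such indices is what allows the subsequent optimization to focus on the few most influential ones.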
The PhD thesis of Jean-Yves Courtonne has been co-sponsored by ARTELIA and Inria, via a bilateral contract.
The design of our LUTI model of Grenoble, based on the TRANUS platform, takes place in the framework of a close collaboration with AURG, the Urban Planning Agency of the Grenoble area.
(Calibrage et valIdation de modèles Transport - usagE des Sols)
Program: “Modèles Numériques” 2012, ANR
Duration: 2013 – 2016
Coordinator: Emmanuel Prados (STEEP)
Other partners: LET, IDDRI, IRTES-SET (“Systemes and Transports” lab of Univ. of Tech. of Belfort-Montbéliard), IFSTTAR-DEST Paris (formerly INRETS), LVMT (“Laboratoire Ville Mobilité Transport”, Marne la Vallée), VINCI (Pirandello Ingenierie, Paris), IAU Île-De-France (Urban Agency of Paris), AURG (Urban Agency of Grenoble), MOISE (Inria project-team)
Abstract: Calibration and validation of transport and land use models.
(Futures of ecosystem services networks for the Grenoble region)
Program: “Modeling and Scenarios of Biodiversity" flagship program, Fondation pour la Recherche sur la Biodiversité (FRB). This project is funded by ONEMA (Office National de l'Eau et des Milieux Aquatiques).
Duration: 2013 – 2016
Coordinator: Sandra Lavorel (LECA)
Other partners: EDDEN (UPMF/CNRS), IRSTEA Grenoble (formerly CEMAGREF), PACTE (UJF/CNRS), ERIC (Lyon 2/CNRS)
Abstract: This project explores alternative futures of ecosystem services under combined scenarios of land-use and climate change for the Grenoble urban area in the French Alps. In this project, STEEP works in particular on the modeling of the land use and land cover changes, and to a smaller extent on the interaction of these changes with some specific services.
Songyou Peng (summer internship, MSc student in the ViBOT Erasmus Mundus program).
Association for the Advancement of Artificial Intelligence (AAAI) 2016 Computational Sustainability and AI (S. Fenet)
International Conference on Principles and Practices of Constraint Programming (CP) 2016 Computational Sustainability track (S. Fenet)
CompSust@CP-2016 (the 22nd International Conference on Principles and Practice of Constraint Programming), Toulouse, France, September 2016 (E. Prados)
Journées Scientifiques Inria – Inria Science Days (P. Sturm)
German Conference on Pattern Recognition (P. Sturm)
EVOSTAR 2016 (S. Fenet)
E. Prados was invited by the Wimmics team to give a seminar at Inria Sophia-Antipolis (Conference title: “Comment réduire l’impact d’un éventuel effondrement ? Comment construire le monde d’après ?”, Sophia-Antipolis, France, 10th of November, 2016).
E. Prados gave a seminar at the Conseil Scientifique de l’AURG (Conference title : “Modèle TRANUS d’usage des sols et transport pour l’agglomération grenobloise”, Grenoble, France, 6th of July, 2016).
P. Sturm gave an invited seminar at the Czech Technical University (Prague, Czech Republic), on former work in computer vision.
P. Sturm: Expert for the European Eureka/Eurostars program and for the regional ARC6 program.
P. Sturm is Deputy Scientific Director of Inria, in charge of domain “Perception, Cognition and Interaction”.
E. Prados and P. Sturm organized the Inria – FING partnership. The FING (Fondation Internet Nouvelle Génération) is a think tank working on socio-economic changes inspired by technology and its uses. In 2016, they organized two workshops in the framework of the project “Transition
PhD: Jean-Yves Courtonne, Evaluation environnementale de territoires à travers l’analyse de filières - La comptabilité biophysique pour l’aide à la décision délibérative, Grenoble University, 28/06/2016, D. Dupré and P.Y. Longaretti
PhD: Laurent Gilquin, Echantillonnages Monte Carlo et quasi-Monte Carlo pour l'estimation des indices de Sobol'. Application à un modèle transport-urbanisme, Grenoble University, 17/10/2016, Elise Arnaud and C. Prieur
PhD in progress: Michela Bevione, Sustainability and territorial energy transition: coupling supply chains with LCA, 11/2016, N. Buclet and P.Y. Longaretti
PhD in progress: Thomas Capelle, Development of optimisation methods for land-use and transportation models, 10/2013, P. Sturm and A. Vidard
PhD in progress: Luciano Gervasoni, Modeling the dynamics of urban sprawl, 10/2015, S. Fenet and P. Sturm
PhD in progress: Julien Salotti, Spatio-temporal analysis of traffic data for smart mobility, 11/2014, S. Fenet, C. Solnon and N.-E. El Faouzi
PhD in progress: Lucas Foulon, Detection of anomalies in real-time ground–on board flows of the SNCF, 12/2016, S. Fenet, C. Rigotti and D. Jouvin
P. Sturm was reviewer of the PhD thesis of Liming Yang, Ecole Centrale de Nantes.
P. Sturm was reviewer of the PhD thesis of Jan Heller, Czech Technical University, Prague.
The STEEP team organized a series of conferences entitled “Understanding and Acting” (six conferences in 2016). The conferences were filmed, edited and posted on YouTube. In total, more than 500 people attended the conferences, and the videos received more than 2000 views on YouTube (as of December 2016).
Emmanuel Prados gave a “conférence-débat” at “Marie Reynoard” High school on “Sustainable development, territorial governance and democracy” (Villard-Bonnot, France, 16th of December, 2016).
Emmanuel Prados gave a conference at the “Café In” of Inria Sophia-Antipolis entitled “Comprendre les phénomènes d’effondrement de sociétés. Quel avenir pour la nôtre ?” (Sophia-Antipolis, France, 10th of November, 2016).
Emmanuel Prados organized a workshop, in collaboration with the EP SCOT Grenoble, on land-use and transport modelling and its potential application to the follow-up of the Grenoble SCOT project (Grenoble, France, 14th of October, 2016). Representatives of all the main political and administrative authorities of the Grenoble area attended this workshop (Région Rhône-Alpes, Département de l'Isère, Grenoble-Alpes Métropole, AURG, communities of municipalities of the Grenoble employment catchment area).