The project-team is particularly active in the following areas:

classical theory of dynamical systems

optimal deterministic, stochastic and robust control

failure detection in dynamical systems (both passive and active)

network control and monitoring for transportation systems

hybrid systems, in particular the development of Scicos

maxplus linear systems: applications to transportation systems

numerical matrix algebra and its implementation in ScicosLab

numerical algorithms.

The objectives of the project-team are the design, analysis and development of new methods and algorithms for detection, identification, simulation and control of dynamical systems and their software implementations.

These methods and algorithms are implemented in Scilab and ScicosLab, which are scientific software packages originally developed within the project-team.

The project-team is actively involved in the development of control, signal processing, optimization and simulation tools, in particular Scicos, a modeler and simulator for dynamical systems based on our research on hybrid systems. Encouraged by the interest in Scicos expressed by both academia and industry, developing a robust, user-friendly Scicos has become an important objective of the project-team, and considerable development effort is devoted to it.

As theory and applications mutually enrich each other, many of the objectives of the project-team can be seen through the applications:

modeling and simulation of physical systems (mechanical, electrical, fluids, thermodynamics,...) based on the theory of implicit systems

modeling, simulation and code generation of control systems based on the theory of hybrid systems

modeling, analysis and control of transportation systems using the maxplus algebra

using robust control theory and finite element models for identification purposes, in the framework of failure detection and fault localization for space systems, civil structures and other dynamical systems.

Systems, control and signal processing constitute the main foundations of the research work of the project-team. We have been particularly interested in numerical and algorithmic aspects. This research, which was the driving force behind the creation of Scilab, has nourished the software over the years; as a result, Scilab and now ScicosLab contain most of the modern tools in control and signal processing. ScicosLab is a vehicle by which theoretical results of the project-team in areas such as classical, modern and robust control, signal processing and optimization are made available to industry and academia.

Ties between this fundamental research and ScicosLab are very strong. Indeed, the design of the software itself, its elementary functions and its data structures are heavily influenced by the results of this research. For example, even elementary operations such as the basic manipulation of polynomial fractions have been implemented using a generalization of the state-space theory developed as part of our research on implicit systems. These ties are of course natural, since Scilab has been primarily developed for applications in automatic control.

Scilab has created new contacts for our research team with engineers in industry and with other research groups. Being used in real applications, it has provided a guide for choosing new research directions. For example, we developed the robust control tools in collaboration with industrial users, and similarly the LMI toolbox with the help of other research groups. It should also be noted that most of the basic systems and control functions are based on algorithms developed in the European research project Slicot, in which METALAU has taken part.

Implicit systems are a natural framework for modeling physical phenomena. We work on theoretical and practical problems associated with such systems in particular in applications such as failure detection and dynamical system modeling and simulation.

Constructing complex models of dynamical systems by interconnecting elementary components very often leads to implicit systems. An implicit dynamical system is one where the equations representing the behavior of the system are of the algebraic-differential type. If x represents the "state" of the system, an implicit system is often described as follows:

F(dx/dt, x, t, z) = 0,

where dx/dt is the time derivative of x, t is the time, and the vector z contains the external variables (inputs and outputs) of the system. Indeed, it is an important property of implicit systems that outside variables interacting with the system need not be characterized a priori as inputs or outputs, as is the case with explicit dynamical systems. For example, if we model a capacitor in an electrical circuit as a dynamical system, it is not possible to label a priori the external variables, in this case the currents and voltages associated with the capacitor, as inputs and outputs. The physical laws governing the capacitor simply impose dynamical constraints on these variables. Depending on the configuration of the circuit, it is sometimes possible to specify some external variables as inputs and the rest as outputs (and thus make the system explicit), but in doing so system structure and modularity are often lost. That is why, even when an implicit system can be converted into an explicit one, it is usually more advantageous to keep the implicit model.
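To make the capacitor example concrete, here is a minimal Python sketch (illustrative only; the variable names and the capacitance value are hypothetical, and Scilab/ScicosLab, not Python, are the project's tools). It writes the capacitor law as an implicit residual F(dv/dt, v, i) = 0 and shows that it becomes explicitly integrable once the circuit designates the current i as an input:

```python
# Implicit model of a capacitor: F(v_dot, v, i) = C*v_dot - i = 0.
# The physical law does not say which of i, v is the "input":
# that depends on the surrounding circuit.
C = 1e-3  # capacitance (F), illustrative value

def residual(v_dot, v, i):
    """Residual of the implicit equation F(v_dot, v, i) = C*v_dot - i."""
    return C * v_dot - i

# If the circuit happens to impose the current i(t), the implicit model
# can be made explicit (v_dot = i/C) and integrated, e.g. by forward Euler.
def simulate_current_driven(i_of_t, v0, dt, n_steps):
    v = v0
    for k in range(n_steps):
        i = i_of_t(k * dt)
        v_dot = i / C                      # explicit form, valid here only
        assert abs(residual(v_dot, v, i)) < 1e-12
        v += dt * v_dot
    return v

# A constant 1 mA charging current for 1 s raises v by i*t/C = 1 V.
v_end = simulate_current_driven(lambda t: 1e-3, 0.0, 1e-3, 1000)
```

Had the circuit imposed the voltage instead, the same residual would be solved for i; this is exactly the modularity argument made above.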

It turns out that many of the methods developed for the analysis and synthesis of control systems modeled as explicit systems can be extended to implicit systems. In fact, in many cases, these methods are more naturally derived in this more general setting and allow for a deeper understanding of the existing theory. In the past few years, we have studied a number of systems and control problems in the implicit framework.

For example in the linear discrete time case, we have revisited classical problems such as observer design, Kalman filtering, residual generation to extend them to the implicit case or have used techniques from implicit system theory to derive more direct and efficient design methods. Another area where implicit system theory has been used is failure detection. In particular in the multi-model approach where implicit systems arise naturally from combining multiple explicit models.

We have also done work on nonlinear implicit systems. For example nonlinear implicit system theory has been used to develop a predictive control system and a novel nonlinear observer design methodology. Research on nonlinear implicit systems continues in particular because of the development of the “implicit” version of Scicos.

Failure detection has been the subject of many studies in the past. Most of these works are concerned with the problem of *passive failure detection*. In the passive approach, for material or security reasons, the detector has no way of acting upon the system; it can only monitor the inputs and the outputs of the system and then decide whether, and if possible what kind of, a failure has occurred. This is done by comparing the measured input-output behavior of the system with the "normal" behavior of the system. The passive approach is often used to continuously monitor the system, although it can also be used to make periodic checks.

In some situations, however, failures can be masked by the operation of the system. This often happens in controlled systems. The reason is that the purpose of controllers, in general, is to keep the system at some equilibrium point even if the behavior of the system changes. This robustness property, desired in control systems, tends to mask abnormal behaviors of the system, which makes the task of failure detection difficult. An example of this effect is the well-known fact that it is harder for a driver to detect an under-inflated or flat front tire in a car equipped with power steering. This tradeoff between detection performance and controller robustness has been noted in the literature and has led to the study of the integrated design of controller and detector.

But the problem of failures being masked by system operation is not limited to controlled systems. Some failures may simply remain hidden under certain operating conditions and show up only under special circumstances. For example, a failure in the brake system of a truck is very difficult to detect as long as the truck is cruising on level ground. It is for this reason that on many roads, just before steep downhill stretches, there are signs asking truck drivers to test their brakes. A driver who ignores these signs would find out about a brake failure only when he needs to brake going downhill, i.e., too late.

An alternative to passive detection which can avoid the problem of failures being masked by system operation is *active detection*. The active approach to failure detection consists in acting upon the system, on a periodic basis or at critical times, using a test signal in order to detect abnormal behaviors which would otherwise remain undetected during normal operation. The detector in an active approach can act either by taking over the usual inputs of the system or through a special input channel. An example of using the existing input channels is testing the brakes by stepping on the brake pedal.

The active detection problem has been less studied than the passive detection problem. The idea of injecting a signal into the system for identification purposes has been widely used. But the use of extra input signals in the context of failure detection has only been recently introduced.

The specificity of our approach to the problem of auxiliary signal design is that we adopt a deterministic point of view in which uncertainty is modeled using techniques from robust control theory. In doing so, we can deal efficiently with the robustness issue, which is in general not properly addressed in stochastic approaches to this problem. This has allowed us in particular to introduce the notion of *guaranteed failure detection*.

In the active failure detection method considered, an auxiliary signal v is injected into the system to facilitate detection; it can be part or all of the system inputs. The signal u denotes the remaining inputs, measured online just as the outputs y are measured online. In some applications the time trajectory of u may be known in advance, but in general the information regarding u is obtained through sensor data in the same way as for the output y.

Suppose we have only one possible type of failure. Then we have two sets of input-output behaviors to consider and hence two models. The first set is the set of normal input-output pairs {u, y} from Model 0, and the second is the set of input-output pairs when a failure has occurred, that is, pairs from Model 1. These sets represent possible/likely input-output trajectories for each model. Note that Model 0 and Model 1 can differ greatly in size and complexity, but they have u and y in common.

The problem of auxiliary signal design for guaranteed failure detection is to find a "reasonable" v such that the sets of input-output pairs consistent with Model 0 and with Model 1 are disjoint. That is, any observed pair {u, y} can come from only one of the two models. Here a reasonable v means a v that does not perturb the normal operation of the system too much during the test period. This means, in general, a v of small energy applied over a short test period. However, depending on the application, "reasonable" can imply more complicated criteria.

Depending on how uncertainties are accounted for in the models, the mathematics needed to solve the problem can be very different. For example, guaranteed failure detection was first introduced in the case where unknown bounded parameters were used to model uncertainties. This led to solution techniques based on linear programming algorithms. But in most of our work, we consider the types of uncertainties used in robust control theory. This has allowed us to develop a methodology based on established tools such as Riccati equations, which lets us handle very large multivariable systems. The methodology we develop for the construction of the optimal auxiliary signal and its associated test can be implemented easily in computational environments such as Scilab. Moreover, the online detection test that we obtain is similar to some existing tests based on Kalman filters and is easy to implement in real time. The main results of our research can be found in a book published in 2004. We have developed many extensions since, which have been published in various journals and presented at conferences.
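As a toy numerical illustration of guaranteed detection (a deliberately simplified static example, not the Riccati-based method described above), consider two models with bounded additive noise; the smallest auxiliary signal that makes their output sets disjoint can be found by a sweep:

```python
# Toy guaranteed-detection problem (hypothetical gains and bounds):
# Model 0: y = u + w0,  Model 1: y = 2u + w1, with |w0|, |w1| <= 1.
# Applying the test signal u = v, each model produces an interval of
# possible outputs; detection is guaranteed iff the intervals are disjoint.

def output_interval(gain, v, noise_bound=1.0):
    return (gain * v - noise_bound, gain * v + noise_bound)

def separated(v):
    lo0, hi0 = output_interval(1.0, v)
    lo1, hi1 = output_interval(2.0, v)
    return hi0 < lo1 or hi1 < lo0      # output sets cannot overlap

# Separation requires |2v - v| > 2, i.e. |v| > 2; sweep to find the
# smallest amplitude on a 0.01 grid.
v_star = next(v / 100 for v in range(0, 1000) if separated(v / 100))
```

Any observed output then points to exactly one model, which is the discrete analogue of the disjointness condition above.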

We consider mechanical systems with the corresponding stochastic state-space models of automatic control.

The mechanical system is assumed to be a time-invariant linear dynamical system:

M Z''(t) + C Z'(t) + K Z(t) = v(t),    Y(t) = L Z(t),

where the variables are: Z, the displacements of the degrees of freedom; M, C, K, the mass, damping and stiffness matrices; t, the continuous time; v, the vector of external (non-measured) forces, modeled as a non-stationary white noise; and L, the observation matrix giving the observation Y (corresponding to the locations of the sensors on the structure).

The modal characteristics are the vibration modes, or eigenfrequencies, and the modal shapes, or eigenvectors. They satisfy:

(M μ² + C μ + K) ψ_μ = 0,

where μ ranges over the eigenvalues and ψ_μ over the corresponding eigenvectors. By stacking Z and its derivative Z' and sampling at rate 1/δ, i.e., X_k = (Z(kδ), Z'(kδ)), we get the following equivalent state-space model:

X_{k+1} = F X_k + V_k,    Y_k = H X_k,

with F the sampled transition matrix and H the corresponding observation matrix.
The mechanical systems under consideration are vibrating structures and the numerical simulation is done by the finite element model.

The objectives are the analysis and the implementation of statistical model-based algorithms, for modal identification, monitoring and (modal and physical) diagnosis of such structures.
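The sampling construction above can be sketched for a single degree of freedom. The following Python fragment (illustrative values; NumPy stands in for the Scilab tools mentioned in the text) forms the discrete transition matrix F = exp(δA) and recovers the natural frequency from its eigenvalues:

```python
import numpy as np

# One degree of freedom: m z'' + c z' + k z = v(t), Y = L Z.
m, c, k = 2.0, 0.4, 50.0
L = np.array([[1.0, 0.0]])              # the sensor reads the displacement

# Stacking X = (z, z') gives the continuous-time state matrix.
A = np.array([[0.0, 1.0],
              [-k / m, -c / m]])

# Sampling at rate 1/dt: F = exp(dt*A), computed here by diagonalization.
dt = 0.01
lam, V = np.linalg.eig(A)
F = (V @ np.diag(np.exp(dt * lam)) @ np.linalg.inv(V)).real

# The eigenvalues of F carry the modal characteristics: for a continuous
# mode mu, eig(F) = exp(dt*mu), so mu is recovered as log(eig(F))/dt.
mu = np.log(np.linalg.eig(F)[0][0]) / dt
omega_identified = abs(mu)              # natural frequency in rad/s
```

For an underdamped mode the modulus of μ equals sqrt(k/m), here 5 rad/s, which is what subspace identification would estimate from the output covariances.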

For modal analysis and monitoring, the approach is based on subspace methods using the covariances of the observations: all the algorithms are designed for the in-operation situation, i.e., without any measurement of or control on the input (the situation where both input and output are measured is a simple special case).

The identification procedure is carried out on the healthy structure.

The second part of the work is to determine, given new data recorded after an operating period, whether changes have occurred in the modal characteristics.

If changes have occurred, we want to find the most likely localization of the faults on the structure. For this purpose we have to match the identified modal characteristics of the healthy structure with those of the finite element model. Using the various Jacobian matrices and clustering algorithms, we try to obtain clusters of elements with the corresponding value of the "fault criterion".

This work is done in collaboration with the INRIA-IRISA project-team SISTHEM (a spin-off of the project-team SIGMA2) (see the web-site of this project-team for a complete presentation and bibliography) and with the project-team MACS for the physical diagnosis (on civil structures).

Failure detection problems are formulated in such a way that mathematical techniques from robust control can be used to pose and solve the problem of robust detection. Concepts developed for control can be used in particular to formulate the notion of robustness and to provide numerically tractable solutions.

This system approach can also be used to formulate both the detection and the control in a single framework. The Simultaneous Fault Detection and Control problem is formulated as a mixed optimization problem and its solution is given in terms of Riccati equations. It is shown that controllers/detectors resulting from this approach have reasonable complexity and can be used for practical applications.

Originally motivated by problems encountered in modeling and simulation of failure detection systems, the objective of this research is the development of a solid formalism for efficient modeling of hybrid dynamical systems.

A hybrid dynamical system is obtained by the interconnection of continuous time, discrete time and event driven models. Such systems are common in most control system design problems where a continuous time model of the plant is hooked up to a discrete time digital controller.

The formalism we develop here tries to extend methodologies from synchronous languages to the hybrid context. Motivated by work on the extension of the Signal language to continuous time, we develop a formalism in which, through a generalization of the notion of event to what we call an *activation signal*, continuous-time activations and event-triggered activations can coexist and interact harmoniously. This means in particular that standard operations on events, such as subsampling and conditioning, are extended to operate on activation signals in general, paving the way for a uniform theory.

The theoretical formalism developed here is the backbone of the modeling and simulation software Scicos. Scicos is where the theory is implemented, tested and validated. But Scicos has become more than just an experimental tool for testing the theory: it has been successfully used in a number of industrial projects and has proven to be a valuable tool for modeling and simulation of dynamical systems.

Encouraged by the interest in Scicos expressed by both academia and industry, and beyond the theoretical studies necessary to ensure that the bases of the tool are solid, the project-team has started to invest considerable effort in improving its usability for real-world applications. Developing a robust, user-friendly Scicos has become one of the objectives of the project-team.

It turns out that the Scicos formalism and the Modelica language share many common features and are in many respects complementary. The Scicos formalism provides a solid ground for modeling discrete-time and event dynamics in a hybrid framework, based on the theory of synchronous languages, while Modelica is a powerful language for the construction of continuous-time models. We work closely with the Modelica Association and other actors in the Modelica community to make sure Modelica remains consistent with Scicos, in particular by proposing new discrete-time extensions to Modelica inspired by the Scicos formalism.

In the modeling of human activities, in contrast to natural phenomena, quite frequently only the operations max (respectively min) and + are needed (this is the case in particular for some queuing or storage systems, synchronized processes encountered in manufacturing, traffic systems, the optimization of deterministic dynamic processes, etc.).

The set of real numbers endowed with the operation max (respectively min), denoted ⊕, and the operation +, denoted ⊗, is a nice mathematical structure that we may call an idempotent semifield. The operation ⊕ is idempotent and has the neutral element ε = -∞, but it is not invertible. The operation ⊗ has its usual properties and is distributive with respect to ⊕. Based on this set of scalars we can build the counterpart of a module and write the general (n, n) system of linear maxplus equations:

A ⊗ x ⊕ b = C ⊗ x ⊕ d,

using matrix notation, where we have made the natural substitution of ⊕ for + and of ⊗ for × in the definition of the matrix product.

A complete theory of such linear systems has not yet been achieved. In recent work we have tried to gain a better understanding of the images and kernels of maxplus matrices.
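These scalar and matrix operations are easy to sketch in Python (an illustrative re-implementation; in the project they are provided by the MAXPLUS toolbox of ScicosLab):

```python
import numpy as np

NEG_INF = -np.inf   # the neutral element of max-plus "addition"

def mp_add(a, b):
    """Max-plus addition: a ⊕ b = max(a, b). Idempotent: a ⊕ a = a."""
    return np.maximum(a, b)

def mp_mul(A, B):
    """Max-plus matrix product: (A ⊗ B)[i, j] = max_k (A[i, k] + B[k, j])."""
    return np.max(A[:, :, None] + B[None, :, :], axis=1)

# A hypothetical (2, 2) system applied to a vector x.
A = np.array([[0.0, 3.0],
              [NEG_INF, 1.0]])
x = np.array([[2.0], [5.0]])
y = mp_mul(A, x)   # y[0] = max(0+2, 3+5) = 8, y[1] = max(-inf, 1+5) = 6
```

Note how ε = -∞ plays the role of a zero entry: it never wins the max, exactly as a zero matrix entry contributes nothing to an ordinary matrix product.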

System theory is concerned with the input (u)-output (y) relation of a dynamical system S, denoted y = S(u), and with the improvement of this input-output relation (based on some engineering criterion) by altering the system through a feedback control law u = F(y, v). The new input (v)-output (y) relation is then defined implicitly by y = S(F(y, v)). Not surprisingly, system theory is well developed in the particular case of linear shift-invariant systems. Similarly, a min-plus version of this theory can also be developed.

In the case of SISO (single-input-single-output) systems, u and y are functions of time. In the particular case of a shift-invariant linear system, S becomes an inf-convolution:

y(t) = inf_s [ h(t - s) + u(s) ],

where h is a function of time called the impulse response of the system. Such a system is therefore completely defined by its impulse response. Elementary systems are combined by arranging them in parallel, series and feedback. These three engineering operations correspond to adding systems pointwise (⊕), making inf-convolutions (⊗) and solving special linear equations (y = h ⊗ (f_1 ⊗ y ⊕ f_2 ⊗ v)) over the set of impulse responses. Mathematically, we have to study the algebra of functions endowed with the two operations ⊕ and ⊗ and to solve special classes of linear equations in this set, namely the case A = E in the notation of the first part.

An important class of shift-invariant min-plus linear systems is the process of counting events versus time in timed event graphs (a subclass of Petri nets frequently used to represent manufacturing systems). A dual theory based on the maxplus algebra allows the timing of events identified by their numbering.
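A discrete-time inf-convolution can be sketched directly from its definition (a naive O(n²) Python illustration; the impulse response h and input u below are hypothetical):

```python
import numpy as np

def inf_convolution(h, u):
    """y[t] = min over s of h[t - s] + u[s] (indices kept in range).
    This is the min-plus analogue of the usual convolution sum."""
    n = len(h) + len(u) - 1
    y = np.full(n, np.inf)
    for t in range(n):
        for s in range(len(u)):
            if 0 <= t - s < len(h):
                y[t] = min(y[t], h[t - s] + u[s])
    return y

# Impulse response of a pure 2-step delay with cost 2 (inf = impossible):
h = np.array([np.inf, np.inf, 2.0])     # h[k] finite only at k = 2
u = np.array([0.0, 5.0, 1.0])           # input trajectory
y = inf_convolution(h, u)               # y[t] = u[t - 2] + 2 for t >= 2
```

The +∞ entries behave like the ε element of the semiring: they never achieve the minimum, so the delay system simply shifts the input and adds its cost.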

The Fourier and Laplace transforms are important tools in automatic control and signal processing because the exponentials simultaneously diagonalize all the convolution operators: convolutions are converted into multiplications by the Fourier transform. The Fenchel transform F, defined by

F(h)(p) = sup_x [ p·x - h(x) ],

plays the same role in the min-plus algebra context. The affine functions diagonalize the inf-convolution operators, and we have

F(h_1 ⊗ h_2) = F(h_1) + F(h_2),

where ⊗ denotes the inf-convolution.

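This diagonalization property can be checked numerically on a grid: for convex costs, the discrete Fenchel transform of an inf-convolution equals the sum of the transforms (a Python sketch; the grid and the quadratic cost functions are hypothetical):

```python
import numpy as np

xs = np.arange(-5.0, 5.5, 0.5)           # common grid for the functions
ps = np.arange(-3.0, 3.5, 0.5)           # slopes at which to evaluate

def fenchel(f):
    """Discrete Fenchel transform: F(f)(p) = max_x (p*x - f(x))."""
    return np.array([np.max(p * xs - f) for p in ps])

def inf_conv(f, g):
    """(f ⊗ g)(z) = min over x + y = z of f(x) + g(y), on the grid xs."""
    out = np.full_like(xs, np.inf)
    for iz, z in enumerate(xs):
        for ix, x in enumerate(xs):
            iy = np.where(np.isclose(xs, z - x))[0]
            if iy.size:
                out[iz] = min(out[iz], f[ix] + g[iy[0]])
    return out

f = xs ** 2                  # two convex "cost" functions on the grid
g = (xs - 1) ** 2
# The transform converts inf-convolution into ordinary addition:
lhs = fenchel(inf_conv(f, g))
rhs = fenchel(f) + fenchel(g)
```

Here lhs and rhs coincide, the min-plus analogue of "the Fourier transform of a convolution is the product of the Fourier transforms".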
A general inf-convolution is too complicated an operation to be used in practice, since it involves an infinite number of operations. We have to restrict ourselves to convolutions that can be computed with finite memory. We would like a finite state x to exist, representing the memory necessary to compute the convolution recursively. In the discrete-time case, given some h, we have to find (C, A, B) such that h_n = C ⊗ A^n ⊗ B; the system is then 'realized' as

x_{n+1} = A ⊗ x_n ⊕ B ⊗ u_n,    y_n = C ⊗ x_n.

SISO systems (with increasing h) which are realizable in the min-plus algebra are characterized by the existence of some λ and c such that, for n large enough:

h_{n+c} = c λ + h_n.

If h satisfies this property, it is easy to find a 3-tuple (A, B, C).
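The realizability criterion can be observed on a small example: the impulse response h_n = C ⊗ A^n ⊗ B of a max-plus system is ultimately periodic, h_{n+c} = cλ + h_n (a Python sketch with a hypothetical 2×2 system; here the rate λ = 2 is the largest circuit mean of A, with period c = 1):

```python
import numpy as np

def mp_mat(A, B):
    """Max-plus matrix product: (A ⊗ B)[i, j] = max_k (A[i, k] + B[k, j])."""
    return np.max(A[:, :, None] + B[None, :, :], axis=1)

NI = -np.inf                              # the max-plus "zero" entry
A = np.array([[1.0, NI],
              [0.0, 2.0]])
B = np.array([[0.0], [NI]])
C = np.array([[NI, 0.0]])

# Impulse response h_n = C ⊗ A^n ⊗ B, with A^0 the max-plus identity.
h, An = [], np.array([[0.0, NI], [NI, 0.0]])
for n in range(30):
    h.append(mp_mat(mp_mat(C, An), B)[0, 0])
    An = mp_mat(An, A)
# After the transient, h grows by exactly lambda = 2 at every step.
```

The linear growth rate λ of the impulse response is the max-plus analogue of the dominant pole of a rational transfer function.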

This beautiful theory is difficult to apply because the class of linear systems is not large enough for realistic applications. Generalizations to nonlinear maxplus systems able to model general Petri nets are under development.

Dynamic programming in the discrete state and time case amounts to finding shortest paths in a graph. If we denote generically by n the number of arcs of the paths, the dynamic programming equation can be written linearly in the min-plus algebra:

X_n = A ⊗ X_{n-1},

where the entries of A are the lengths of the arcs of the graph and X_n denotes the matrix of the shortest lengths of paths with n arcs joining any pair of nodes. We can consider normalized matrices, defined by the fact that the infimum in each row is equal to 0. Such matrices can be viewed as the min-plus counterpart of the transition matrices of a Markov chain.
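The iteration X_n = A ⊗ X_{n-1} can be sketched in a few lines of Python (an illustrative graph; with a zero diagonal, repeated min-plus products converge to the all-pairs shortest path lengths):

```python
import numpy as np

def minplus(A, B):
    """Min-plus matrix product: (A ⊗ B)[i, j] = min_k (A[i, k] + B[k, j])."""
    return np.min(A[:, :, None] + B[None, :, :], axis=1)

INF = np.inf
# Arc lengths of a small 3-node graph (INF = no arc); the 0 diagonal
# lets a path "wait" at a node, so X_n covers paths with AT MOST n arcs.
A = np.array([[0.0, 1.0, 4.0],
              [INF, 0.0, 2.0],
              [INF, INF, 0.0]])

# Iterate X_n = A ⊗ X_{n-1}; n-1 products suffice for n nodes.
X = A.copy()
for _ in range(len(A) - 1):
    X = minplus(X, A)
# X now holds all-pairs shortest path lengths, e.g. X[0, 2] = 1 + 2 = 3.
```

This is the min-plus linear-algebra view of the Bellman-Ford/Floyd-Warshall computations.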

The problem of minimizing the sum of instantaneous costs c(u_n) along the trajectories of a controlled dynamical system may be called dynamic programming with independent instantaneous costs (c depends only on u and not on x). Clearly the value function v satisfies a linear min-plus equation (the Hamilton-Jacobi equation is a continuous version of this problem).

The Cramer transform C = F ∘ log ∘ L, where L denotes the Laplace transform and F the Fenchel transform, maps probability measures onto convex functions and transforms convolutions into inf-convolutions:

C(μ * ν) = C(μ) ⊗ C(ν),

where ⊗ denotes the inf-convolution. Therefore it converts the problem of adding independent random variables into a dynamic programming problem with independent costs.

These remarks suggest the existence of a formalism analogous to probability calculus adapted to optimization that we have developed.

The theoretical research in this domain is currently done in the MAXPLUS project-team. In the METALAU project-team we are more concerned with applications to traffic systems of this theory.

Traffic modeling is a domain where maxplus algebra appears naturally:

at the microscopic level, where we follow the vehicles in a network of streets

at the macroscopic level, where assignments are based on computing shortest-length paths in a graph

in the algebraic duality between stochastic and deterministic assignments.

We develop free computing tools and models of traffic implementing our experience on optimization and discrete event system modeling based on maxplus algebra.

Let us consider a circular road with places that are either occupied by a car (symbolized by a 1) or free (symbolized by a 0). The dynamics is defined by the rule 10 → 01, which we apply simultaneously to all parts of the word m representing the system. For example, starting with m_1 = 1010100101 we obtain a sequence of words (m_i), each obtained from the previous one by moving every car that has a free place ahead of it.

For such a system we call density d the number of cars n divided by the number of places p, that is, d = n/p. We call flow f(t) at time t the number of cars that move during this time period divided by the number of places. The fundamental traffic law gives the relation between f(t) and d.

If the density is smaller than 1/2, after a transient period of time all the cars are separated and can move without interacting with the other cars. Then f(t) = n/p, which can be written as a function of the density as f(t) = d.

On the other hand, if the density is larger than 1/2, all the free places are separated after a finite amount of time and move backward freely. Then there are p - n cars which can move forward, and the relation between flow and density becomes

f(t) = (p - n)/p = 1 - d.
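The two regimes can be checked by simulating the rule 10 → 01 on a circular road (a Python sketch; the road length and initial words are hypothetical):

```python
# Elementary traffic model on a circular road (rule 10 -> 01):
# a car (1) advances iff the place to its right is free (0).
def step(road):
    p = len(road)
    moves = [road[i] == 1 and road[(i + 1) % p] == 0 for i in range(p)]
    new = [0] * p
    for i in range(p):
        if road[i] == 1 and not moves[i]:
            new[i] = 1                     # blocked car stays in place
        if moves[(i - 1) % p]:
            new[i] = 1                     # a car arrives from the left
    return new, sum(moves)                 # flow = moving cars / p

def asymptotic_flow(road, transient=100):
    p = len(road)
    for _ in range(transient):
        road, moved = step(road)
    return moved / p

# Density d = 3/10 < 1/2: flow d.   Density d = 7/10 > 1/2: flow 1 - d.
low = asymptotic_flow([1, 1, 1, 0, 0, 0, 0, 0, 0, 0])
high = asymptotic_flow([1, 1, 1, 1, 1, 1, 1, 0, 0, 0])
```

After the transient, both runs settle at a flow of 0.3, i.e., min(d, 1 - d) for d = 0.3 and d = 0.7 respectively.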

This can be stated formally: there exists a time T such that for all t ≥ T, f(t) stays equal to a constant f, with

f = min(d, 1 - d).
The fundamental traffic law linking the density of vehicles and the flow of vehicles can also be derived easily from maxplus modeling: in the deterministic case by computing the eigenvalue of a maxplus matrix, and in the stochastic case by computing a Lyapunov exponent of stochastic maxplus matrices.

The main research consists in developing extensions to systems of roads with crossings. In this case, we leave maxplus linear modeling and have to study more general dynamical systems. Nevertheless these systems can still be defined in matrix form using standard and maxplus linear algebra simultaneously.

With this point of view, efficient microscopic traffic simulators can be developed in Scilab.

Given a transportation network and a set of transportation demands, each from an origin to a destination, the *traffic assignment* problem consists in determining the flows f_a on the arcs a of the network when the times t_a spent on the arcs a are given functions of the flows f_a.

We can distinguish the deterministic case — when all the travel times are known by the users — from the stochastic cases — when the users perceive travel times different from the actual ones.

When the travel times are deterministic and do not depend on the link flows, the assignment reduces to computing the routes with the shortest travel times for each origin-destination pair.

When the travel times are deterministic and depend on the link flows, Wardrop equilibria are defined and computed by iterative methods based on the previous case.

When the perceived travel times do not depend on the link flows but are stochastic, with the error between the perceived time and the actual time following a Gumbel distribution, the probability that a user chooses a particular route can be computed explicitly. This probability has a Gibbs distribution, called logit in the transportation literature. From this distribution the arc flows (supposed to be deterministic) can be computed using a matrix calculus which can be seen as the counterpart of the shortest-path computation of the first case, up to the substitution of the minplus semiring by the Gibbs-Maslov semiring: the set of real numbers endowed with the two operations

a ⊕ b = -θ log(e^{-a/θ} + e^{-b/θ}),    a ⊗ b = a + b,

where θ > 0 is a parameter measuring the dispersion of the perceived times.

When the perceived travel times are stochastic and depend on the link flows (supposed to be deterministic quantities), stochastic equilibria are defined and can be computed using iterative methods based on the logit assignments discussed in the third case.

Based on this classification, a toolbox dedicated to traffic assignment is available and maintained in Scilab.
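The Gibbs-Maslov operations and the logit choice they induce can be sketched for a single origin-destination pair with two routes (a Python illustration with hypothetical travel times; θ is the dispersion parameter, and as θ → 0 the semiring addition tends to min, recovering the deterministic shortest-path case):

```python
import math

def gibbs_add(a, b, theta):
    """Gibbs-Maslov 'addition' of costs a and b at temperature theta:
    -theta*log(exp(-a/theta) + exp(-b/theta)), computed stably.
    As theta -> 0 this tends to min(a, b): the min-plus semiring."""
    small, big = min(a, b), max(a, b)
    return small - theta * math.log1p(math.exp(-(big - small) / theta))

# Two parallel routes with travel times 10 and 12 between one O-D pair.
t1, t2 = 10.0, 12.0
theta = 1.0

# Logit route choice: the probability of a route is proportional to
# exp(-time/theta) (a Gibbs distribution over routes).
w1, w2 = math.exp(-t1 / theta), math.exp(-t2 / theta)
p_fast = w1 / (w1 + w2)                  # most users take the fast route

cost_soft = gibbs_add(t1, t2, theta)     # composite cost in the semiring
cost_hard = gibbs_add(t1, t2, 1e-3)      # the shortest-path limit
```

The composite cost plays, in the logit assignment, the role that the shortest-path length plays in the deterministic assignment of the first case.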

We have used the techniques developed for modal analysis and diagnosis in many different applications: rotating machines, aircraft, car parts, space launchers and civil structures. The most recent examples are:

Eureka (FLITE) project: exploitation of flight test data under natural excitation conditions.

Ariane 5 launcher: application to a ground experiment (contract with CNES and EADS Space Transportation)

Steelquake: a European benchmark for a civil structure.

Scilab has been developed by the Metalau project and ENPC (J. Ph. Chancelier) with many external contributions. Since 2003, the Scilab Consortium has been in charge of the development and distribution of Scilab.

Concerning this software, the Metalau project is now involved in the transfer of knowledge to the Scilab team and in the maintenance and update of some toolboxes.

ScicosLab is a free environment for scientific computation similar in many respects to Matlab/Simulink, providing Matlab functionalities through Scilab 4, and Simulink and Modelica functionalities via Scicos. ScicosLab is a GTK version of Scilab, based on the Scilab BUILD4 distribution. In addition to the Gtk2 GUI, ScicosLab includes the maxplus built-in toolbox. Scilab and its predecessor Basile were developed in the Metalau (formerly Meta2) project. This work has been carried out in close collaboration with J. Ph. Chancelier of ENPC, who has made major contributions to Scilab such as the development of the graphics and the port to the Windows platform. The new version 4.4 of ScicosLab was released in December 2009, in cooperation with ENPC. This release contains the latest developments made for Scicos, mostly code generation.

Scicos, a tool for modeling, simulation and code generation included in ScicosLab

Control and signal processing toolboxes

CiudadSim Scilab Traffic Assignment toolboxes

COSMAD Output modal analysis and diagnosis

MAXPLUS Maxplus arithmetic and linear systems toolbox by the Maxplus Working Group

LMITOOL optimization for robust control applications

CUTEr Scilab toolbox for testing linear algebra and optimization contributions to Scilab


ScicosLab is made available by the researchers of the Metalau team at INRIA and ENPC who originally developed Scilab. ScicosLab is used in particular for distributing software stemming from the research activities at Metalau, such as Scicos.

ScicosLab (ex-ScilabGTK) is available for most Windows, Linux and MacOSX operating systems and can be downloaded from the ScicosLab web site.

The development of Scicos continues. Our development strategy consists in actively participating in R&D projects (ANR, European). We choose projects that support the long-term development of Scicos.

We released Scicos 4.4 in December 2009. This was a major release.

This release of Scicos includes:

a better coverage of the Modelica language (Modelica functionalities in Scicos are developed in collaboration with LMS-Imagine)

a user interface for assistance in finding consistent initial conditions for large Modelica models

a new, more efficient Scicos compiler

improved masking operations

hierarchical palette structures

major improvement in the C code generator allowing in particular code generation for a wider class of sub-systems.

an efficient XML format for load and save (to be used for model exchange with other programs)

The development of Scilab has been taken over by the Scilab team since the creation of the Scilab consortium. We remain active in this domain to help transfer knowledge to the Scilab team. This transfer mainly concerns numerical computation, Scilab language features and implementation subtleties. Since this year it has also concerned the Scicos tool.

Until this year, this transfer was carried out through a half-time assignment with the Scilab team. Since June 2008 it has continued through support from the Metalau team.

We also continued updating the contributed optimization toolboxes (FSQP, CUTEr, LIPSOL, Quapro, ...) with the newest solvers available. In this context we started an overall evaluation of these solvers, as well as of Scilab's native solver, using the large CUTEr problem set.

The research work on hybrid system modeling and simulation provides the backbone of the modeling and simulation software Scicos. Modeling hybrid systems in a rigorous fashion is the objective of the Scicos formalism.

Currently the major axes of research and development on Scicos formalism are:

**Modelica.** Modelica is a modeling language primarily devoted to continuous-time system modeling. We consider that extending the Modelica language to discrete-time dynamics, consistently with the Scicos formalism, would lead to a very powerful and broad modeling paradigm. We have been collaborating closely with the Modelica Association on the specification of the language in the spirit of the Scicos formalism. We have developed a Modelica compiler with LMS-Imagine which is now included in Scicos and distributed with ScicosLab 4.3. We work closely with EDF, IFP and PSA through various research contracts (in particular Simpa2 and Eurosyslib), and we are in close contact with the OpenModelica project at Linköping University. This work will be supported by the European project OpenProd starting in 2009. We also participate in the development of Modelica libraries for hybrid components (collaboration with Dassault Systèmes).

**Code generation.** We have made significant progress in the area of code generation. The Scicos code generator can now be used in the asynchronous framework. This has been particularly important for the integration of Modelica.

**Parade.** The ANR contract Parade concerns the study of parallel techniques for system simulation. Mrs Nguyen, an ESIEE trainee, has developed waveform simulation examples within Scicos, following the preceding work of D. Chapon. Our joint work with Lagep has continued, and in particular a preliminary Scicos toolbox for port-Hamiltonian systems has been started. An example of a transmission line has been developed using a Hamiltonian formalism, leading to discretization schemes that respect energy invariants. The collocation method used is based on ad hoc interpolation polynomials, and the quadratures have been performed numerically (the Matlab implementation made at Lagep uses symbolic calculations). Simone Mannori has implemented several examples of parallel solvers using the Sundials/MPI environment. The purpose is to exploit the new Scicos 4.4 code generator, which is now compatible with the parallel solvers of Sundials. The Lagep adsorption column model has been simulated by an automatically generated standalone Sundials solver.
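To illustrate what "discretization schemes which respect energy invariants" means in practice, here is a minimal, self-contained sketch (a toy harmonic oscillator, not the Lagep transmission-line scheme): a symplectic Störmer-Verlet integrator keeps the discrete energy bounded over long runs instead of letting it drift, unlike generic explicit schemes.

```python
# Toy example (not the Lagep scheme): symplectic (Stormer-Verlet)
# discretization of the harmonic oscillator H(q, p) = (q^2 + p^2)/2.
# Such schemes respect energy invariants: the discrete energy error
# stays small and bounded, with no secular drift.
def verlet(q, p, h, steps):
    """Integrate dq/dt = p, dp/dt = -q and return the trajectory."""
    traj = [(q, p)]
    for _ in range(steps):
        p -= 0.5 * h * q   # half kick  (force = -dH/dq = -q)
        q += h * p         # drift      (velocity = dH/dp = p)
        p -= 0.5 * h * q   # half kick
        traj.append((q, p))
    return traj

def energy(q, p):
    return 0.5 * (q * q + p * p)

traj = verlet(1.0, 0.0, 0.01, 100000)
# Over 10^5 steps the energy stays within O(h^2) of its initial value.
assert max(abs(energy(q, p) - 0.5) for q, p in traj) < 1e-4
```

The same bounded-energy property is what the Hamiltonian discretization of the transmission line aims for, in infinite dimensions.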

**Applications.** A modeling paradigm must respond to real needs. That is why we continue our close collaboration with industrial partners to better understand their problems. In the past few years we have been involved in the development of a number of applications with PSA, EDF, IFP, EADS, DGA and others.

ModNum ("MODulations NUMériques") is an open-source, free computational library for the modeling and simulation of communication systems. It provides Scicos blocks, schematics and Scilab in-line functions for baseband PSK/QAM modulations, in order to build communication chains in the Scilab/Scicos environment. Components used to build spread-spectrum communication systems, such as pseudo-noise sequence generators (quasi-chaotic, PN and Gold sequence generators), are also included. ModNum also includes miscellaneous scopes for Scicos, such as a spectrum analyzer scope and other scopes used for the analysis of digital transmissions (e.g. an eye diagram scope and a scatter diagram scope). Schematics and blocks for integer and fractional frequency synthesizer components (e.g. phase/frequency detector, VCO, Delta-Sigma modulators, ...) are provided. ModNum also focuses on the simulation of chaotic systems and provides simulation schematics for Chua's, Rössler's and Van der Pol's systems, among others. More information on ModNum is available on the ModNum web site.
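As a minimal illustration of the kind of baseband PSK mapping that such blocks implement (a toy sketch, not ModNum code), here is a Gray-coded QPSK modulator and nearest-neighbour demodulator:

```python
# Toy illustration (not ModNum code): Gray-coded QPSK mapping and
# demapping, the elementary operation behind baseband PSK chains.
import cmath

# Gray-mapped QPSK constellation: 2 bits -> unit-energy complex symbol.
QPSK = {
    (0, 0): cmath.rect(1, cmath.pi / 4),        #  45 degrees
    (0, 1): cmath.rect(1, 3 * cmath.pi / 4),    # 135 degrees
    (1, 1): cmath.rect(1, -3 * cmath.pi / 4),   # -135 degrees
    (1, 0): cmath.rect(1, -cmath.pi / 4),       # -45 degrees
}

def modulate(bits):
    """Map an even-length bit sequence to QPSK symbols."""
    return [QPSK[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def demodulate(symbols):
    """Nearest-neighbour decision back to bits."""
    out = []
    for s in symbols:
        pair = min(QPSK, key=lambda b: abs(QPSK[b] - s))
        out.extend(pair)
    return out

bits = [0, 0, 1, 1, 1, 0, 0, 1]
assert demodulate(modulate(bits)) == bits  # noiseless round trip
```

Gray coding is chosen so that adjacent constellation points differ in a single bit, minimizing bit errors for small phase noise.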

New releases were made during 2009 to take advantage of the new features available in the latest ScicosLab/Scicos.

We have developed a novel theory of robust active failure detection based on a multi-model formulation of failures. The results of years of research were published in a book in 2004.

We have continued to work on the extension of our approach to more general situations. Since 2008, we have mainly worked on the effects of feedback on our approach. This work is carried out as part of the thesis of A. Esna Ashari. The multi-model approach is still used to model the normal and the failed systems; however, the possible advantages of using linear dynamic feedback in the construction of the auxiliary signal for robust fault detection are considered, and the results are compared to the previously developed open-loop set-ups.

In our formulation of the active fault detection problem using feedback, we cannot use the norm of the auxiliary signal as the cost criterion, as was done previously, because in the feedback case the auxiliary signal depends on the noise through the feedback. We have therefore formulated a more general cost function by considering the worst-case scenario. This type of formulation is often used in robust control theory. We have given a complete solution to the problem.
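The core multi-model idea can be conveyed with a deliberately simple static sketch (toy gains and noise bounds, not the project's dynamic, worst-case formulation): an auxiliary input is useful when it forces the output sets of the normal and failed models apart, so that any observation identifies the true model.

```python
# Toy sketch (not the team's algorithm): the multi-model idea behind
# active fault detection.  Normal and failed plants are two scalar
# models y = g*u + v with bounded noise |v| <= 1; an auxiliary input u
# is "proper" when the two output intervals it produces are disjoint,
# so any observed output identifies the true model.
G_NORMAL, G_FAILED = 1.0, 0.5   # hypothetical model gains
NOISE_BOUND = 1.0               # assumed bound on the additive noise v

def output_interval(gain, u):
    """Set of outputs reachable by the model `gain` under input u."""
    return (gain * u - NOISE_BOUND, gain * u + NOISE_BOUND)

def is_proper(u):
    """u guarantees detection iff the two output intervals are disjoint."""
    lo0, hi0 = output_interval(G_NORMAL, u)
    lo1, hi1 = output_interval(G_FAILED, u)
    return hi1 < lo0 or hi0 < lo1

# u = 0 (passive detection) cannot separate the models; a large enough
# auxiliary signal can.  The smallest separating amplitude here is
# |u| = 2 * NOISE_BOUND / |G_NORMAL - G_FAILED| = 4.
assert not is_proper(0.0)
assert is_proper(4.1)
```

In the actual theory the models are dynamic, the noise is energy-bounded, and the goal is the smallest (or, with feedback, worst-case-optimal) auxiliary signal with this separation property.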

We continue the work on active failure detection by developing more efficient algorithms to be used for very large problems.

The new results mainly concern modifications of the identification procedure to improve the robustness and the quality of the results. Theoretical results are still under consideration for more general models subject to exogenous phenomena (noise, temperature, fluids, ...).

For diagnosis, the studies focus on fast on-line detection (e.g. for flutter) and on improving the physical diagnosis.

EFDA (European Fusion Development Agreement, http://)

Kepler is a web simulation environment not particularly focused on digital control and automatic code generation for real-time embedded applications. In order to speed up the development of digital controllers and their implementation on physical units, the CEA-ITM (Integrated Tokamak Modelling) team, led by Sylvain Bremond, has chosen the METALAU project for the development of Scicos-ITM. Scicos-ITM allows the design and simulation of the advanced digital controllers required for plasma fusion applications. Using recent developments by Roberto Bucher, we have adapted the Scicos-RTAI code generator (derived from the Scicos built-in code generator) to ITM's requirements. Scicos-ITM is now able to produce a Kepler actor (analogous to a Scicos block/superblock) that contains a complete digital controller. Scicos-ITM automatically generates the C code used by the FC2K (Fortran, C to Kepler) utility developed by CEA-ITM. We have developed some specific Scicos blocks that allow data exchange between Scicos and Kepler. The same code generator is also able to produce standalone code for the implementation of controllers on physical units (embedded PPC and x86 boards connected to the Tore Supra tokamak, used as a test bed platform for ITER). Scicos-ITM has been developed as a ScicosLab toolbox using ScicosLab GTK 4.3 and RTAI 3.7.1 as reference codes. The source code is made available to the ITM/EFDA team on a server and locally compiled on a 64-bit x86 multicore machine.

The contract will be prolonged in 2010 for further developments and migration to the updated ScicosLab versions.

The thesis of N. Farhi, dedicated to maxplus modeling of microscopic traffic, was defended in June 2008. A difficult result was obtained on the explicit computation of an approximation of the fundamental diagram, given by a generalized additive eigenvalue, for two roads with a junction. This result gives good insight into what happens on general road networks. In this diagram phases appear which have a nice traffic interpretation. This year was dedicated to improving the proofs and to publishing the results in a traffic journal and in a more mathematically oriented journal.
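For readers unfamiliar with additive eigenvalues: in maxplus algebra the additive eigenvalue of an irreducible matrix equals its maximal cycle mean, and in traffic models it is interpreted as an average growth (flow) rate. A small illustrative sketch (not the thesis computation) using Karp's algorithm:

```python
# Illustrative sketch (not the thesis computation): the maxplus
# (additive) eigenvalue of an irreducible matrix equals its maximal
# cycle mean, computed here by Karp's algorithm.  NEG_INF encodes the
# absence of an arc.
NEG_INF = float("-inf")

def maxplus_eigenvalue(A):
    """Karp's maximal cycle mean of a square, irreducible matrix A."""
    n = len(A)
    # D[k][v] = max weight of a k-arc path from node 0 ending at node v.
    D = [[NEG_INF] * n for _ in range(n + 1)]
    D[0][0] = 0.0
    for k in range(1, n + 1):
        for v in range(n):
            D[k][v] = max(
                (D[k - 1][u] + A[u][v] for u in range(n) if A[u][v] > NEG_INF),
                default=NEG_INF,
            )
    # Karp's formula: lambda = max_v min_k (D_n(v) - D_k(v)) / (n - k).
    best = NEG_INF
    for v in range(n):
        if D[n][v] > NEG_INF:
            best = max(best,
                       min((D[n][v] - D[k][v]) / (n - k)
                           for k in range(n) if D[k][v] > NEG_INF))
    return best

# Two-node example: cycles are the self-loop (mean 1) and the cycle
# 0 -> 1 -> 0 (mean (2 + 3) / 2 = 2.5); the eigenvalue is the larger.
A = [[1.0, 2.0],
     [3.0, NEG_INF]]
assert maxplus_eigenvalue(A) == 2.5
```

The thesis result concerns a generalized version of this eigenvalue problem, where the explicit computation for two roads with a junction is much harder.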

During this writing and reviewing process, a possible extension of the main equation, which is finite-dimensional, to the infinite-dimensional case appeared, obtained by passing to the limit on the size of the cells on the road. The eigenvalue of the HJB equation obtained in this way can be linked with the very classical Lighthill-Whitham equation, and its eigenvalue can be computed explicitly in the case of a circular road. We are now studying what happens when an intersection appears. It could be an HJB equation on a manifold with a singularity, for which it is possible to compute the eigenvalue explicitly. This idea will be explored in the coming months.

We consider a Merton problem corresponding to the optimization of a portfolio of two assets, where the risky one follows the Dow Jones index and the second one has a sure reward. An empirical method was found last year yielding very high returns over the period from 1900 to 1980. It is close to the current practice of comparing the index with a filtered version of it. The result is much better than that of the standard stochastic control method, which is based on an identified return process, very often assumed stationary. In fact the return process is not stationary, and it is very difficult to identify the drift term locally, but it is easier to test whether the drift is locally greater or smaller than the return of the sure asset. The corresponding theory is well studied, for example in the Basseville-Nikiforov book. We are exploring this point of view to justify the empirical method. A paper on the subject is in preparation.
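A filtered-index rule of the kind alluded to above can be sketched as follows (a toy backtest with hypothetical parameters and synthetic data, not the team's method or results): hold the risky asset while the index is at or above its exponential moving average, and the sure asset otherwise.

```python
# Toy sketch (hypothetical rule and data, not the team's method):
# switch between a risky index and a sure asset by comparing the index
# with a filtered (exponential moving average) version of itself.
def backtest(prices, riskless_rate=0.0, alpha=0.1):
    """Final wealth of the filter rule, starting from wealth 1."""
    wealth, ema = 1.0, prices[0]
    for t in range(1, len(prices)):
        risky_return = prices[t] / prices[t - 1] - 1.0
        # Invested in the risky asset iff the index was at or above its filter.
        wealth *= 1.0 + (risky_return if prices[t - 1] >= ema else riskless_rate)
        ema = alpha * prices[t] + (1.0 - alpha) * ema  # update the filter
    return wealth

# On a monotone up-trend the rule stays invested and tracks the index.
up = [100.0 * 1.01 ** t for t in range(50)]
assert abs(backtest(up) - up[-1] / up[0]) < 1e-9
```

The point of the surrounding analysis is that such a comparison amounts to a sequential test of whether the local drift exceeds the sure return, which is where the Basseville-Nikiforov change-detection theory enters.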

Motivated by this issue, we started to study the climate modeling literature to identify domains where we could contribute. Our first interest was in very simple models where the system is naturally stabilized by a change of albedo when the temperature increases, like Lovelock's Daisyworld, but these are far too simplistic. Our second interest was in the global circulation models of the atmosphere and ocean, to see whether it is possible to design a ScicosLab toolbox dedicated to climate simulation. After a while it appeared clearly that models of this kind can hardly make predictions precise enough to be useful. Our third interest was in the very rich literature of the sceptics about anthropogenic warming. In particular, we looked more deeply at the radiation modeling of the atmosphere, for which the influence of atmospheric vapor and clouds is largely debatable. In the end, it seems that there is still room for studies on classes of simplified Earth models. For example, the linearization of the dynamics of a simplified Earth in a thermal periodic regime could, by computation of eigenvectors, give information about the best places to measure the temperature in order to detect an abrupt change in the dynamics, if there is one. This sensor optimization and detection problem could be an interesting research subject if the remaining lifetime of the METALAU project were long enough.

The international Numerical Mathematics Consortium (NMC) was founded to address this lack of standardization. It aims to provide recommendations and standardized, well-organized information for a fundamental set of math functionalities applicable to most disciplines, so as to reduce the overall cost of numerical algorithm development and to increase algorithm portability. Defining a standard implies proposing compliance checking and verification tools. Founded in 2005, the NMC has already published three drafts of the standard, focused on "semantic" function definitions. This year our work concerned further reflections on validation and verification, the production of a fourth draft of the standard, and the establishment of a wiki to support the collaborative development of the standard.

Objective: continue the work of the Simpa project, i.e., extend Scicos capabilities to allow the use of "implicit blocks". This is done by developing a Modelica compiler in collaboration with LMS-Imagine and interfacing it with Scicos. Examples are provided by EDF and IFP.

The objective of this project is the development of parallel numerical algorithms for the real-time simulation of systems of differential-algebraic equations. Imagine, Siemens and Lagep are the other partners of the project.

The participants are Airbus Industries, Barco, EADS Astrium, FERIA, B Krates, INRIA, Israel Aircraft Industries, Siemens VDO Automotive (leader) and Tallinn Technical University.

In this project, both industry and universities seek to develop, use and improve tools and methods for the design, development, integration and validation of embedded software-intensive systems. The overall goal of the project is to create a proper prototype for industrial code generation starting from a diverse set of desktop system development tools, in particular Matlab/Simulink/Stateflow and Scilab/Scicos. Additional efforts will be necessary after the end of the project to make an industrial product out of this prototype, as well as certification activities for the respective industrial domains.

This ITEA European project started in 2006. Metalau's experience in hybrid system modeling, simulation and code generation is an important factor in the success of this project.

This contract finished this year; we mainly worked on the translator of Scicos diagrams to the latest Gene-Auto intermediate language and on extensive testing of the complete code generation tool chain. The full Gene-Auto tool chain, including the Scicos interface, has been released under the GPL licence.

The objective is to develop Modelica libraries to make the Modelica language a standard powerful modeling environment. Metalau participates in this project to help shape the hybrid formalism in Modelica and develop associated library components, which can then be used in the Implicit Scicos environment.

The objective is to develop a simulation tool for new generations of communication and radar systems. Metalau participates in this project to improve the efficiency of the Scicos code generator, notably by making all of the Scicos formalism available in the generated code. We are also working on the portability of the generated code, in order to embed a standalone Scicos simulator in software applications.

Altair Engineering Inc. has purchased a non-exclusive worldwide operating license of the software Scicos developed in Metalau.

R. Nikoukhah. Member of the International Program Committee of the Mediterranean Control and Automation Conference.

R. Nikoukhah. Member of IFAC Technical Committee on Fault Detection, Supervision and Safety in Technical Processes (SAFEPROCESS TC).

R. Nikoukhah. Member of International Program Committee for SAFEPROCESS.

R. Nikoukhah. Senior Member of IEEE.

E. Rofman. Nominated as "Ad-Honorem" adviser for Productive Science, Technology and Innovation by the Government of the Province of Santa Fe, Argentina.

S. Steer. Member of the Numerical Mathematics Consortium.

S. Steer. Member of the International Workshop on Open-source Software for Scientific Computation program committee.

North Carolina State University (USA): on failure detection and numerical solution of hybrid DAEs under the coordination of R. Nikoukhah.

OCSID (Optimisation et Contrôle des Systèmes Dynamiques) - Argentinian institute of Mathematics IAM-CONICET, under the coordination of E. Rofman.

Maurice Goursat, Serge Steer. This workshop on numerical simulation, organized by the Tizi-Ouzou university (Algeria), gathered 40 teachers and researchers from Algeria, Tunisia and Morocco. It included a Scilab/Scicos session. The workshop is part of an effort to spread the culture of numerical computation through the mathematics universities of the Maghreb.

Serge Steer. Scilab, a free software package for scientific computing, courses organized by the École Polytechnique.

J.P. Quadrat

1st Montreal Workshop on Idempotent and Tropical Mathematics, July 2009.

S. Steer

International Workshop on Open-source Software for Scientific Computation, Guiyang, China, September 2009.

F. Delebecque

- DEA MMMEF, University of Paris 1. Systems Theory course.

S. Steer

- Polytechnique, chaire Thales: “master Conception et Management des Systèmes Informatiques Complexes (COMASIC)”, 2nd year.

Alireza Esna Ashari, supervised by Ramine Nikoukhah.

Damien Chapon, supervised by Ramine Nikoukhah.

Mohamad Koubar, supervised by Alan Layec.

Thu-Hien Nguyen Thi, supervised by François Delebecque.