The Hycomes team was created as a new team of the Inria Rennes – Bretagne Atlantique research center in July 2013. The team builds upon the most promising results of the former S4 project-team and of the Synchronics large-scale initiative. Two topics in embedded system design are covered:

Hybrid systems modelling, with applications to the design of multi-physics embedded systems, often referenced as cyber-physical systems;

Contract-based design and interface theories, with applications to requirements engineering in the context of safety-critical systems design.

Systems industries today make extensive use of mathematical modeling tools to design computer-controlled physical systems. This class of tools addresses the modeling of physical systems with models that are simpler than those of classical scientific computing, in that they involve only Ordinary Differential Equations (ODEs) and difference equations, not Partial Differential Equations (PDEs). This family of tools first emerged in the 1980's with SystemBuild by MatrixX (now distributed by National Instruments), soon followed by Simulink by Mathworks, with an impressive subsequent development.

In the early 90's, control scientists from the University of Lund (Sweden) realized that the above approach did not support component-based modeling of physical systems with reuse.

Although these tools are now widely used by many engineers, they raise a number of technical difficulties. The meaning of some programs, that is, their mathematical semantics, can be tainted with uncertainty. A main source of difficulty lies in the failure to properly handle the discrete and continuous parts of systems, and their interaction. How should the propagation of mode changes and resets be handled? How can one avoid artifacts due to the use of a global ODE solver, causing unwanted coupling between seemingly non-interacting subsystems? Also, the mixed use of an equational style for the continuous dynamics and an imperative style for the mode changes and resets is a source of difficulty when handling parallel composition. It is therefore not uncommon for tools to return complex warnings for programs, with many different suggested hints for fixing them. Yet these “pathological” programs can still be executed, if so desired, giving surprising results — see for instance the Simulink examples in , and .

Indeed, this area suffers from the same difficulties that led to the development of the theory of synchronous languages in the 1980's, as an effort to fix obscure compilation schemes for discrete-time equational languages. Our vision is that hybrid systems modeling tools deserve the same theoretical attention that synchronous languages received for the programming of embedded systems.

Non-standard analysis plays a central role in our research on hybrid systems modeling , , , . The following text provides a brief summary of this theory and gives some hints on its usefulness in the context of hybrid systems modeling. This presentation is based on our paper , a chapter of Simon Bliudze's PhD thesis , and a recent presentation of non-standard analysis, not axiomatic in style, due to the mathematician Lindstrøm .

Non-standard numbers allowed us to reconsider the semantics of hybrid systems and to propose a radical alternative to the *super-dense time semantics* developed by Edward Lee and his team as part of the Ptolemy II project, in which cascades of successive instants can occur in zero time. Our alternative relies instead on *infinitesimals* and *non-standard integers*. Remark that this *non-standard semantics* provides a framework that is familiar to the computer scientist and, at the same time, efficient as a symbolic abstraction. This makes it an excellent candidate for the development of provably correct compilation schemes and type systems for hybrid systems modeling languages.

Non-standard analysis was proposed by Abraham Robinson in the 1960s to allow the explicit manipulation of “infinitesimals” in analysis , , . Robinson's approach is axiomatic; he proposes adding three new axioms to the basic Zermelo-Fraenkel (ZFC) framework. There has been much debate in the mathematical community as to whether it is worth considering non-standard analysis instead of staying with the traditional one. We do not enter this debate. The important thing for us is that non-standard analysis allows the use of the non-standard discretization of continuous dynamics “as if” it were operational.

Not surprisingly, such an idea is quite ancient. Iwasaki et al. first proposed using non-standard analysis to discuss the nature of time in hybrid systems. Bliudze and Krob have also used non-standard analysis as a mathematical support for defining a system theory for hybrid systems. They discuss in detail the notion of “system” and investigate computability issues. The formalization they propose closely follows that of Turing machines, with a memory tape and a control mechanism.

The introduction to non-standard analysis in is very pleasant, and we take the liberty of borrowing from it. This presentation was originally due to Lindstrøm, see . Its interest is that it does not require any fancy axiomatic material, but only makes use of the axiom of choice — actually a weaker form of it. The proposed construction bears some resemblance to the construction of the reals as equivalence classes of Cauchy sequences of rationals.

We begin with an intuitive introduction to the construction of the non-standard reals. The goal is to augment the set $\mathbb{R}$ of real numbers with infinitesimal and infinite numbers.

A first idea is to represent such additional numbers as convergent sequences of reals. For example, elements infinitesimally close to the real number zero are the sequences converging to zero, such as $(1/n)_{n\in\mathbb{N}}$.

Unfortunately, comparing two such sequences $(u_n)$ and $(v_n)$ requires deciding which of the three index sets $\{n \mid u_n < v_n\}$, $\{n \mid u_n = v_n\}$ and $\{n \mid u_n > v_n\}$ matters: the rule is that *exactly one of the above sets is important and the other two can be neglected*. This is achieved by fixing once and for all a finitely additive positive measure over $\mathbb{N}$ with values in $\{0,1\}$, for which every finite set has measure $0$.

Now, once such a measure is fixed, two sequences are identified whenever they coincide on a set of measure $1$, and the non-standard reals are defined as the resulting equivalence classes of sequences.

The family $\mathcal{F}$ of subsets of $\mathbb{N}$ having measure $1$ is a *filter*: it is closed under supersets and finite intersections, and the empty set does not belong to $\mathcal{F}$. Moreover, since every subset of $\mathbb{N}$ or its complement has measure $1$, $\mathcal{F}$ is an *ultra-filter*. At this point we recall Zorn's lemma, known to be equivalent to the axiom of choice:

**Lemma 1 (Zorn's lemma)**
*Any partially ordered set $(X,\le )$ such that any chain in $X$
possesses an upper bound has a maximal element.*

A filter containing no finite set is called a *free* filter. The filter of cofinite subsets of $\mathbb{N}$ is free; by Zorn's lemma, it can thus be extended to a free ultra-filter over $\mathbb{N}$, whence:

**Lemma 2**
Any infinite set has a free ultra-filter.

Every free ultra-filter defines, as above, a finitely additive $\{0,1\}$-valued measure for which every finite set has measure $0$. Note, however, that it is *not* true that this measure is $\sigma$-additive.

Now, fix an infinite set $I$ and such a measure over it. The general construction extends any set $X$ to its non-standard counterpart ${}^{\star}X$, whose elements are equivalence classes of $I$-indexed sequences over $X$, two sequences being identified when they coincide on a set of measure $1$.

**Lemma 3 (Transfer Principle)**
*Every first-order formula is true over ${}^{\star}X$ if and only if it is true over $X$.*
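As a simple illustration of the transfer principle (the example is ours): the Archimedean property,

```latex
\forall x \; \exists n \; \left( n \in \mathbb{N} \;\wedge\; n > x \right),
```

is a first-order formula true over $\mathbb{R}$; by transfer it holds over the non-standard reals, with $n$ now ranging over the non-standard naturals. In particular, every infinite non-standard real is dominated by some infinite non-standard integer.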

The above general construction can simply be applied to the set $\mathbb{R}$ of real numbers, yielding the set ${}^{\star}\mathbb{R}$ of non-standard reals. Every finite non-standard real is infinitesimally close to a unique real number, called the *standard part* of it.

To prove this, let $x$ be a finite non-standard real; its standard part is the least upper bound of the set of reals smaller than $x$.

It is also of interest to apply the general construction () to the set $\mathbb{N}$ of natural numbers, yielding the set ${}^{\star}\mathbb{N}$ of *non-standard natural numbers*. The set ${}^{\star}\mathbb{N} \setminus \mathbb{N}$ consists of the *infinite natural numbers*, which are equivalence classes of sequences of integers whose essential limit is $+\infty$.
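To give a flavor of how this machinery serves hybrid systems modeling (a condensed rendition of the idea, in our own notation): fix an infinitesimal time step $\partial$ and interpret the ODE $\dot x = f(x)$ “as if” it were the operational difference equation

```latex
x(t + \partial) \;=\; x(t) + \partial \, f(x(t)),
\qquad t \in \{\, n\,\partial \mid n \in {}^{\star}\mathbb{N} \,\},
```

whose standard part recovers the usual solution over any finite horizon. This is the sense in which non-standard analysis makes the discretization of continuous dynamics operational.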

System companies, such as automotive and aeronautic companies, are facing significant difficulties due to the exponentially rising complexity of their products, coupled with increasingly tight demands on functionality, correctness, and time-to-market. The cost of being late to market, or of imperfections in the products, is staggering, as witnessed by the recalls and delivery delays that many major car and airplane manufacturers had to bear in recent years. The specific root causes of these design problems are complex and relate to a number of issues, ranging from design processes and relationships between different departments of the same company and with suppliers, to incomplete requirement specification and testing.

We believe the most promising means to address the challenges in systems engineering is to employ structured and formal design methodologies that seamlessly and coherently combine the various viewpoints of the design space (behavior, space, time, energy, reliability, ...), that provide the appropriate abstractions to manage the inherent complexity, and that can provide correct-by-construction implementations. The following technology issues must be addressed when developing new approaches to the design of complex systems:

The overall design flows for heterogeneous systems, and the associated use of models across traditional boundaries, are not well developed and understood. Relationships between different teams within the same company, or between different stakeholders in the supply chain, are not well supported by solid technical descriptions of mutual obligations.

System requirements capture and analysis is in large part a heuristic process, where the informal text and natural language-based techniques in use today are facing significant challenges. Formal requirements engineering is in its infancy: mathematical models, formal analysis techniques and links to system implementation must be developed.

Dealing with variability, uncertainty, and life-cycle issues, such as the extensibility of a product family, is not well addressed by available systems engineering methodologies and tools.

The challenge is to address the entire process and not to consider only local solutions of methodology, tools, and models that ease part of the design.

*Contract-based design* has been proposed as a new approach to
the system design problem that is rigorous and effective in dealing
with the problems and challenges described before, and that, at the
same time, does not require a radical change in the way industrial
designers carry out their tasks, as it cuts across design flows of different types.
Indeed, contracts can be used almost everywhere and at nearly all
stages of system design, from early requirements capture, to embedded
computing infrastructure and detailed design involving circuits and
other hardware. Contracts explicitly handle pairs of properties,
respectively representing the assumptions on the environment and the
guarantees of the system under these assumptions. Intuitively, a contract is a pair $(A, G)$: an assumption $A$ on the environment, and a guarantee $G$ that the component offers whenever the assumption is met.
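As an illustration of this pair structure, here is a toy model of A/G-contracts over a small finite universe of traces. The universe, the saturation rule and the refinement and composition operations follow the standard contract algebra; all concrete names and values are ours, not taken from a specific tool:

```python
# Toy assume/guarantee contracts: traces are plain integers, and a contract
# is a pair (A, G) of frozensets of traces. This is an illustrative sketch
# of the standard contract algebra, not the team's actual formalism.

U = frozenset(range(8))  # hypothetical universe of 8 possible traces

def saturate(A, G):
    """Normal form: the contract offers G whenever A holds, i.e. G or not-A."""
    return (A, G | (U - A))

def refines(c1, c2):
    """c1 refines c2: weaker assumptions, stronger (saturated) guarantees."""
    (A1, G1), (A2, G2) = saturate(*c1), saturate(*c2)
    return A2 <= A1 and G1 <= G2

def compose(c1, c2):
    """Parallel composition of two saturated contracts."""
    (A1, G1), (A2, G2) = saturate(*c1), saturate(*c2)
    G = G1 & G2
    A = (A1 & A2) | (U - G)  # the environment discharges the joint assumptions
    return (A, G)

c_strong = (frozenset({0, 1, 2, 3}), frozenset({0, 1}))
c_weak   = (frozenset({0, 1}),       frozenset({0, 1, 2}))
print(refines(c_strong, c_weak))  # → True
```

Saturation replaces $G$ by $G \cup \neg A$, so that two contracts with the same saturated form are interchangeable; refinement then reads as "accept more environments, promise no less".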

Mathematical foundations for interfaces and requirements engineering that enable the design of frameworks and tools;

A system engineering framework and associated methodologies and tool sets that focus on system requirements modeling, contract specification, and verification at multiple abstraction layers.

A detailed bibliography on contract and interface theories for embedded system design can be found in . In a nutshell, contract and interface theories fall into two main categories:

By explicitly relying on the
notions of assumptions and guarantees, A/G-contracts are intuitive,
which makes them appealing for the engineer. In A/G-contracts,
assumptions and guarantees are just properties regarding the
behavior of a component and of its environment. The typical case is
when these properties are formal languages or sets of traces, which
includes the class of safety
properties , , , , . Contract
theories were initially developed as specification formalisms able
to refuse some inputs from the
environment . A/G-contracts were advocated
by the Speeds project . They
were further experimented in the framework of the CESAR
project , with the additional consideration of
*weak* and *strong* assumptions. This is still a very
active research topic, with several recent contributions dealing
with the timed and probabilistic viewpoints in system design, and even mixed-analog circuit design .

Interfaces combine assumptions
and guarantees in a single, automata theoretic specification. Most
interface theories are based on Lynch Input/Output
Automata , . Interface
Automata , , ,
focus primarily on parallel composition and compatibility: Two
interfaces can be composed and are compatible if there is at least
one environment where they can work together. The idea is that the
resulting composition exposes as an interface the needed information
to ensure that incompatible pairs of states cannot be reached. This
can be achieved by using the possibility, for an Interface
Automaton, to refuse selected inputs from the environment in a given
state, which amounts to the implicit assumption that the environment
will never produce any of the refused inputs, when the interface is
in this state. Modal
Interfaces inherit from both
Interface Automata and the originally unrelated notion of Modal
Transition
System , , , . Modal
Interfaces are strictly more expressive than Interface Automata by
decoupling the I/O orientation of an event and its deontic
modalities (mandatory, allowed or forbidden). Informally, a
*must* transition is available in every component that realizes
the modal interface, while a *may* transition need not be. Research on interface theories is still very active. For
instance,
timed , , , , , ,
probabilistic ,
and energy-aware interface theories have
been proposed recently.
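The may/must discipline sketched above can be illustrated as follows. This is a deliberately simplified rendition, assuming for illustration that both interfaces are flattened into sets of transitions over the same state space; it ignores the I/O orientation and alphabet handling of actual Modal Interfaces, and all names are ours:

```python
# A modal interface, flattened: a pair (may, must) of sets of transitions
# (state, action, state). Illustrative sketch of the may/must discipline only.

def consistent(may, must):
    """Every mandatory transition must also be allowed."""
    return must <= may

def refines(impl, spec):
    """impl refines spec: every mandatory transition of spec stays mandatory,
    and impl allows nothing that spec does not allow."""
    may_i, must_i = impl
    may_s, must_s = spec
    return must_s <= must_i and may_i <= may_s

spec = ({("s0", "req", "s1"), ("s1", "ack", "s0"), ("s0", "log", "s0")},
        {("s0", "req", "s1"), ("s1", "ack", "s0")})   # (may, must)
impl = ({("s0", "req", "s1"), ("s1", "ack", "s0")},   # drops the optional "log"
        {("s0", "req", "s1"), ("s1", "ack", "s0")})
print(consistent(*impl), refines(impl, spec))  # → True True
```

Here the implementation legitimately drops the optional `log` self-loop (a *may* transition) while keeping the mandatory request/acknowledge cycle (*must* transitions).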

Requirements Engineering is one of the major concerns in large systems industries today, particularly in sectors where certification prevails . DOORS projects collecting requirements are poorly structured and cannot be considered a formal modeling framework today; they are nothing more than informal documentation enriched with hyperlinks. As examples, medium-size sub-systems may have a few thousand requirements, and the Rafale fighter aircraft has more than 250,000 of them. For the Boeing 787, requirements were not stable while subcontractors carried out the development of the fly-by-wire and landing gear subsystems.

We see Contract-Based Design and Interface Theories as innovative tools in support of Requirements Engineering. The Software Engineering community has extensively covered several aspects of Requirements Engineering, in particular:

the development and use of large and rich *ontologies*; and

the use of Model Driven Engineering technology for the structural aspects of requirements and resulting hyperlinks (to tests, documentation, PLM, architecture, and so on).

Behavioral models and properties, however, are not properly encompassed by the above approaches. This is the cause of a remaining gap between this phase of systems design and later phases, where formal model-based methods involving behavior have become prevalent — see the success of the Matlab/Simulink/Scade technologies. We believe that our work on contract-based design and interface theories is best suited to bridge this gap.

Hybrid systems modeling plays a particular role in the design of cyber-physical systems, i.e., systems mixing physical devices, computing platforms, communication buses, and control and diagnosis software. A faithful model of the physical environment is a key element in the successful design of a cyber-physical system.

Several types of physical components can be found in a system: for example, mechanical, hydraulic or electrical components. Component models should cover several viewpoints. For instance, the three viewpoints of an electronic device would be its electrical, thermal and reliability models. All these viewpoints interact, and it is not possible to analyze any of them in isolation. Beyond these complex cross-viewpoint interactions, modeling physics requires refined mathematics. For instance, it is a misconception to assume that physical laws result in smooth dynamics that can be captured by systems of ordinary differential equations. On the contrary, physics is often nonsmooth, meaning that trajectories may be discontinuous; consider the example of colliding billiard balls. Physical systems are networks of elementary components. The dynamics of each component can often be captured by a simple (differential) equation. However, these equations are coupled by network equations (Kirchhoff laws, mechanical couplings, ...) resulting from the structure of the system. The end result is a system mixing differential equations with linear or nonlinear algebraic constraints: a system of differential algebraic equations (DAE).
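As a minimal illustration of this coupling (component names and values are ours), consider a capacitor and a resistor connected in parallel: each component contributes its own equation, while the Kirchhoff laws come from the network structure:

```latex
\begin{align*}
  C\,\dot v_C &= i_C    && \text{(capacitor component equation)}\\
  v_R &= R\,i_R         && \text{(resistor component equation)}\\
  v_C &= v_R            && \text{(Kirchhoff voltage law)}\\
  i_C + i_R &= 0        && \text{(Kirchhoff current law)}
\end{align*}
```

One differential equation together with three algebraic constraints: already a (simple, index-1) DAE rather than a plain ODE.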

The Hycomes team is focusing on the design of hybrid systems modeling languages with DAE and nonsmooth dynamics (Filippov differential inclusions, or complementarity systems), with applications in the energy industry (power plants, smart grids) and in the railway, automotive and aeronautic industries — see section for a deeper insight into the research program.

The design of embedded systems on board certified civil aircraft, for instance navigation, fly-by-wire and FADEC (Full Authority Digital Engine Control) applications, has to follow a stringent discipline imposed by civil aviation authorities. Designers have to provide evidence that both the design process they used and the system under design meet several industry standards, including the well-known ED-79/ARP-4754A and DO-178A/B standards regarding hardware and software artifacts.

These standards prescribe that every feature of a design be traceable to one or several system-level requirements. Conversely, evidence shall be provided that every requirement has been accounted for. Correctness, consistency, compatibility and completeness of requirements are four key properties described in the ED-79/ARP-4754A standard, which should also be assessed every time requirements are transformed. This puts a high burden on designers, especially on the system architect: requirements capture and analysis is by and large a heuristic and manual process.

Formal requirements engineering is in its infancy: mathematical models, formal analysis techniques and links to system implementation must be developed. We advocate the use of contract-based reasoning techniques (see section ) to support requirements engineering activities, during the early stages of the design process .

Mica is an OCaml library developed by Benoît Caillaud, implementing the Modal Interface algebra published in , . The purpose of Modal Interfaces is to provide formal support to contract-based design methods in the field of systems engineering. Modal Interfaces enable compositional reasoning methods on I/O reactive systems.

In Mica, systems and interfaces are represented in extension, meaning that states and transitions are enumerated explicitly. However, a careful design of the state and event heap enables the definition, composition and analysis of reasonably large systems and interfaces. The heap stores states and events in a hash table and ensures structural equality (there is no duplication). Therefore, complex data structures for states and events induce very low overhead, as checking equality is done in constant time.
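The hash-consing scheme described above can be sketched as follows. This is a generic illustration in Python, not Mica's actual OCaml implementation; the class and value names are ours:

```python
# Hash-consing ("interning"): store each structurally distinct value once in a
# table, and always hand out the stored representative. Structural equality
# then reduces to pointer identity, a constant-time check.

class Heap:
    def __init__(self):
        self._table = {}

    def intern(self, value):
        # value must be hashable, e.g. nested tuples describing a state/event
        return self._table.setdefault(value, value)

heap = Heap()
a = heap.intern(("state", (1, 2)))
b = heap.intern(("state", (1, 2)))
print(a is b)  # the two structurally equal states share one representative
```

The second `intern` call finds an equal key already in the table and returns the first object, so deep comparisons are paid only once, at interning time.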

Thanks to the Inter module and the mica interactive environment, users can define complex systems and interfaces using OCaml syntax. It is even possible to define parameterized components as OCaml functions.

Mica is available as an open-source distribution, under the CeCILL-C
Free Software License Agreement
(http://

Flipflop is a Test and Flip net synthesis tool implementing a linear algebraic, polynomial-time algorithm. Computations are done in the two-element Galois field $\mathbb{Z}/2\mathbb{Z}$.
TnF-C++ is a robust and portable re-implementation of Flipflop, developed in 2014 and integrated in the S3PM toolchain. Both tools have been designed in the context of the S3PM project on surgical procedure modeling and simulation (see section ).

The main advances in 2014 of the Hycomes team have been as follows:

We have proposed a causality analysis, in the form of a simple type system, rejecting hybrid programs with algebraic circuits — see section .

We have proposed a conservative extension of the notion of differentiation index to hybrid systems with differential algebraic equations — see section .

Explicit hybrid systems modelers like Simulink / Stateflow allow for programming both discrete- and continuous-time behaviors with complex interactions between them. A key issue in their compilation is the static detection of algebraic or causality loops. Such loops can cause simulations to deadlock and prevent the generation of statically scheduled code. In (also published as a deliverable of the Sys2Soft collaborative project , see ), we address this issue for a hybrid modeling language that combines synchronous Lustre-like data-flow equations with Ordinary Differential Equations (ODEs). We introduce the operator last(x) for the left-limit of a signal x. This operator is used to break causality loops and permits a uniform treatment of discrete and continuous state variables. The semantics relies on non-standard analysis, defining an execution as a sequence of infinitesimally small steps. A signal is deemed causally correct when it can be computed sequentially and only progresses by infinitesimal steps outside of discrete events. The causality analysis takes the form of a simple type system. In well-typed programs, signals are proved continuous during integration and can be translated into sequential code for integration with off-the-shelf ODE solvers. The effectiveness of this system is illustrated with several examples written in Zélus, a Lustre-like synchronous language extended with hierarchical automata and ODEs.
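The static detection of algebraic loops described above can be sketched as a cycle check on the instantaneous dependency graph. This is a hypothetical, much-simplified rendition of the idea behind the type system, not its actual formulation; the signal names and the dictionary encoding are ours:

```python
# Each signal maps to the set of signals it reads *instantaneously*.
# Reading last(x), the left-limit of x, is not an instantaneous read,
# which is how causality loops are broken.

def schedule(equations):
    """Return a sequential evaluation order, or raise on a causality loop."""
    order, done, visiting = [], set(), set()

    def visit(x):
        if x in done:
            return
        if x in visiting:
            raise ValueError(f"causality loop through {x}")
        visiting.add(x)
        for dep in equations.get(x, set()):
            visit(dep)
        visiting.discard(x)
        done.add(x)
        order.append(x)

    for x in equations:
        visit(x)
    return order

# y = f(x); x = g(y)      -> rejected: algebraic loop
# y = f(x); x = g(last y) -> accepted: `last y` creates no instantaneous edge
looping = {"y": {"x"}, "x": {"y"}}
broken  = {"y": {"x"}, "x": set()}  # the dependency on y is gone via last(y)
print(schedule(broken))  # → ['x', 'y']
```

Well-typed programs are exactly those whose graph is acyclic, and the topological order produced here is the static schedule used to generate sequential code.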

Hybrid systems modelers exhibit a number of difficulties related to
the mix of continuous and discrete dynamics and sensitivity to the
discretization scheme. Modular modeling, where subsystems models can
be simply assembled with no rework, calls for using Differential
Algebraic Equations (DAE). In turn, DAE are strictly more difficult
than ODE. In most modeling and simulation tools, before simulation can
occur, sophisticated pre-processing is applied to DAE systems based on
the notion of differentiation index. Graph-based algorithms such as the one originally proposed by Pantelides
are efficient at finding the differentiation index of a DAE system,
structurally (i.e., outside some exceptional values for the system
parameters), solving the consistent initialisation problem and
transforming a DAE system into a statically scheduled system of
ordinary differential equations (ODE) and implicit functions. The
differentiation index for DAE explicitly relies on everything being
differentiable. Therefore, extensions to hybrid systems must be done
with caution — to our knowledge, no such extension exists that is supported by a rigorous mathematical theory. In , we use non-standard analysis for this purpose. Non-standard analysis formalizes differential equations as discrete step transition systems with an infinitesimal time basis. This allows mapping hybrid DAE systems to difference algebraic equations (dAE), for which the notion of difference index can be used. The difference index of a dAE is an easy transposition of the differentiation index of a DAE, where forward shift in time replaces time differentiation.
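The structural flavor of these graph-based methods can be illustrated by their core subproblem: a bipartite matching between equations and the variables they mention. This is a generic sketch (Kuhn's augmenting-path matching), not the Pantelides algorithm itself, and the incidence data is a rough, made-up caricature of a pendulum-like system:

```python
# Structural analysis core: match each equation to a distinct variable it can
# be solved for. A perfect matching means the system is structurally regular;
# a deficiency signals that some equations must be differentiated.

def maximum_matching(incidence):
    """incidence: {equation: set of variables appearing in it}."""
    match = {}  # variable -> equation currently assigned to it

    def augment(eq, seen):
        for v in incidence[eq]:
            if v in seen:
                continue
            seen.add(v)
            if v not in match or augment(match[v], seen):
                match[v] = eq
                return True
        return False

    size = sum(augment(eq, set()) for eq in incidence)
    return size, match

incidence = {
    "e1": {"x''", "T"},   # force balance along x
    "e2": {"y''", "T"},   # force balance along y
    "e3": set(),          # the length constraint mentions no highest-order variable
}
size, _ = maximum_matching(incidence)
print(size == len(incidence))  # → False: structurally singular, differentiate e3
```

The unmatched equation `e3` is precisely the constraint that index-reduction methods differentiate until a perfect matching (and hence a static schedule) exists.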

Cyber-Physical Systems require distributed architectures to support safety-critical real-time control. Hermann Kopetz' Time-Triggered Architecture (TTA) has been proposed as both an architecture and a comprehensive paradigm for the design of such systems. TTA offers the programmer a logical discrete time compliant with synchronous programming, together with timing bounds. A clock synchronization protocol is required, unless the local clocks themselves provide the required accuracy. To relax the strict synchronization requirements imposed by TTA, Loosely Time-Triggered Architectures (LTTA) have been proposed. In LTTA, computation and communication units are all triggered by autonomous, unsynchronized clocks. Communication media act as shared memories between writers and readers, and communication is non-blocking. This comes at the price of communication artifacts (such as duplication or loss of data), which must be compensated for by using some “LTTA protocol”. In , we have pursued our previous work by providing a unified presentation of the two variants of LTTA (token-based and time-based), with simplified analyses. We compared these two variants with regard to performance and robustness, and provided ways to combine them.
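The communication artifacts mentioned above can be reproduced in a toy simulation (our own illustration, not the LTTA protocol itself): a writer and a reader, driven by unsynchronized periodic clocks, communicate through a shared memory with non-blocking reads and writes. All time units are arbitrary integers:

```python
# A fast reader re-reads the same value (duplication); a slow reader misses
# overwritten values (loss). No protocol is applied here: this only shows
# the raw artifacts that an LTTA protocol must compensate for.

def simulate(t_write, t_read, n_writes):
    events = [(k * t_write, "w", k) for k in range(n_writes)]
    horizon = (n_writes - 1) * t_write + t_read
    events += [(k * t_read, "r", None) for k in range(horizon // t_read + 1)]
    shared, reads = None, []
    for _, kind, value in sorted(events):  # at equal dates, reads precede writes
        if kind == "w":
            shared = value        # overwriting may lose a value never read
        elif shared is not None:
            reads.append(shared)  # re-reading duplicates the last value written
    duplicates = len(reads) - len(set(reads))
    losses = n_writes - len(set(reads))
    return duplicates, losses

print(simulate(t_write=10, t_read=7, n_writes=10))  # → (3, 0): duplicates only
print(simulate(t_write=7, t_read=10, n_writes=10))  # → (0, 3): losses only
```

With a reader faster than the writer, every value arrives but some are read twice; with a slower reader, no duplicates occur but some values are overwritten before ever being read.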

Ayman Aljarbouh's PhD is partially funded by an ARED grant of the Brittany Regional Council. His doctoral work takes place in the context of the Modrio and Sys2Soft projects on hybrid systems modeling — see sections and . Ayman Aljarbouh is working on accelerated simulation techniques for hybrid systems. In particular, he is focusing on the regularisation, at runtime, of chattering behaviour and the approximation of Zeno behaviour.

Benoît Caillaud is participating in the S3PM project of the CominLabs excellence laboratory.

Program: « Briques génériques du logiciel embarqué » (Embedded Software Generic Building-Blocks)

Project acronym: Sys2soft

Project title: Physics Aware Software

Duration: June 2012 – April 2016

Coordinator: Dassault Systèmes (France)

Other partners: Thales TGS / TRT / TAS, Alstom Transport, Airbus, DPS, Obeo, Soyatec

Abstract: The Sys2soft project aims at developing methods and tools supporting the design of embedded software interacting with a complex physical environment. The project advocates a methodology where both the physics and the software are co-modeled and co-simulated early in the design process, and embedded code is generated automatically from the joint physics and software models. Extensions of the Modelica language with synchronous programming features are being investigated, as a unified framework in which interacting physical and software artifacts can be modeled.

Program: ITEA2

Project acronym: Modrio

Project title: Model Driven Physical Systems Operation

Duration: September 2012 – November 2015

Coordinator: EDF (France)

Other partners: ABB (Sweden), Ampère Laboratory / CNRS (France), Bielefeld University (Germany), Dassault Systèmes (Sweden), Dassault Aviation (France), DLR (Germany), DPS (France), EADS (France), Equa Simulation (Sweden), IFP (France), ITI (Germany), Ilmenau University (Germany), Katholic University of Leuven (Belgium), Knorr-Bremse (Germany), LMS (France and Belgium), Linköping University (Sweden), MathCore (Sweden), Modelon (Sweden), Pöry (Finland), Qtronic (Germany), SICS (Sweden), Scania (Sweden), Semantum (Finland), Sherpa Engineering (France), Siemens (Germany and Sweden), Simpack (Germany), SKF (Sweden), Supmeca (France), Triphase (Belgium), University of Calabria (Italy), VTT (Finland), Vattenfall (Sweden), Wapice (Finland).

Abstract: Modelling and simulation are efficient and widely used tools for system design, but they are seldom used for systems operation. However, most functionalities developed for system design are also beneficial for system operation, provided that they are enhanced to deal with real operating situations. Through open standards, the benefits of sharing compatible information and data become obvious: improved cooperation between the design and operation communities, and easier adaptation of operation procedures with respect to design evolutions. Open standards also foster general-purpose technology. The objective of the ITEA 2 MODRIO project is to extend modelling and simulation tools based on open standards from system design to system operation.

Extending beyond the context of the Modrio project (see section ), the Hycomes team is collaborating with the team of Dassault Systèmes located in Lund (Sweden), in charge of developing Dymola, one of the major software tools in the Modelica community.

Benoît Caillaud has served on the program committee of the ACSD 2014 conference.

Benoît Caillaud has been reviewing for the ACSD 2014, ACC 2014, CONCUR 2014, DATE 2014 and TACAS 2015 conferences.

Benoît Caillaud has been reviewing for the following journals: ACM Transactions on Embedded Computing Systems, Acta Informatica and Empirical Software Engineering.

Master : Benoît Caillaud has taught an invited course on hybrid systems modeling in the first-year master curriculum in computer science at ENS Rennes. This was a joint course with Marc Pouzet, representing a total of 27 hours EqTD.

PhD in progress : Ayman Aljarbouh, Accelerated simulation of hybrid systems, started December 2013, supervised by Benoît Caillaud

PhD in progress : Guillaume Baudart, Time-based protocols for LTTA, started December 2013, co-supervised by Albert Benveniste and Marc Pouzet (Parkas Inria project-team, ENS, Paris).