The aim of the CARTE research team is to take adversity into account in computations: computations involving actors whose behaviors are unknown or unclear. We call this notion adversary computation.

The project combines two approaches. The first one is the analysis of the behavior of systems, using tools coming from Continuous Computation Theory. The second approach is to build defenses with tools coming from logic, rewriting and, more generally, from Programming Theory.

The activities of the CARTE team are organized around two research actions:

Computation over Continuous Structures

Computer Virology.

From a historical point of view, the first official virus appeared in 1983, on a VAX 11/750. At the same time, a series of papers was published which remain references in computer virology: Thompson, Cohen and Adleman. The literature explaining and discussing practical issues is quite extensive. However, there are only a few theoretical/scientific studies attempting to give a model of computer viruses.

A virus is essentially a self-replicating program inside an adversary environment. Self-replication has a solid theoretical background, based on works on fixed points in computability theory, such as Kleene's recursion theorem. From this perspective:

a virus infects programs by modifying them,

a virus copies itself and can mutate,

it spreads throughout a system.

The above scientific foundation justifies our position of using the word virus as a generic term for self-replicating malware. There is, however, a difference: a malware carries a payload, whereas a virus may not. For example, a worm is an autonomous self-replicating malware and so falls under our definition. In fact, the current malware taxonomy (viruses, worms, trojans, ...) is unclear and subject to debate.
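The benign core of self-replication can be made concrete with a quine, a program that prints its own source code; it is obtained by the same fixed-point construction as Kleene's recursion theorem. A minimal sketch in Python (ours, for illustration only):

```python
# A minimal quine: a program whose output is exactly its own source
# code. Self-replication reduces to this fixed-point trick; a virus
# additionally injects the copy into other programs.
s = 's = {!r}\nprint(s.format(s))'
print(s.format(s))
```

Running the script prints its own two lines verbatim: the `!r` conversion re-inserts the string literal, quotes and escapes included, into itself.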

Classical recursion theory deals with computability over discrete structures (natural numbers, finite symbolic words). There is a growing community of researchers working on the extension of this theory to the continuous structures arising in mathematics. One goal is to give foundations to numerical analysis by studying the limitations of machines, in terms of computability or complexity, when computing with real numbers. Classical questions concern, for instance, which operations on computable real functions preserve computability.

While the notion of a computable function over discrete data is captured by the model of Turing machines, the situation is more delicate when the data are continuous, and several non-equivalent models exist. Let us mention computable analysis, which relates computability to topology; the Blum-Shub-Smale (BSS) model, where real numbers are treated as elementary entities; and the General Purpose Analog Computer (GPAC), introduced by Shannon, which operates in continuous time.
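To make the computable-analysis viewpoint concrete, here is a small sketch (ours, for illustration only) of the standard representation of a real number: a function that, given a precision n, returns a rational within 2^-n of the number, here for the square root of 2 by bisection.

```python
from fractions import Fraction

def sqrt2_approx(n: int) -> Fraction:
    """Return a rational q with 0 <= sqrt(2) - q <= 2**-n, by bisection.

    In computable analysis, such a function *is* the real sqrt(2):
    computability and complexity are measured in the precision n.
    """
    lo, hi = Fraction(1), Fraction(2)
    # Invariant: lo*lo <= 2 < hi*hi, so sqrt(2) lies in [lo, hi].
    while hi - lo > Fraction(1, 2**n):
        mid = (lo + hi) / 2
        if mid * mid <= 2:
            lo = mid
        else:
            hi = mid
    return lo
```

The representation fixes which approximations the machine may query; different but topologically equivalent representations can change which operations are feasible in polynomial time.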

The rewriting paradigm is now widely used for specifying, modelling, programming and proving. It allows one to express deduction systems in a declarative way, and to express complex relations on infinite sets of states in a finite way, provided they are countable. Programming languages and environments with a rewriting-based semantics have been developed; see ASF+SDF, Maude, and Tom.

For basic rewriting, many techniques have been developed to prove properties of rewrite systems such as confluence, completeness, consistency, or various notions of termination. Proof methods have also been proposed for extensions of rewriting: equational extensions, consisting of rewriting modulo a set of axioms; conditional extensions, where rules are applied under certain conditions only; typed extensions, where rules are applied only if there is a type correspondence between the rule and the term to be rewritten; and constrained extensions, where rules are enriched with formulas to be satisfied.
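As a toy illustration of these notions, consider the classical two-rule system for addition over Peano numerals, add(0, y) → y and add(s(x), y) → s(add(x, y)), which is both terminating and confluent. A minimal Python sketch of innermost rewriting for this system (ours, for illustration; not one of the cited tools):

```python
# Terms are "0" or nested tuples ("s", t) and ("add", t1, t2).
def normalize(term):
    """Innermost normalization with the two addition rules."""
    if isinstance(term, tuple) and term[0] == "add":
        x, y = normalize(term[1]), normalize(term[2])
        if x == "0":                                       # add(0, y) -> y
            return y
        if isinstance(x, tuple) and x[0] == "s":           # add(s(x), y) -> s(add(x, y))
            return ("s", normalize(("add", x[1], y)))
        return ("add", x, y)
    if isinstance(term, tuple) and term[0] == "s":
        return ("s", normalize(term[1]))
    return term

two = ("s", ("s", "0"))
four = normalize(("add", two, two))  # normal form: s(s(s(s(0))))
```

Because the system is confluent and terminating, every strategy reaches the same normal form; the proof techniques above establish such properties automatically for large classes of systems.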

An interesting aspect of the rewriting paradigm is that it allows automatable or semi-automatable correctness proofs for systems or programs: properties of rewrite systems such as those cited above translate to the deduction systems or programs they formalize, and the proof techniques may apply to them directly.

Another interesting aspect is that it allows characteristics or properties of the modeled systems to be expressed as equational theorems, often automatically provable using the rewriting mechanism itself or induction techniques based on completion. Note that the rewriting and completion mechanisms also enable transformation and simplification of formal systems or programs.

Applications of rewriting-based proofs to computer security are various. Approaches using rule-based specifications have recently been proposed for the detection of computer viruses. For several years, our team has also been working in this direction. We have already proposed an approach using rewriting techniques to abstract program behaviors in order to detect suspicious or malicious programs.

It is legitimate to wonder why there are so few fundamental studies on computer viruses, even though they exploit one of the important flaws in software engineering. The lack of theoretical studies perhaps explains the weakness in anticipating computer infections and the difficulty of improving defenses. For these reasons, we think it is worth exploring fundamental aspects, and in particular self-reproducing behaviors.

The crucial question is how to detect viruses and other self-replicating malware. Cohen demonstrated that this question is undecidable in general. Anti-virus heuristics are based on two methods. The first consists in searching for virus signatures. A signature is a regular expression which identifies a family of viruses. This method has obvious defects: for example, an unknown virus, such as one exploiting a 0-day vulnerability, will not be detected. We strongly suggest having a look at independent audits of anti-virus software in order to understand the limits of this method. The second method consists in analyzing the behavior of a program by monitoring it. This kind of method is not yet really deployed in practice; moreover, the large number of false positives makes it barely usable. To end this short survey: intrusion detection encompasses virus detection. However, unlike computer virology, which has a solid scientific foundation as we have seen, the IDS notion of “malware” with respect to some security policy is not well defined.
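As a sketch of the first method, signature matching amounts to running a set of regular expressions over file contents. The signature database below is entirely made up for illustration (family names and patterns are hypothetical; real engines use far larger, curated databases):

```python
import re

# Hypothetical signature database: family name -> byte-level regular
# expression over file contents. Purely illustrative.
SIGNATURES = {
    "demo-family-a": re.compile(rb"\x90{8,}"),  # e.g., a long NOP sled
    "demo-family-b": re.compile(rb"EICAR-STANDARD-ANTIVIRUS-TEST-FILE"),
}

def scan(data: bytes) -> list:
    """Return the names of the signature families matching the data."""
    return [name for name, sig in SIGNATURES.items() if sig.search(data)]
```

The structural weakness is visible in the code: any byte string that matches no known pattern, such as a mutated or brand-new virus, passes the scan.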

The aim is to define security policies in order to prevent malware propagation. For this, we need (i) to define what a computer virus is in different programming languages and settings, and (ii) to take resources such as time and space into consideration. We think that formal methods such as rewriting, type theory, logic, or formal languages should help to define the notion of a formal immune system, providing certified protection.

This study of computer virology has led us to propose and construct a “high security lab” in which experiments can be done in compliance with French law.

Understanding computation theories for continuous systems leads to studying the hardness of verification and control of these systems. This has been used to discuss problems in fields as diverse as verification, control theory, and neural networks. We are interested in the formal decidability of properties of dynamical systems, such as reachability and the Skolem-Pisot problem.

Contrary to computability theory, complexity theory over continuous spaces is underdeveloped and not well understood. A central issue is the choice of the representation of objects by discrete data and its effect on the induced complexity notions. As for computability, it is well known that a representation is gauged by the topology it induces. However, more structure is needed to capture complexity notions: topologically equivalent representations may induce different classes of polynomial-time computable objects. Developing a sound complexity theory over continuous structures would enable us to make abstract computability results more applicable by analyzing the corresponding complexity issues. We think that the preliminary step towards such a theory is the development of higher-order complexity, which we are currently carrying out.

In contrast with the discrete setting, it is of utmost importance to compare the various models of computation over the reals, as well as their associated complexity theories. In particular, we focus on the General Purpose Analog Computer of Claude Shannon, on recursive analysis, on the algebraic approach, and on Markov computability. A crucial point for future investigations is to fill the gap between continuous and discrete computational models. This is one deep motivation of our work on computation theories for continuous systems.

The other research direction on dynamical systems we are interested in is the study of properties of adversary systems or programs, i.e., systems whose behavior is unknown or unclear, or which do not have the classical expected properties. We would like to offer proof and verification tools to guarantee the correctness of such systems. On the one hand, we are interested in continuous and hybrid systems. In a mathematical sense, a hybrid system can be seen as a dynamical system whose transition function does not satisfy the classical regularity hypotheses, such as continuity or continuity of its derivative. The properties to be verified are often expressed as reachability properties. For example, a safety property is often equivalent to the (non-)reachability of a subset of unsafe states from an initial configuration, or to stability (with its numerous variants such as asymptotic stability, local stability, mortality, etc.). Thus we will essentially focus on the verification of these properties in various classes of dynamical systems.

We are also interested in rewriting techniques used to describe dynamic systems, in particular in the adversary context. As they were initially developed in the context of automated deduction, rewriting proof techniques, although now numerous, are not yet adapted to the complex framework of modelling and programming. An important stake in the domain is then to enrich them to provide realistic validation tools, both by providing finer rewriting formalisms with their associated proof techniques, and by developing new validation concepts for the adversary case, i.e., when usual properties of systems, such as termination, are not satisfied.

For several years, we have been developing specific procedures for proving properties of rewriting for the sake of programming, in particular an inductive technique already applied with success to termination under strategies, weak termination, sufficient completeness, and probabilistic termination. The last three results take place in the context of adversary computations, since they allow proving that even a divergent program, in the sense that it does not terminate, can give the expected results. A common mechanism has been extracted from the above works, providing a generic inductive proof framework for properties of reduction relations, which can be parametrized by the property to be proved. Provided program code can be translated into rule-based specifications, this approach can be applied to the correctness proof of software in a larger context.

A crucial element of the safety and security of software systems is the problem of resources. We are working in the field of Implicit Computational Complexity, an approach to the analysis of the resources used by a program whose tools come essentially from proof theory. Interpretation-based methods, such as quasi-interpretations (QI) or sup-interpretations, are the approach we have been developing these last years. The aim is to compile a program while certifying its complexity.
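To give the flavor of interpretation-based methods, here is a minimal sketch (ours, for illustration only) of checking a quasi-interpretation for the addition rules add(0, y) → y and add(s(x), y) → s(add(x, y)), with the assignment [0] = 0, [s](X) = X + 1, [add](X, Y) = X + Y. The QI condition requires the interpretation of each left-hand side to bound that of the corresponding right-hand side, which in turn bounds the size of computed values.

```python
# Interpretations of the function symbols of the addition system.
def q_zero():
    return 0

def q_s(x):
    return x + 1

def q_add(x, y):
    return x + y

# Check the QI condition [l] >= [r] on sample values (a real checker
# would verify the inequality symbolically, for all values).
for x in range(50):
    for y in range(50):
        assert q_add(q_zero(), y) >= y               # add(0, y) -> y
        assert q_add(q_s(x), y) >= q_s(q_add(x, y))  # add(s(x), y) -> s(add(x, y))
```

Here both sides of the second rule interpret to x + y + 1, so the inequality holds with equality; the assignment certifies that the value computed by add is polynomially bounded in the size of its arguments.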

The Marie Curie RISE project *Computing with Infinite Data*, coordinated by Dieter Spreen (Siegen University) and in which Mathieu Hoyrup is participating, has been accepted. It will start in April 2017.

**ZX-calculus**

The ZX-calculus is a powerful diagrammatic language for quantum mechanics and quantum information processing. The completeness of the ZX-calculus is crucial: the language is complete if any equation involving two diagrams representing the same quantum evolution can be derived using the rules of the language. While the language is known to be incomplete in general, with no obvious way to add new rules, two interesting fragments have been studied.

**Causal Graph Dynamics**

Causal Graph Dynamics extend Cellular Automata to arbitrary, bounded-degree, time-varying graphs. The whole graph evolves in discrete time steps, and this global evolution is required to have a number of physics-like symmetries: shift-invariance (it acts everywhere in the same way) and causality (information has a bounded speed of propagation). We added a further physics-like symmetry, namely reversibility. This result was presented at RC 2016.

We have written a full journal paper, accepted in Information and Computation (special issue of DICE 2015), on the complexity analysis of object-oriented programming languages based on tiered types. The corresponding type system provides a sound and complete characterization of the set of polynomial-time computable functions. As a consequence, the heap-space and stack-space requirements of typed programs are also polynomially bounded. This type system is inspired by previous works on Implicit Computational Complexity using tiering and non-interference techniques. The presented methodology has several advantages. First, it provides explicit big-O polynomial upper bounds to the programmer, so its use could help the programmer avoid memory errors. Second, type checking is decidable in polynomial time. Last, it has good expressivity, since it analyzes most object-oriented features such as inheritance, overloading, overriding and recursion. Moreover, it can deal with loops guarded by objects and can be extended to statements that alter the control flow, such as break or return.

**Decidable properties of subrecursive functions**

We have studied the following problem: given a subrecursive class (such as the primitive recursive functions or the polynomial-time computable functions) and a sound and complete programming language for that class, which properties of functions are decidable (by a Turing machine), given a program for the function in the restricted language? We give a complete characterization of these properties, showing that they can be expressed as unions of elementary properties of being compressible.

We also prove that such a characterization does not hold for the whole class of total recursive functions, and leave the problem open for that class.

These results appear in an article presented at ICALP 2016.

**Baire category and computability theory**

Baire category is a very powerful tool in mathematical analysis to prove the existence of objects with prescribed properties without explicitly building them, by showing instead that the class of objects with these properties is large in some sense. In computability theory, one often builds objects with very specific properties, notably to separate classes, and the proofs are often very involved. We show how Baire category can be adapted to computability theory in order to prove existence results without the need for an explicit construction. We review notions that we introduced in recent years and provide new results in an invited paper at CiE 2016.

The density classification problem is a simple computational problem where a distributed system composed of many cells needs to find the majority state in its initial configuration. It is known that no deterministic cellular automaton can solve this problem without making errors. On the other hand, it was shown that a probabilistic mixture of the traffic rule and the majority rule solves the one-dimensional problem correctly with a probability arbitrarily close to one. We investigated the possibility of a similar approach in two dimensions and introduced a companion problem, the particle spacing problem, as an intermediary step. We showed that although this second problem does not have a cellular automaton solution, the use of randomized frameworks, via interacting particle systems, could allow us to obtain interesting solutions, which were analysed both theoretically and with numerical simulations.
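The one-dimensional probabilistic mixture can be sketched in a few lines of Python (our sketch, for illustration): at each step, each cell independently applies the traffic rule (elementary cellular automaton 184) with probability p, and the majority rule (ECA 232) otherwise.

```python
import random

def traffic(l, c, r):
    """ECA 184: particles (1s) move right into empty cells."""
    return 1 if (c == 1 and r == 1) or (l == 1 and c == 0) else 0

def majority(l, c, r):
    """ECA 232: each cell takes the local majority state."""
    return 1 if l + c + r >= 2 else 0

def step(config, p, rng):
    """One synchronous step of the probabilistic mixture on a ring."""
    n = len(config)
    return [
        (traffic if rng.random() < p else majority)(
            config[(i - 1) % n], config[i], config[(i + 1) % n])
        for i in range(n)
    ]
```

Both rules fix the two homogeneous configurations, so a consensus, once reached, is stable; the cited result states that the mixture reaches the consensus matching the initial majority with probability arbitrarily close to one.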

In the same direction of research, we studied how to coordinate a team of agents to locate a hidden source on a two-dimensional discrete grid. The challenge here is to find the position of the source with only sporadic detections. This problem arises in various situations, for instance when insects emit pheromones to attract their partners. A search mechanism named infotaxis was previously proposed to explain how agents may progressively approach the source by using only intermittent detections.

We studied the problem of performing a collective infotaxis search with agents that are almost memoryless. We presented a bio-inspired model which mixes stochastic cellular automata and reactive multi-agent systems. The model, inspired by the behaviour of the social amoeba *Dictyostelium discoideum*, relies on the use of reaction-diffusion waves to guide the agents to the source. The random emission of waves allows the formation of a group of amoebae, which successively act as emitters of waves or listeners, according to their local perceptions. Our work showed that the model is worth considering and may provide a simple solution for coordinating a team to perform a distributed form of infotaxis.

We participate in a PEPS project “Jeux quantiques sans probabilités”. The partners are Mehdi Mhalla (CR CNRS, LIG, coordinator), Pablo Arrighi (Prof. Aix-Marseille), Paul Dorbec (MdC, U. Bordeaux), Frédéric Magniez (DR CNRS, IRIF), Simon Perdrix (CR CNRS, CARTE).

The team is a funding partner in the ANR project Elica (2014-2019), "Élargir les idées logiques pour l'analyse de complexité" (Expanding logical ideas for complexity analysis). The CARTE team is well known for its expertise in implicit computational complexity.

Mathieu Hoyrup participates in the Marie-Curie RISE project *Computing with Infinite Data* coordinated by Dieter Spreen (Univ. Siegen) that has been accepted and will start in April 2017.

A Hubert Curien Partnership (PHC Imhotep), funded by the French Ministry of Foreign Affairs with the support of the French Ministry of National Education and the Ministry of Higher Education and Research, holds between members of the CARTE team and Alexandria E-Just University.

Foundations of Quantum Computation: Syntax and Semantics (FoQCoSS), Regional Program STIC-AmSud. This 2-year project has been accepted in late 2015. The Argentinian-Brazilian-French consortium consists of: Pablo ARRIGHI (Université Aix-Marseille, France), Alejandro DIAZ-CARO (Universidad Nacional de Quilmes, Argentina), Gilles DOWEK (Inria, France), Juliana KAIZER VIZZOTTO (Universidade Federal de Santa Maria, Brazil), Simon PERDRIX (CNRS/CARTE, France) and Benoît VALIRON (CentraleSupélec – LRI, France). The ultimate goal of this project is to study the foundations of quantum programming languages and related formalisms. With this goal in mind, we will need to study topics such as parallelism, probabilistic systems, isomorphisms, etc., which constitute subjects of study by themselves. The interest goes beyond having a working programming language for quantum computing; we are interested, on one hand, in its individual characteristics and its consequences for classical systems, and, on the other hand, in its implications for the foundations of quantum physics.

Walid Gomaa, associate professor at Alexandria E-Just University, visited the team for two months (March and May) within the PHC Imhotep.

Arinta Auza (ENS Cachan / Indonesia)

Nazim Fatès was invited for a short stay at the Technische Universität Dresden, in the Centre for Information Services and High Performance Computing (ZIH), in the team of Andreas Deutsch, head of the Department for Innovative Methods of Computing. He gave a talk at the monthly ZIH colloquium.

Simon Perdrix spent one month as an invited researcher at the Simons Institute for the Theory of Computing, University of California, Berkeley, during the semester on Logic and Computation (mid-November to mid-December 2016).

Nazim Fatès was a co-organiser of ACA'16 (Fourth International Workshop on Asynchronous Cellular Automata and Asynchronous Discrete Models), a workshop which was held during the ACRI 2016 conference, Fez (Morocco), September 8, 2016.

Nazim Fatès was member of the Program Committees of ANTS'16 (10th International Conference on Swarm Intelligence), AUTOMATA'16 (22nd International Workshop on Cellular Automata and Discrete Complex Systems) and ACRI'16 (12th International conference on Cellular Automata for Research and Industry).

Emmanuel Hainry was member of the Program Committee of Developments in Implicit Computational Complexity (DICE) 2016.

Mathieu Hoyrup was member of the Program Committee of Computability and Complexity in Analysis (CCA) 2016.

Romain Péchoux was member of the Program Committee of Resource Aware Computation (RAC) 2016.

Simon Perdrix was member of the Program Committees of QPL’16 Quantum Physics and Logic, and IQFA’16 7th IQFA’s Colloquium.

Emmanuel Hainry reviewed articles for DICE and ICALP.

Mathieu Hoyrup reviewed articles for ICALP and STACS.

Emmanuel Jeandel reviewed articles for STACS and MFCS.

Romain Péchoux reviewed articles for FOSSACS, ISMVL, RAC, LFA and STACS.

Nazim Fatès is a member of the editorial board of the *Journal of cellular automata*.

Emmanuel Jeandel is a member of the editorial board of RAIRO-ITA.

Nazim Fatès reviewed articles for *Natural Computing*, the *Journal of statistical physics*, *Advances in Complex Systems*, and the *Journal of cellular automata*.

Emmanuel Hainry reviewed an article for Applicable Analysis and Discrete Mathematics.

Mathieu Hoyrup reviewed articles for Bulletin of Symbolic Logic, Memoirs of the American Mathematical Society, Transactions of the American Mathematical Society.

Emmanuel Jeandel reviewed articles for Advances in Mathematics, Journal of Discrete Algorithms and Ergodic Theory and Dynamical Systems.

Romain Péchoux reviewed articles for AMS Mathematical Reviews and Information & Computation.

Simon Perdrix reviewed articles for Quantum Information and Computation.

Mathieu Hoyrup was invited to give a talk in the special session “Constructive and computable analysis” of the conference Computability in Europe (CiE) in Paris, June 2016.

Emmanuel Jeandel gave a talk for the national days of GDR-IM.

Nazim Fatès is the vice-chair of the IFIP working group 1.05 on Cellular Automata and Discrete Complex Systems.

Simon Perdrix co-organised the quantum software workshop during the one-day event on quantum technologies at the French Ministry of Research (July 5, 2016).

Emmanuel Jeandel reviewed projects for the Agence Nationale de la Recherche.

Isabelle Gnaedig is:

vice-leader of the team CARTE,

member of the scientific mediation committee at Inria Nancy Grand-Est.

Emmanuel Hainry is:

member of the CNU (Conseil National des Universités), Section 27.

organizer of the CARTE Seminar.

examiner for the admission exam of ENS and École Polytechnique.

Mathieu Hoyrup is:

principal investigator of a PHC Imhotep with Walid Gomaa (Alexandria E-Just University).

organizer of the Formal Methods Seminar at Loria.

Simon Perdrix:

is responsible for GT IQ (groupe de travail Informatique quantique) at the CNRS GdR IM (groupe de recherche Informatique Mathématique).

has been elected member and scientific secretary at CoNRS (Comité National de la Recherche Scientifique) section 6.

Licence:

Isabelle Gnaedig

To the limits of the computable, 6 hours, opening course-conference of the collegium "Lorraine INP", Nancy, France

Emmanuel Hainry

Systèmes d'exploitation, 30h, L1, IUT Nancy Brabois, Université de Lorraine, France

Algorithmique, 40h, L1, IUT Nancy Brabois, Université de Lorraine, France

Web dynamique, 60h, L1, IUT Nancy Brabois, Université de Lorraine, France

Bases de données, 30h, L1, IUT Nancy Brabois, Université de Lorraine, France

Programmation objet, 12h, L2, IUT Nancy Brabois, Université de Lorraine, France

Complexité, 30h, L2, IUT Nancy Brabois, Université de Lorraine, France

Mathieu Hoyrup

Bases de la Programmation Orientée Objet, 20 HETD, L2, Université de Lorraine, France

Interfaces Graphiques, 10 HETD, L2, Université de Lorraine, France

Emmanuel Jeandel

Algorithmics and Programming 1, 60h, L1 Maths-Info

Algorithmics and Programming 4, 30h, L3 Informatique

Modelling Using Graph Theory, 30h, L3 Informatique

Networking, 15h, L3 Informatique

Data Compression, 45h, L2 Informatique

Romain Péchoux

Programmation orientée objet, 61,5h, L3 MIASHS

Programmation orientée objet, 53,5h, L2 MIASHS

Outils logiques pour l'informatique, 35h, L1 MIASHS

Bases de données, 40h, L3 Sciences de la Gestion

Algorithmic complexity, 30h, L3 MIAGE, IGA Casablanca, Morocco.

Master:

Nazim Fatès

Systèmes complexes adaptatifs, 15h ETD, M2, UL, France.

Agents intelligents et collectifs, 22h ETD, M1, UL, France.

Isabelle Gnaedig

Design of Safe Software, Coordination of the module, M2, Telecom-Nancy (Université de Lorraine), Nancy, France,

Rule-based Programming, 20 hours, M2, Telecom-Nancy (Université de Lorraine), Nancy, France.

Emmanuel Hainry

Complexity and Complex Systems, 12h, M2, FST, Université de Lorraine, France

Emmanuel Jeandel

Algorithmics and Complexity, 30h, M1 Informatique

Combinatorial Optimization, 36h, M1 Informatique

Romain Péchoux

Mathematics for computer science, 30h, M1 SCA

Advanced Java, 52,5h, M1 MIAGE

Implicit Complexity, 15h, M2 Informatique

Simon Perdrix

Pépites Algorithmiques, 6h, M1/M2 at Ecole des Mines de Nancy.

Emmanuel Jeandel and Simon Perdrix supervised the Master Thesis of Renaud Vilmart on the ZX-calculus, and the Master Thesis of Arinta Auza-Primandini on quantum circuits with memory.

Emmanuel Jeandel and Simon Perdrix are advisors of Renaud Vilmart, PhD student (UL) since October 2016.

Romain Péchoux is coadvisor of Pierre Mercuriali, PhD student, Université de Lorraine (50%, advisor: Miguel Couceiro, PR, Université de Lorraine).

Mathieu Hoyrup participated in the jury of the PhD of Ludovic Patey, Université Paris Diderot, February 26.

Emmanuel Jeandel reviewed the PhD thesis of Rodrigo Torres (Universidad de Concepción, Chile) in January, and participated in the PhD defense of Benoît Chappet de Vangel, Université de Lorraine, November 14th.

Nazim Fatès contributed to the collective book *Lettres à Turing* (ed. Thierry Marchaisse, May 2016), which addresses the legacy of Turing in our Modern Times. He was invited to discuss this book and the question of artificial intelligence in three national radio programs:

France Culture, La marche des sciences, “Cher Alan Turing”, 1 hour, with Aurélie Luneau, 23 June 2016.

RFI, Autour de la question, “Que devons-nous à Alan Turing?”, 1 hour, with Sophie Joubert, 24 June 2016.

RFI, Autour de la question, “Jusqu’où ira l’intelligence artificielle?”, 1 hour, with Sophie Joubert, 7 October 2016.

Nazim Fatès participated in an open discussion (table ronde) on the theme of artificial intelligence (“Intelligence artificielle : quel monde prépare-t-elle ?”), at the invitation of the Cercle universitaire of Enghien-les-bains, on 27 September in Enghien-les-bains. He was interviewed by Eric Chaverou, journalist at France Culture, for his radio program of May 20, 2016, on the theme “L'intelligence artificielle made in France”; the interview is available on the website of the radio program or directly via Soundcloud. He participated in a public debate on the theme “Jusqu'où ira l'intelligence artificielle ?” at the Café des sciences et techniques organised by the CNAM in Épinal, 21 January 2016.