The aim of the CARTE research team is to take adversity into account in computation. Adversity is induced by actors whose behavior is unknown or unclear; we call this notion adversary computation.
The project combines two approaches. The first is the analysis of the behavior of systems, using tools coming from Continuous Computation Theory. The second is the construction of defenses, with tools coming from logic, rewriting and, more generally, from Programming Theory.
The activities of the CARTE team are organized around two research actions:
Computation over Continuous Structures
Computer Virology.
From a historical point of view, the first official virus appeared in 1983, on a VAX 11/750. At the same time, a series of papers was published which still remains a reference in computer virology: Thompson , Cohen and Adleman . The literature explaining and discussing practical issues is quite extensive , . However, only a few theoretical/scientific studies attempt to give a model of computer viruses.
A virus is essentially a self-replicating program inside an adversary environment. Self-replication has a solid theoretical background, based on work on fixed points . From this perspective:
a virus infects programs by modifying them,
a virus copies itself and can mutate,
it spreads throughout a system.
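The fixed-point view of self-replication can be illustrated by a quine, a program whose output is exactly its own source code; this is the computational content of Kleene's recursion theorem. The following Python sketch builds a classical two-line quine and checks that running it reproduces itself (a purely illustrative textbook example, of course, not malware):

```python
import io
import contextlib

# A classic quine: `source` is a two-line Python program whose output
# is exactly its own text -- a fixed point of the "run and print" map.
s = 's = %r\nprint(s %% s)'
source = s % s  # the program's full source code

# Running the program reproduces `source` (print appends one newline).
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    exec(source)
assert buf.getvalue() == source + '\n'
```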
The above scientific foundation justifies our choice of the word virus as a generic term for self-replicating malware. There is one difference, though: malware carries a payload, whereas a virus may not. For example, a worm is an autonomous self-replicating malware, and so falls within our definition. In fact, the current malware taxonomy (viruses, worms, Trojans, ...) is unclear and subject to debate.
Classical recursion theory deals with computability over discrete structures (natural numbers, finite symbolic words). There is a growing community of researchers working on the extension of this theory to the continuous structures arising in mathematics. One goal is to provide foundations for numerical analysis, by studying the limitations of machines, in terms of computability or complexity, when computing with real numbers. Classical questions are: if a function
While the notion of a computable function over discrete data is captured by the model of Turing machines, the situation is more delicate when the data are continuous, and several non-equivalent models exist. Let us mention computable analysis, which relates computability to topology , ; the Blum-Shub-Smale model (BSS), where real numbers are treated as elementary entities ; and the General Purpose Analog Computer (GPAC), a continuous-time model introduced by Shannon .
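To give a flavor of the computable-analysis viewpoint, a real number can be represented by a program that, on input n, outputs a rational at distance at most 2^-n from it. The sketch below (an illustration of the general idea, not of any specific cited model) produces such approximations of √2 by dyadic bisection:

```python
from fractions import Fraction

def sqrt2(n):
    """Return a rational q with |q - sqrt(2)| <= 2**-n.

    This is the computable-analysis view of a real number: an oracle
    producing arbitrarily precise rational approximations.
    """
    lo, hi = Fraction(1), Fraction(2)          # invariant: lo**2 <= 2 < hi**2
    while hi - lo > Fraction(1, 2 ** n):
        mid = (lo + hi) / 2
        if mid * mid <= 2:
            lo = mid
        else:
            hi = mid
    return lo
```

The bisection preserves the invariant lo² ≤ 2 < hi², so the returned value is within 2^-n of √2.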
The rewriting paradigm is now widely used for specifying, modeling, programming and proving. It allows deduction systems to be expressed easily in a declarative way, and complex relations on infinite sets of states to be expressed in a finite way, provided they are countable. Programming languages and environments with a rewriting-based semantics have been developed; see ASF+SDF , Maude , and Tom .
For basic rewriting, many techniques have been developed to prove properties of rewrite systems such as confluence, completeness, consistency, and various notions of termination. Proof methods have also been proposed for extensions of rewriting: equational extensions, which rewrite modulo a set of axioms; conditional extensions, where rules are applied only under certain conditions; typed extensions, where rules are applied only if the types of the rule and of the term to be rewritten correspond; and constrained extensions, where rules are enriched with formulas to be satisfied , , .
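As a toy illustration of the rewriting paradigm, the two Peano rules add(0, y) → y and add(s(x), y) → s(add(x, y)) can be run to their normal form in a few lines of Python; this is a pedagogical sketch only, unrelated to the cited systems ASF+SDF, Maude and Tom:

```python
# Terms are tuples: ('0',), ('s', t), ('add', t1, t2).
def step(t):
    """Apply one rewrite step (leftmost-innermost); return None if t is normal."""
    if t[0] == 'add':
        x, y = t[1], t[2]
        sx = step(x)
        if sx is not None:
            return ('add', sx, y)
        sy = step(y)
        if sy is not None:
            return ('add', x, sy)
        if x[0] == '0':
            return y                          # add(0, y)    -> y
        if x[0] == 's':
            return ('s', ('add', x[1], y))    # add(s(x), y) -> s(add(x, y))
    elif t[0] == 's':
        sub = step(t[1])
        if sub is not None:
            return ('s', sub)
    return None                               # normal form reached

def normalize(t):
    """Rewrite t until no rule applies (this system is terminating and confluent)."""
    while True:
        nxt = step(t)
        if nxt is None:
            return t
        t = nxt
```

For example, normalizing add(s(s(0)), s(0)) yields s(s(s(0))), i.e., 2 + 1 = 3.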
An interesting aspect of the rewriting paradigm is that it allows automatable or semi-automatable correctness proofs for systems or programs: properties of rewrite systems such as those cited above translate to the deduction systems or programs they formalize, and the proof techniques may apply to them directly.
Another interesting aspect is that it allows characteristics or properties of the modeled systems to be expressed as equational theorems, often automatically provable using the rewriting mechanism itself or induction techniques based on completion . Note that the rewriting and the completion mechanisms also enable transformation and simplification of formal systems or programs.
Applications of rewriting-based proofs to computer security are numerous. Approaches using rule-based specifications have recently been proposed for the detection of computer viruses , . We have also been working in this direction for several years, and have already proposed an approach that uses rewriting techniques to abstract program behaviors in order to detect suspicious or malicious programs , .
It is legitimate to wonder why there are so few fundamental studies of computer viruses, although they constitute one of the major threats in software engineering. This lack of theoretical studies may explain the weakness of our anticipation of computer diseases and the difficulty of improving defenses. For these reasons, we think it is worth exploring fundamental aspects, and in particular self-reproducing behaviors.
The crucial question is how to detect viruses and other self-replicating malware. Cohen demonstrated that this question is undecidable in general. Anti-virus heuristics are based on two methods. The first consists in searching for virus signatures. A signature is a regular expression identifying a family of viruses. This method has obvious shortcomings: for example, an unknown virus, such as one tied to a 0-day exploit, will not be detected. We strongly suggest having a look at the independent audit to understand the limits of this method. The second method consists in analyzing the behavior of a program by monitoring it. According to , this kind of method is not yet really deployed; moreover, the large number of false positives makes it barely usable. To end this short survey: intrusion detection encompasses virus detection, but unlike computer virology, which has a solid scientific foundation as we have seen, the IDS notion of “malware” with respect to a security policy is not well defined. The interested reader may consult .
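The signature-matching method can be sketched as follows; the byte patterns and family names below are invented for illustration, and real signature databases are of course far more elaborate:

```python
import re

# Hypothetical signature database: each family is identified by a
# byte-level regular expression (invented patterns, for illustration only).
SIGNATURES = {
    "DemoVirus.A": re.compile(rb"\x90{4}\xeb.\x5e"),   # NOP sled + jmp/pop-like bytes
    "DemoVirus.B": re.compile(rb"EVIL_MARKER_[0-9]+"),
}

def scan(data: bytes):
    """Return the names of all signature families matching `data`."""
    return [name for name, sig in SIGNATURES.items() if sig.search(data)]
```

Note that a one-byte mutation of the marker already defeats the second signature, which is exactly the weakness discussed above: unknown or mutated viruses escape signature matching.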
The aim is to define security policies in order to prevent malware propagation. For this, we need (i) to define what a computer virus is in different programming languages and settings, and (ii) to take resources such as time and space into consideration. We think that formal methods such as rewriting, type theory, logic, and formal languages should help to define the notion of a formal immune system providing a certified protection.
This study of computer virology has led us to propose and build a “high-security lab” in which experiments can be carried out in compliance with French law.
Understanding computation theories for continuous systems leads to studying the hardness of verification and control of these systems. This has been used to discuss problems in fields as diverse as verification (see e.g., ), control theory (see e.g., ), neural networks (see e.g., ), and so on.
We are interested in the formal decidability of properties of dynamical systems, such as reachability , the Skolem-Pisot problem , the computability of the
In contrast with computability theory, complexity theory over continuous spaces is underdeveloped and not well understood. A central issue is the choice of the representation of objects by discrete data, and its effect on the induced complexity notions. As for computability, it is well known that a representation is gauged by the topology it induces. However, more structure is needed to capture complexity notions: topologically equivalent representations may induce different classes of polynomial-time computable objects. Developing a sound complexity theory over continuous structures would enable us to make abstract computability results more applicable, by analyzing the corresponding complexity issues. We think that the preliminary step towards such a theory is the development of higher-order complexity, which we are currently carrying out.
In contrast with the discrete setting, it is of utmost importance to compare the various models of computation over the reals, as well as their associated complexity theories. In particular, we focus on the General Purpose Analog Computer of Claude Shannon , on recursive analysis , on the algebraic approach and on Markov computability . A crucial point for future investigations is to fill the gap between continuous and discrete computational models. This is a deep motivation for our work on computation theories for continuous systems.
The other research direction on dynamical systems we are interested in is the study of properties of adversary systems or programs, i.e., systems whose behavior is unknown or indistinct, or which do not have the classical expected properties. We would like to offer proof and verification tools to guarantee the correctness of such systems. In particular, we are interested in continuous and hybrid systems. In a mathematical sense, a hybrid system can be seen as a dynamical system whose transition function does not satisfy the classical regularity hypotheses, like continuity, or continuity of its derivative. The properties to be verified are often expressed as reachability properties. For example, a safety property is often equivalent to the (non-)reachability of a subset of unsafe states from an initial configuration, or to stability (with its numerous variants, like asymptotic stability, local stability, mortality, etc.). We will thus essentially focus on the verification of these properties in various classes of dynamical systems.
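For finite (or finitely truncated) transition systems, such reachability questions can be settled by exhaustive exploration; the sketch below is a plain breadth-first search with an artificial cut-off bound, since reachability is undecidable in general. The state space, step function and safety predicate are hypothetical illustrations:

```python
from collections import deque

def reachable(initial, step, bad, bound=10_000):
    """Explore the transition system from `initial` and report whether a `bad`
    state is reachable (safety = non-reachability).  `step` maps a state to its
    successors; the search gives up after exploring `bound` states."""
    seen = {initial}
    queue = deque([initial])
    while queue and len(seen) <= bound:
        s = queue.popleft()
        if bad(s):
            return True
        for t in step(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return False
```

For instance, with states modulo 7 and the step s ↦ s + 3 (mod 7), state 5 is reachable from 0, while a state outside the orbit is not.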
We are also interested in rewriting techniques used to describe dynamical systems, in particular in the adversary context. As they were initially developed in the context of automated deduction, the rewriting proof techniques, although now numerous, are not yet adapted to the complex framework of modeling and programming. An important challenge in the domain is thus to enrich them so as to provide realistic validation tools, both by providing finer rewriting formalisms with their associated proof techniques, and by developing new validation concepts in the adversary case, i.e., when usual properties of the systems, like termination, are not verified.

For several years, we have been developing specific procedures for proving properties of rewriting, with programming in mind, in particular an inductive technique already applied with success to termination under strategies , , , to weak termination , to sufficient completeness and to probabilistic termination . The last three results take place in the context of adversary computations, since they allow proving that even a divergent program, in the sense that it does not terminate, can give the expected results. A common mechanism has been extracted from the above works, providing a generic inductive proof framework for properties of reduction relations, which can be parametrized by the property to be proved , . Provided program code can be translated into rule-based specifications, this approach can be applied to correctness proofs of software in a larger context.

A crucial element of the safety and security of software systems is the problem of resources. We are working in the field of Implicit Computational Complexity, an approach to the analysis of the resources used by a program, whose tools come essentially from proof theory. Interpretation-based methods, like quasi-interpretations (QI) or sup-interpretations, are the approach we have been developing over the last years , , .
The aim is to compile a program while certifying its complexity.
The paper published at the International Conference on Functional Programming (ICFP 2015) gives a positive answer to a long-standing open problem: whether inductive and coinductive data types can be added to light-logic-based systems without breaking the complexity of the system (i.e., while staying within the class of polynomial-time computable functions). This issue is analogous to that of adding inductive and coinductive data types to System F without breaking normalization, which has long been known to hold.
To tackle this challenging question, we have studied the problem of defining algebras and coalgebras in the Light Affine Lambda Calculus, a system characterizing the complexity class FPTIME. In this system, the principle of stratification limits the ways parametric polymorphism can be used and, in general, the way programs can be written. We have shown that while stratification poses some issues for the standard System F encodings, it still permits encoding some weak forms of algebras and coalgebras. Using the algebra encoding, the traditional inductive types can be defined in the Light Affine Lambda Calculus. Unfortunately, the corresponding coalgebra encoding permits only a very limited form of coinductive data types. To extend this class, we have studied an extension of the Light Affine Lambda Calculus with distributive laws for the modality
Hugo Férée received the Ackermann Award for his PhD thesis “Complexité d'ordre supérieur et analyse récursive” (higher-order complexity and recursive analysis).
Functional Description
Codisasm is a new disassembly program which supports self-modifying code and code overlapping. To our knowledge, it is the first disassembler that copes with both these aspects of program obfuscation. The tool is based on the notion of “wave” developed in the group.
It is written in C and contains about 3k lines of code.
Contact: Fabrice Sabatier
Functional Description
DynamicTracer is a new tool with a public web interface which provides run traces of executable files. The trace is obtained by recording a dynamic execution in a safe environment. It contains instruction addresses, instruction opcodes and other optional information.
It is written in C++ and contains about 2.5k lines of code.
Contact: Fabrice Sabatier
Functional Description
Gorille (formerly MMDEX) is a virus detector based on morphological analysis. It is composed of our own disassembler, a graph transformer and a specific tree-automaton implementation. The tool is used in the EU Fiware project and by some other partners (e.g., the DAVFI project).
It is written in C and contains about 100k lines of code.
APP License, IDDN.FR.001.300033.000.R.P.2009.000.10000, 2009.
Contact: Philippe Antoine
Complexity of stream functions and higher-order complexity. We have pursued our work on higher-order complexity and the complexity of stream functions. The two notions are closely related, as any function from natural numbers to natural numbers can be seen as a stream (an infinite list) of natural numbers:
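Concretely, this correspondence can be sketched with lazy generators: applying a function pointwise to 0, 1, 2, ... yields the associated infinite stream (an illustrative sketch only):

```python
from itertools import count, islice

def as_stream(f):
    """View a function f : N -> N as the infinite stream f(0), f(1), f(2), ..."""
    return (f(n) for n in count())

# The squaring function, seen as the stream 0, 1, 4, 9, 16, ...
squares = as_stream(lambda n: n * n)
first_five = list(islice(squares, 5))
```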
A characterization of the class of Basic Feasible Functionals using term rewrite systems on streams and interpretation methods has been proposed in . This result is part of Hugo Férée's PhD thesis for which he has obtained the Ackermann award.
In , we have provided interpretation criteria useful to ensure two kinds of stream properties: space upper bounds and input/output upper bounds. Our space upper bound criterion ensures global and local upper bounds on the size of each output stream element, expressed in terms of the maximal size of the input stream elements. The input/output upper bound criterion instead considers the relation between the number of elements read from the input stream and the number of elements produced on the output stream.
The paper extends the light affine lambda calculus with inductive and coinductive data types, using the category-theoretic notions of (weak) initial algebra and coalgebra.
Complexity analysis of Object-Oriented programs. We have proposed a type system based on non-interference and data ramification (tiering) principles in , capturing the set of functions computable in polynomial time by OO programs. The studied language is general enough to cover most OO constructs, and the characterization is quite expressive, as it allows the analysis of a combination of imperative loops and of a data ramification scheme based on Bellantoni and Cook's safe recursion function algebra.
Rice-like theorem for primitive recursive functions.
We have studied the following question: which properties of primitive recursive functions are decidable (by a Turing machine), given a primitive recursive presentation of the function? We give a complete characterization of these properties, and show that they can be expressed as unions of elementary properties of being compressible. If
Parametrization of geometric figures. During the master internship of Diego Nava Saucedo, we have studied the semi-computability of geometric figures. A figure is semi-computable if there is a program that semi-decides whether a pixel intersects the figure. Our goal is to understand the semi-computability of a figure in terms of the parameters describing the figure. It turns out that the usual ways of parameterizing simple figures such as triangles, squares or disks do not behave well in terms of semi-computability. We have actually proved that no finite parametrization behaves well.
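The notion of semi-deciding whether a pixel meets a figure can be sketched by a dyadic search for a witness point; the example below handles an open disk with rational parameters. It is a brute-force illustration with an artificial depth bound (a genuine semi-decision procedure would keep searching unboundedly, and may diverge when there is no witness):

```python
from fractions import Fraction

def pixel_meets_open_disk(px, py, size, cx, cy, r, depth=6):
    """Semi-decide whether the closed pixel [px, px+size] x [py, py+size]
    meets the open disk of center (cx, cy) and radius r (all rationals).
    Samples dyadic grid points of increasing resolution and returns True
    as soon as a witness strictly inside the disk is found."""
    for k in range(depth):
        n = 2 ** k
        for i in range(n + 1):
            for j in range(n + 1):
                x = px + Fraction(i, n) * size
                y = py + Fraction(j, n) * size
                if (x - cx) ** 2 + (y - cy) ** 2 < r * r:
                    return True   # witness point found: the pixel meets the disk
    return False  # no witness up to the depth bound (not a genuine "no" answer)
```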
Symbolic Dynamics on Groups. In an effort to better understand the interplay of geometry and computability in tiling theory, E. Jeandel has studied tiling problems on general Cayley graphs and has obtained a significant number of new results. He has proven that groups with a (strongly) aperiodic tiling system have a decidable word problem , provided examples of new groups (in particular monster groups) with such tiling systems, and proved that all nontrivial nilpotent groups have an aperiodic tiling system and an undecidable domino problem . He also showed how the new concept of translation-like actions, from geometric group theory, can be used to prove that many groups, in particular the Grigorchuk groups and most groups with a nontrivial center, have an undecidable domino problem .
The smallest aperiodic tileset. In joint work with Michael Rao, E. Jeandel has proven that there exists an aperiodic set of 11 Wang tiles , and furthermore that this number is optimal.
On Weak Odd Domination and Graph-based Quantum Secret Sharing. In this work published in the journal Theoretical Computer Science , Simon Perdrix and his co-authors Sylvain Gravier, Jérôme Javelle and Mehdi Mhalla study weak odd domination in graphs and its application to quantum secret sharing. A weak odd dominated (WOD) set in a graph is a subset B of vertices for which there exists a set of vertices C, disjoint from B, such that every vertex in B has an odd number of neighbors in C. They point out the connections of weak odd domination with odd domination, [σ,ρ]-domination, and perfect codes. They introduce bounds on κ(G), the maximum size of WOD sets of a graph G, and on κ′(G), the minimum size of non-WOD sets of G. Moreover, they prove that the corresponding decision problems are NP-complete. The study of weak odd domination is mainly motivated by the design of graph-based quantum secret sharing protocols: a graph G of order n corresponds to a secret sharing protocol whose threshold is κQ(G)=max(κ(G),n−κ′(G)). These graph-based protocols are very promising in terms of physical implementation; however, all such protocols studied in the literature have quasi-unanimity thresholds (i.e., κQ(G)=n−o(n), where n is the order of the graph G underlying the protocol). In this paper, they show, using probabilistic methods, the existence of graphs with smaller κQ (i.e., κQ(G)≤0.811n, where n is the order of G). They also prove that deciding, for a given graph G, whether κQ(G)≤k is NP-complete, which means that one cannot efficiently verify that a randomly generated graph actually has κQ smaller than 0.811n.
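The WOD definition can be checked directly by brute force on small graphs (the paper shows the corresponding decision problems are NP-complete, so an exponential search is essentially unavoidable in general); this checker is a hypothetical illustration, not the authors' code:

```python
from itertools import combinations

def is_wod(vertices, edges, B):
    """Brute-force test of weak odd domination: is there a set C of
    vertices, disjoint from B, such that every vertex of B has an odd
    number of neighbours in C?  Exponential in the number of vertices."""
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    outside = [v for v in vertices if v not in B]
    for r in range(len(outside) + 1):
        for C in combinations(outside, r):
            Cs = set(C)
            if all(len(adj[v] & Cs) % 2 == 1 for v in B):
                return True   # witness C found: B is weak odd dominated
    return False
```

On the path 0-1-2, the set {0} is WOD (take C = {1}), whereas on a single edge 0-1 the set {0, 1} is not, since the only candidate C is empty.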
Minimum Degree up to Local Complementation: Bounds, Parameterized Complexity, and Exact Algorithms.
In this work presented at ISAAC , David Cattaneo and Simon Perdrix introduce new upper bounds and exact algorithms for the local minimum degree. The authors also prove the W[2]-membership of the corresponding decision problem. The local minimum degree of a graph is the minimum degree that can be reached by means of local complementations. For any n, there exist graphs of order n which have a local minimum degree at least
The ZX Calculus is incomplete for Clifford+T quantum mechanics. The ZX calculus is a diagrammatic language for quantum mechanics and quantum information processing. In this paper, Simon Perdrix and Harny Wang prove that the ZX calculus is not complete for Clifford+T quantum mechanics. The completeness for this fragment had been stated as one of the main current open problems in categorical quantum mechanics. The ZX calculus was known to be incomplete for quantum mechanics as a whole; on the other hand, it had been proved complete for Clifford quantum mechanics (a.k.a. stabilizer quantum mechanics), and for single-qubit Clifford+T quantum mechanics. The question of the completeness of the ZX calculus for Clifford+T is a crucial step in the development of the ZX calculus because of its (approximate) universality for quantum mechanics (i.e., any unitary evolution can be approximated using Clifford and T gates only). They exhibit an equation which is known to hold in Clifford+T quantum mechanics and prove that it cannot be derived in the ZX calculus, by introducing a new sound interpretation of the ZX calculus in which this particular equation does not hold. Finally, they propose to extend the language with a new axiom. These results were presented in invited talks at the conferences "Quantum Theory: from foundations to technologies" in Växjö, Sweden, and "Higher TQFT and categorical quantum mechanics" at the Erwin Schrödinger Institute in Vienna. The authors also presented these results at the workshop of the CNRS groupe de travail Informatique Quantique of the GDR IM, in Grenoble.
Block Representation of Reversible Causal Graph Dynamics. In this work presented at the conference on Fundamentals of Computation Theory (FCT'15) , Pablo Arrighi, Simon Martiel and Simon Perdrix consider a reversible version of causal graph dynamics. Causal graph dynamics extend cellular automata to arbitrary, bounded-degree, time-varying graphs. The whole graph evolves in discrete time steps, and this global evolution is required to have a number of physics-like symmetries: shift-invariance (it acts everywhere in the same way) and causality (information has a bounded speed of propagation). They study a further physics-like symmetry, namely reversibility, and show that reversible causal graph dynamics can be represented as finite-depth circuits of local reversible gates.
Reversibility in the Extended Measurement-based Quantum Computation. This work by Nidal Hamrit and Simon Perdrix was presented at the conference on Reversible Computation in Grenoble . When applied to some particular entangled quantum states, measurements are universal for quantum computing. In particular, despite the fundamentally probabilistic evolution of quantum measurements, any unitary evolution can be simulated by a measurement-based quantum computer (MBQC). They consider the extended version of MBQC, where each measurement can occur not only in the X,Y-plane of the Bloch sphere but also in the X,Z- and Y,Z-planes. The existence of a gflow in the underlying graph of the computation is a necessary and sufficient condition for a certain kind of determinism. They extend the focused gflow (a gflow in a particular normal form), defined for the X,Y-plane, to the extended case, and provide necessary and sufficient conditions for the existence of such normal forms.
Quantum Circuits for the Unitary Permutation Problem.
In this paper presented at TAMC'15 , Stefano Facchini and Simon Perdrix consider the Unitary Permutation problem, which consists, given
Simon Perdrix is the principal investigator of the project “measurement-based quantum computing” funded by Région Lorraine and Université de Lorraine.
The team is a funding partner in ANR Elica (2014-2019), "Élargir les idées logiques pour l'analyse de complexité" (expanding logical ideas for complexity analysis). The Carte team is well known for its expertise in implicit computational complexity.
The team is a funding partner in ANR Binsec (2013-2017), whose aim is to fill part of the gap between formal methods over executable code, and binary-level security analyses currently used in the security industry. Two main applicative domains are targeted: vulnerability analysis and virus detection. Two other closely related applications will also be investigated: crash analysis and program deobfuscation.
Submission of an Inria associate team proposal, ACRA (Applications of Complexity to Resource Analysis), in collaboration with the Computer Science and Engineering Department of the State University of New York at Buffalo. The French principal investigator is Romain Péchoux; the US principal investigator is Marco Gaboardi.
A Hubert Curien Partnership (PHC Imhotep), from the French Ministry of Foreign Affairs and with the support of the French Ministry of National Education and the Ministry of Higher Education and Research, holds between members of EPC Carte and Alexandria E-Just University.
Foundations of Quantum Computation: Syntax and Semantics (FoQCoSS), Regional Program STIC-AmSud. This 2-year project was accepted in late 2015. The Argentinian-Brazilian-French consortium consists of: Pablo ARRIGHI (Université Aix-Marseille, France), Alejandro DIAZ-CARO (Universidad Nacional de Quilmes, Argentina), Gilles DOWEK (Inria, France), Juliana KAIZER VIZZOTTO (Universidade Federal de Santa Maria, Brazil), Simon PERDRIX (CNRS/Carte, France) and Benoît VALIRON (CentraleSupélec – LRI, France). The ultimate goal of this project is to study the foundations of quantum programming languages and related formalisms, covering topics such as parallelism, probabilistic systems, and isomorphisms. The interest goes beyond having a working programming language for quantum computing; we are interested, on the one hand, in its individual characteristics and its consequences for classical systems, and, on the other hand, in its implications for the foundations of quantum physics.
Walid Gomaa, associate professor at Alexandria E-Just University, visited the team for two months (April and November).
Daniel Leivant, professor at Indiana University in Bloomington, was invited in June and July.
Mizuhito Ogawa visited the group to discuss models of self-modifying code based on pushdown automata. He came back in October for further collaboration.
Guillaume Bonfante and Jean-Yves Marion participated in the organization of the partnership meeting between JAIST and LORIA in Nancy, October 2015.
Simon Perdrix was a member of the organizing committee of the Workshop of the Groupe de Travail Informatique Quantique in Grenoble, 26-27 November 2015.
Simon Perdrix was a member of the organizing committee of the Workshop of the Groupe de Travail GeoCal-LAC-LTP in Nancy, 12-14 October 2015.
Guillaume Bonfante was co-chair of the 8th International Symposium on Foundations and Practice of Security (FPS 2015).
Guillaume Bonfante was on the Program Committee of the Protection and Reverse Engineering Workshop 2015.
Guillaume Bonfante was on the Program Committee of the workshop on Logic and Computational Complexity (LCC) 2015.
Emmanuel Jeandel was on the Program Committee of Computability in Europe (CiE) 2015.
Jean-Yves Marion was on the Program Committee of the 8th International Symposium on Foundations and Practice of Security (FPS 2015).
Jean-Yves Marion was on the Program Committee of the Protection and Reverse Engineering Workshop 2015.
Romain Péchoux was on the Program Committee of Foundational and Practical Aspects of Resource Analysis (FOPARA) 2015.
Simon Perdrix was on the Program Committee of the Asian Quantum Information Science Conference (AQIS) 2015.
Simon Perdrix is on the Program Committee of Quantum Physics and Logic (QPL'16), Glasgow, June 2016.
Mathieu Hoyrup reviewed articles for:
MFCS 2015
STACS 2015
Emmanuel Jeandel reviewed articles for:
MFCS 2015
STACS 2015
Romain Péchoux reviewed articles for:
FOPARA 2015
ISMVL 2015
FOSSACS 2016
Simon Perdrix reviewed articles for:
AQIS 2015
ICALP 2015
LICS 2015
QPL 2016
Emmanuel Hainry reviewed articles for:
Theoretical Computer Science
Applicable Analysis and Discrete Mathematics
Mathieu Hoyrup reviewed articles for:
Journal of Symbolic Logic
Information and Computation
Mathematical Structures in Computer Sciences
Logical Methods in Computer Science
Theory of Computing Systems
Emmanuel Jeandel reviewed articles for:
Discrete Mathematics and Theoretical Computer Science
Ergodic Theory and Dynamical Systems
Romain Péchoux reviewed articles for:
Computability - Journal of the association Computability In Europe
Information and Computation
Simon Perdrix reviewed articles for:
Quantum Information and Computation
Guillaume Bonfante was invited to give a talk on implicit complexity within NC at the Shonan Meeting on Low Level Code Analysis and Application to Computer Security.
Mathieu Hoyrup was invited to give a talk at the annual workshop Continuity, Computability, Constructivity (CCC 2015) in Kochel am See, Germany, September 2015.
Simon Perdrix was invited to give a talk:
“The ZX Calculus is incomplete for Clifford+T quantum mechanics”, at "Quantum Theory: from foundations to technologies – QTFT", Växjö, Sweden, June 2015,
“Supplementary of Interacting Frobenius Algebras”, at the Workshop on “Higher topological quantum field theory and categorical quantum mechanics”, Erwin Schrödinger International Institute, Vienna, October 2015,
“Informatique quantique et théorie des graphes” (quantum computing and graph theory), at Journées Graphes et Algorithmes 2015, Orléans, November 2015.
Romain Péchoux is an external expert for the European Commission's Horizon 2020 program.
Isabelle Gnaedig is:
vice-leader of the team Carte.
Emmanuel Hainry is:
member of the CNU (Conseil National des Universités), Section 27.
organizer of the Carte Seminar.
Mathieu Hoyrup is:
principal investigator of a PHC Imhotep with Walid Gomaa (Alexandria E-Just University).
organizer of the Formal Methods Seminar at Loria.
Romain Péchoux is:
responsible for the Project-team Carte activity report 2015.
principal investigator of the Inria associate team proposal ACRA.
Simon Perdrix is:
responsible for the GT IQ (groupe de travail Informatique Quantique) of the CNRS GdR IM (groupe de recherche Informatique Mathématique).
Unless explicitly stated, the teachings below are given at Université de Lorraine.
Licence:
Guillaume Bonfante
Java, L3, Mines Nancy
Emmanuel Hainry
Operating Systems, 30h, L1, IUT Nancy Brabois
Algorithmics, 40h, L1, IUT Nancy Brabois
Dynamic Web, 60h, L1, IUT Nancy Brabois
Databases, 30h, L1, IUT Nancy Brabois
Object Oriented Languages, 12h, L2, IUT Nancy Brabois
Complexity, 30h, L2, IUT Nancy Brabois
Mathieu Hoyrup
C Programming, 15h, L1 PACES
Java Programming, 56h, L1, IUT Charlemagne
Emmanuel Jeandel
Algorithmics and Programming 1, 60h, L1 Maths-Info
Algorithmics and Programming 4, 30h, L3 Informatique
Modelling Using Graph Theory, 30h, L3 Informatique
Networking, 15h, L3 Informatique
Data Compression, 45h, L2 Informatique
Romain Péchoux
Object-Oriented Programming, 61.5h, L3 MIASHS
Object-Oriented Programming, 53.5h, L2 MIASHS
Logical Tools for Computer Science, 35h, L1 MIASHS
Databases, 40h, L3 Sciences de la Gestion
Algorithmic complexity, 30h, L3 MIAGE, IGA Casablanca, Morocco.
Simon Perdrix
Data Structures, 72h, L1, IUT Charlemagne
Modelling Using Graph Theory, 15h, L3 Informatique
Master:
Guillaume Bonfante
Modelling and UML, M1, Mines Nancy
Video Games, M1, Mines Nancy
Semantics, M1, Mines Nancy
Safety of Software, M2, Mines Nancy
Isabelle Gnaedig
Design of Safe Software, Coordination of the module, M2, Telecom-Nancy
Rule-based Programming, 20h, M2, Telecom-Nancy
Emmanuel Hainry
Implicit Complexity, 15h, M2 Informatique
Emmanuel Jeandel
Algorithmics and Complexity, 30h, M1 Informatique
Combinatorial Optimization, 36h, M1 Informatique
Romain Péchoux
Mathematics for computer science, 30h, M1 SCA
Advanced Java, 52,5h, M1 MIAGE
Implicit Complexity, 15h, M2 Informatique
PhD in progress: Hubert Godfroy, Semantics of Self-modifying Programs, Jean-Yves Marion (director).
PhD in progress: David Cattanéo, Combinatorial Modeling in Quantum Computation and Generalized Cover Problems, Pablo Arrighi (director), Simon Perdrix (co-advisor).
PhD: Thanh Dinh Ta, 11 May 2015, Malware Algebraic Modeling and Detection, started Sept. 2010, Jean-Yves Marion (director) and Guillaume Bonfante (co-advisor).
PhD: Aurélien Thierry, 11 March 2015, Morphological Analysis of Malware, Jean-Yves Marion (director).
PhD in progress: Paul Bakouche, Mesure de complexité en topologie de petites dimensions, Florian Deloup (co-advisor) and Guillaume Bonfante (director).
Isabelle Gnaedig was:
member of the Inria hiring committee for young researchers,
member of the Telecom-Nancy engineering school admission committee.
Emmanuel Jeandel was:
Reviewer and examiner of Simon Martiel's PhD defense on "Approches informatique et mathématique des dynamiques causales de graphes", defended at Université Nice Sophia Antipolis, July 6th, 2015.
Reviewer of Ilkka Törmä's PhD on "Structural and Computational Existence Results for Multidimensional Subshifts", defended in Turku (Finland), July 31st, 2015.
Reviewer of Rodrigo Torres' PhD on "Some Dynamical Properties of Turing Machines Dynamical Models", to be defended in January 2016 in Santiago de Chile (Chile).
Isabelle Gnaedig is a member of the science popularization committee of Inria Nancy - Grand Est. This committee is a steering and guidance body helping the management of the center and the person in charge of popularization events to devise a strategy, organize events, and help researchers get involved in various actions popularizing our research themes and, more generally, computer science and mathematics.
Mathieu Hoyrup wrote an article on his recent work for Images des Mathématiques.