SECSI is a joint project between INRIA Futurs and the Laboratoire Spécification et Vérification (LSV), itself a joint research unit of CNRS (UMR 8643) and the École Normale Supérieure (ENS) de Cachan. The team was created in 2001, and became an INRIA project in December 2002.

The SECSI project is a research project on the security of information systems. Originally, SECSI was organized around three main themes, and their mutual relationships:

Automated verification of cryptographic protocols;

Intrusion detection;

Static analysis of programs, in order to detect security holes and vulnerabilities at the protocol level.

This has changed: starting from 2006, SECSI has concentrated on the first theme, while keeping an eye on the other two.

In a nutshell, the aim of the SECSI project is to
*develop logic-based verification techniques for security properties of computer systems and networks*.

The thrust is towards more *automation* (new automata-based, or theorem-proving based verification techniques), more *properties* (not just secrecy or authentication, but e.g., coercion-resistance in electronic voting schemes), and more *realism* (e.g., cryptographic soundness theorems for formal models).

The new objectives of the SECSI project are:

Tree-automata based methods, automated deduction, and approximate/exact cryptographic protocol verification in the Dolev-Yao model.

Enriching the Dolev-Yao model with algebraic theories, and associated decision problems.

Computational soundness of formal models (Dolev-Yao, applied pi-calculus).

Indistinguishability proofs allowing us to handle more properties, e.g. anonymity.

Application to new security protocols, e.g. electronic voting protocols.

Security in the presence of probabilistic and demonic non-deterministic choices.

This section is unchanged from the SECSI 2006 report.

Verification: see model-checking.

Model-checking: a set of automated techniques aiming at ensuring that a formal model of some given computer system satisfies a given specification, typically written as a formula in some adequate logic.

Protocol: a sequence of messages defining an interaction between two or more machines, programs, or people.

Cryptographic protocol: a protocol using cryptographic means, in particular encryption, that attempts to satisfy properties of secrecy, authentication, or other security properties.

Computer security has become a more and more pressing concern since the mid-1990s. There are several reasons for this: cryptography is no longer a *chasse réservée* (exclusive preserve) of the military, and has become ubiquitous; and computer networks (e.g., the Internet) have grown considerably and have generated numerous opportunities for attacks and misbehavior.

The aim of the SECSI project is to
*develop logic-based verification techniques for security properties of computer systems and networks*. Let us explain what this means, and what this does not mean.

First, the scope of the research at SECSI is a rather broad subset of computer security, although the core of SECSI's activities is on verifying cryptographic protocols. The SECSI group has tried to be as comprehensive as possible. Several security properties have been the focus of SECSI's research: weak and strong secrecy, authentication, anonymity, and fairness in contract signing, notably. Several models, too: the Dolev-Yao model initially, but also process algebra models (spi-calculus, applied pi-calculus), and, more recently, the more realistic computational models favored by cryptographers. Several input formats, finally: either symbolic descriptions of protocols à la Needham-Schroeder, or programs that actually implement cryptographic protocols.

Apart from cryptographic protocols, the vision of the SECSI project is that computer security, being a global concern, should be taken as a whole, as far as possible. This is why one of the initial objectives of SECSI was also concerned with problems in intrusion detection, notably.

However, the aims of any project, including SECSI, have to be circumscribed somewhat. One of the key points in the aim of the SECSI project, stated above, is “logic-based”. SECSI aims at developing rigorous approaches to the verification of security. But the expertise of the members of SECSI is not in, say, numerical analysis or the quantitative evaluation of degrees of security, but in formal methods in logic. It is a founding theme of SECSI that logic matters in security, and that opportunities are there to be seized. This was definitely the case for the verification of cryptographic protocols. This was also the case for intrusion detection, where an original model-checking based approach to misuse detection was developed.

Then, another important point is “verification techniques”. The expertise of SECSI is not so much in designing protocols. Verifying protocols, formally, is a rather more arduous task. It is also particularly needed in cryptographic protocol security, where many protocols were flawed, despite published proofs.

Automated cryptographic protocol verification is certainly *the* main theme of SECSI. While it was already the theme that kept most SECSI members busy at the time SECSI was created (2002), one might say that, as of 2006, all SECSI members work on it. Accordingly, this theme was naturally subdivided into new objectives.

Tree-automata based methods, automated deduction, and approximate/exact cryptographic protocol verification in the Dolev-Yao model.

Enriching the Dolev-Yao model with algebraic theories, and associated decision problems.

Computational soundness of formal models (Dolev-Yao, applied pi-calculus).

Indistinguishability proofs allowing us to handle more properties, e.g. anonymity.

Application to new security protocols, e.g. electronic voting protocols.

Security in the presence of probabilistic and demonic non-deterministic choices.

The various efforts of the SECSI team are united by the reliance on *logic* and rigorous methods. As already said above, SECSI does not do any cryptology per se.

As far as cryptographic protocol verification is concerned, one popular kind of model is that of Dolev and Yao (after , see for a survey), where: the intruder can read and write on every communication channel, and in effect has full control over the network; the intruder may encrypt, decrypt, build and destruct pairs, as many times as it wishes; and, finally, cryptographic means are assumed to be *perfect*. The latter in particular means that the only way to compute the plaintext M from the ciphertext {M}_{K} is to decrypt the latter using the inverse key K^{-1}. It also means that no ciphertext can be confused with any message that is not a ciphertext, and that {M}_{K} = {M'}_{K'} implies M = M' and K = K'. Thus, messages can be simply encoded as first-order terms, a fact which has been used by many authors. This “perfect cryptography” model has been extended to algebraic properties of primitives (see for a survey), which was one of the main themes of the RNTL project PROUVÉ.
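The encoding of messages as first-order terms makes the intruder's capabilities easy to mechanize. As a purely illustrative sketch (our own toy Python code, not one of the SECSI tools; it performs only decomposition over a finite set, while real procedures also handle message synthesis and unbounded terms), the Dolev-Yao deduction rules can be run as a closure computation:

```python
# Terms: atoms as strings, ('pair', a, b), ('enc', m, k), ('inv', k).

def inv(k):
    # involutive key inverse: inv(inv(k)) = k
    return k[1] if isinstance(k, tuple) and k[0] == 'inv' else ('inv', k)

def saturate(knowledge):
    """Close a finite set of terms under the Dolev-Yao decomposition
    rules: projection of pairs, and decryption when the inverse key
    is known.  The closure stays finite since we only take subterms."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            if isinstance(t, tuple):
                if t[0] == 'pair':                      # both projections
                    new = {t[1], t[2]}
                elif t[0] == 'enc' and inv(t[2]) in known:
                    new = {t[1]}                        # decrypt {m}_k with k^{-1}
                else:
                    new = set()
                if not new <= known:
                    known |= new
                    changed = True
    return known

# Knowing {m}_k and the inverse key reveals m; without it, m stays secret.
assert 'm' in saturate({('enc', 'm', 'k'), ('inv', 'k')})
assert 'm' not in saturate({('enc', 'm', 'k')})
```

Perfect cryptography is visible in the code: the only rule producing a plaintext is decryption with the exact inverse key.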

As soon as cryptography has been abstracted using a term algebra, first-order logic is relevant to security proofs: security proofs can be tackled from the automata-theoretic point of view or using automated deduction. In SECSI we contributed (and continue to contribute) to this line of research, designing strategies and decision methods, e.g. , .

The thrust here is on
*more automation*.

It was slightly less clear in 2002 that the Dolev-Yao model required some definite extensions, in particular allowing for terms to be interpreted modulo some equational theory—the so-called *algebraic* case. (But also to properly handle specific code chaining techniques .) Typical examples of theories of interest are modular exponentiation over a fixed generator g (application: Diffie-Hellman-like protocols) or that of bitwise exclusive-or . The PhD theses of Roger , Verma , and Cortier display early (and influential!) research in this area. More recent theses in SECSI are those of Delaune , Lafourcade and Bernat . Cortier's thesis—which contains much more material than we can describe—was awarded the SPECIF best PhD thesis award in 2003, and the Le Monde academic research prize in 2004. Delaune's thesis, funded by a CIFRE grant with France Télécom, was awarded the “mention thèse remarquable” by France Télécom.
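To see why such theories matter, consider bitwise exclusive-or. The following toy Python normalizer (our own, purely illustrative) applies associativity, commutativity, x ⊕ x = 0 and x ⊕ 0 = x; modulo these equations an intruder can cancel a known mask, a deduction that is invisible in the free algebra:

```python
from collections import Counter

def xor_normalize(factors):
    """Normal form of a XOR of atoms modulo associativity, commutativity,
    x + x = 0 and x + 0 = x: keep the atoms with an odd occurrence count."""
    counts = Counter(factors)
    return frozenset(a for a, n in counts.items() if n % 2 == 1)

def xor_combine(t1, t2):
    # XOR-ing two normalized terms is normalizing their concatenation
    return xor_normalize(list(t1) + list(t2))

# A message m is masked with a key k: the attacker observes m + k.
observed = xor_normalize(['m', 'k'])
# If k later leaks, XOR-ing it back cancels the mask and reveals m:
assert xor_combine(observed, xor_normalize(['k'])) == frozenset({'m'})
# In the free algebra, the terms xor(m, k) and k would never recombine.
```

Decision procedures modulo such theories must build this cancellation into deduction, which is what makes the algebraic case substantially harder than the free case.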

Following all these bright PhD theses, the main activities and results of SECSI during the period 2003–2006 were devoted to these more accurate formal models of cryptography. This resulted in several decision procedures or impossibility results (see for instance , , , ).

Nowadays, we continue to work in this area, for instance following an electronic purse case study from France Télécom . The main focus is however on extending the results to other security properties (see Section ) and combining theories, such as in , . Moreover, it is important to consider protocols in their context. For instance, a key distribution protocol can be used to establish a key which is then reused in another protocol. Different protocols reusing the same long-term keys or passwords may be separately secure, but insecure when executed in parallel. Some composition results guaranteeing that parallel composition preserves security properties have already been obtained in , , .

The thrust here is on
*more realism*, and
*more automation*.

One desirable goal that seemed totally out of reach in 2002 is to relate the Dolev-Yao notion of security, possibly in the algebraic case, to more realistic notions of security as used in the cryptographic community (e.g., IND-CPA and IND-CCA security). The latter define security as resistance to probabilistic polynomial-time attackers, while the Dolev-Yao models overlook any computational constraints. In other words, cryptographic security is about actual computers running attacks, and being unable to gain any significant advantage while interacting with your protocol.
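The gap between the two styles of definition can be seen in the shape of the computational experiment itself. The following Python sketch (a toy rendition of the core indistinguishability game, without the encryption oracles of the full IND-CPA definition; the scheme and adversary are hypothetical stand-ins) shows the probabilistic machinery that Dolev-Yao models abstract away:

```python
import secrets

def ind_cpa_experiment(encrypt, adversary, key_len=16):
    """One run of the basic indistinguishability game: the adversary
    submits two equal-length messages, receives the encryption of one
    of them chosen by a secret coin b, and outputs a guess for b."""
    key = secrets.token_bytes(key_len)
    m0, m1 = adversary.choose_messages()
    b = secrets.randbelow(2)
    challenge = encrypt(key, m1 if b else m0)
    return adversary.guess(challenge) == b

def otp_encrypt(key, m):
    # one-time pad with a fresh key per run: the challenge is uniformly
    # distributed whichever message was chosen
    return bytes(x ^ y for x, y in zip(key, m))

class BlindAdversary:
    """A hypothetical adversary with no strategy at all."""
    def choose_messages(self):
        return b'attack at dawn!!', b'attack at dusk!!'
    def guess(self, challenge):
        return 0

# Against the one-time pad no adversary wins significantly more than half
# the time; the advantage |Pr[win] - 1/2| is what must stay negligible.
wins = sum(ind_cpa_experiment(otp_encrypt, BlindAdversary())
           for _ in range(1000))
assert 400 < wins < 600
```

Computational soundness theorems show that, under suitable assumptions on the primitives, a Dolev-Yao proof implies that every such polynomial-time adversary has negligible advantage.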

Abadi and Rogaway initiated work in this domain , dealing with a constrained case of security against passive attackers. The domain has flourished in recent years, and SECSI is taking an active part in it, as part of the ARA SSIA Formacrypt project, whose members include Martín Abadi and Bruno Blanchet. A more recent French-Japanese collaboration also continues this research theme. One early paper on this topic is . Laurent Mazaré, a PhD student of Yassine Lakhnech on these themes, spent 6 months as a postdoc at SECSI and worked actively on the connection between formal and computational models in the presence of bilinear maps, an emerging fundamental tool in extensions of Diffie-Hellman-like protocols, among others (best paper at WITS'07 ). Other results include the soundness of formal methods in the case of adaptive attacks , and soundness and decidability results in a framework meant to deal with off-line guessing attacks, but reaching far beyond . Recently, Comon-Lundh and Cortier have shown that observational equivalence in the applied pi-calculus implies computational indistinguishability, which had been an open question for several years. Their result implies soundness of properties such as anonymity and strong secrecy modelled in terms of observational equivalence.

Objective 1.3 is quite probably the hottest topic for the years to come as far as verification of cryptographic protocols is concerned.

The thrust here is on *more realism*. However, the purpose of FormaCrypt, and of SECSI in particular, is to relate cryptographic approaches to mechanizable formal approaches, hence *more automation* is also sought after in this field.

Most of the research in activities 1.1, 1.2, 1.3 is mainly concerned with rather traditional security properties, namely secrecy or authentication—in general, (un)reachability properties. However, in cryptography many properties are formulated as indistinguishability properties.

*Strong* notions of secrecy are not reachability properties, and in fact are not trace properties. Rather, they are characterized using contextual equivalences. A notion of bisimulation complete for contextual equivalence in the spi-calculus was found by Cortier . The cryptographic results of relate cryptographic security to *static equivalence*, a form of contextual equivalence well-suited to passive adversaries, introduced in Abadi and Fournet's applied pi-calculus . Notions of strong security and contextual equivalence have also been studied in the framework of higher-order computation (a lambda-calculus with name creation and cryptographic primitives) by Zhang, using Kripke logical relations , , . Zhang's thesis was awarded the 2006 prize of the AFCRST (French-Chinese Association for Scientific and Technical Research). Other examples of indistinguishability properties that we have studied are privacy-related properties such as those appearing in electronic voting protocols and offline guessing attacks .

In SECSI, we have been working on decision procedures, combination and composition results for such equivalence properties. In particular, decision procedures for many equational theories , , , , as well as combination and composition results, have been achieved for static equivalence. In the active case we are also working on symbolic methods for deciding observational equivalences , .

The thrust is on *more properties* and *more automation*.

In addition to classical, academic protocols, such as those presented in the “Clark Jacob library” , we have applied our methods to other protocols and classes of protocols, which often require modeling new properties.

In this vein other properties and other protocols were studied:

Anonymity properties and electronic voting

Electronic voting schemes require the voter to be unable to prove his vote to a bully, a property named *receipt-freeness* in the passive case and *coercion-resistance* in the more demanding active case . Anonymity, privacy, unlinkability, and in general all opacity properties are also the topic of objective 1.4.

Security APIs

*Security APIs* allow untrusted code to access sensitive resources in a secure way. A security API provides an interface between a trusted component, such as a smart card or cryptographic security module, and the untrusted outside world, such that no matter what sequence of commands in the interface is called, and no matter what the parameters, certain 'good' properties will continue to hold, e.g. the secret long-term keys on the smartcard are never revealed. Analysis of security APIs is a new theme which has recently started in SECSI with the arrival of Graham Steel. First results on the widely deployed standard PKCS#11 were presented in .
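The flavor of such API-level attacks can be conveyed by a toy model (our own Python simplification, not a SECSI tool): a key handle that carries both the wrap and decrypt attributes lets the outside world extract a sensitive key with two perfectly legal calls, the classic wrap/decrypt attack on PKCS#11-style interfaces:

```python
class ToyToken:
    """Toy model of a PKCS#11-like device.  Key values live inside the
    token; the API is only supposed to release them encrypted (wrapped)."""

    def __init__(self):
        # handle -> (secret key value, attribute set)
        self.keys = {
            'h1': ('k1', {'wrap', 'decrypt'}),   # misconfigured: both roles
            'h2': ('k2', {'sensitive'}),         # must never leave in clear
        }

    def wrap(self, wrapping_handle, target_handle):
        """Export the target key encrypted under the wrapping key."""
        kw, attrs = self.keys[wrapping_handle]
        assert 'wrap' in attrs
        kt, _ = self.keys[target_handle]
        return ('enc', kt, kw)

    def decrypt(self, handle, ciphertext):
        """Decrypt an arbitrary ciphertext under the designated key."""
        k, attrs = self.keys[handle]
        assert 'decrypt' in attrs
        tag, payload, key = ciphertext
        assert tag == 'enc' and key == k
        return payload

# Two legal API calls reveal the sensitive key k2 in the clear:
token = ToyToken()
blob = token.wrap('h1', 'h2')              # {k2}_{k1}, a legitimate export
assert token.decrypt('h1', blob) == 'k2'   # ... decrypted by the same token
```

The formal analyses cited above explore exactly such reachable states, over the real attribute and command set of the standard rather than this two-key caricature.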

Password-based protocols

*Guessing attacks* are attacks where a weak secret can be guessed, e.g. by brute-force enumeration (passwords). Some protocols use passwords but are still immune to guessing attacks , , and a general decision procedure was proposed by Baudet in the (realistic) offline case, using a definition of security based on static equivalence.

Group protocols

Secrecy and authentication properties were examined in the challenging case of group protocols. See Roger's PhD thesis , and the paper . Antoine Mercier has started a PhD thesis on security properties of group protocols with Ralf Treinen and Steve Kremer, Fall 2006. First results on secrecy for an unbounded number of participants were presented in .

Electronic purse

We have worked on a challenging case study of an electronic purse protocol which was provided by France Télécom in the RNTL project PROUVÉ. The protocol relies on algebraic properties of a fragment of arithmetic, typically containing modular exponentiation. This case study motivated work on Associative-Commutative deducibility constraints and gave rise to new decidability results , .

Fair exchange and contract signing protocols

Boisseau studied contract-signing protocols (see his PhD thesis ); Kremer studied optimistic multi-party contract-signing protocols , and fair exchange protocols , where one of the crucial properties is *fairness* (none of the signers can prove the contract signed to a third party while the other has not yet signed), not secrecy.

Overall, objective 1.5 differs from the other objectives in providing a source of sundry exciting perspectives (other properties, other protocols, other models).

The thrust is on *more properties* and *more realism*, while *more automation* is still a running concern.

While objective 1.3 (computational soundness) is important to reach the SECSI goal of *more realism*, i.e., to show that security proofs in formal models have realistic implications, one will also have to consider some protocols for which no formal model exists that is solely based on logic. This is the case for protocols whose security depends on probabilities, for example. The paradigmatic example is Chaum's dining cryptographers, whereby N agents try to determine whether one of them paid, while not revealing the identity of the payer with any non-negligible probability. Chaum's protocol involves flipping coins, and any bias in coin-flipping is known to result in possible attacks.
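Chaum's protocol itself is short enough to simulate. In this minimal Python sketch (the function name and interface are ours), each adjacent pair of the N cryptographers shares a fair coin; everyone announces the XOR of the two coins they see, flipped if they paid, and the XOR of all announcements reveals whether someone paid, without revealing who:

```python
import random

def dining_cryptographers(n, payer=None):
    """One round of Chaum's dining cryptographers protocol for n >= 3
    participants; `payer` is the index of the paying cryptographer,
    or None if nobody at the table paid."""
    # coin i is shared between cryptographers i and i+1 (mod n)
    coins = [random.randint(0, 1) for _ in range(n)]
    announcements = []
    for i in range(n):
        bit = coins[i] ^ coins[(i - 1) % n]   # XOR of the two visible coins
        if i == payer:
            bit ^= 1                          # the payer flips his announcement
        announcements.append(bit)
    result = 0
    for b in announcements:
        result ^= b                           # each coin occurs twice, cancels
    return result                             # 1 iff some cryptographer paid

assert dining_cryptographers(5, payer=2) == 1
assert dining_cryptographers(5) == 0
```

With fair coins, each individual announcement is uniformly distributed, so the transcript reveals nothing about who paid; any bias in the coins breaks this argument, which is exactly the kind of quantitative reasoning a purely logical model cannot express.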

Probabilities are also needed to model realistic notions of anonymity, where the distribution of possible outputs of the protocol should not give any information on the distribution of the inputs. Here, models purely based on logic will miss an important point.

Work in this direction was conducted in 2006–2007 through the INRIA ARC ProNoBis, on finding appropriate models for mixing probabilistic choice and non-deterministic choice. Intuitively, protocols can be seen as the interaction between honest agents, who proceed deterministically or by tossing coins, and attackers, who can be thought of as always choosing the action that will defeat some security objective in the worst way. I.e., attackers run as demonic non-deterministic agents. Finding simple and usable models mixing probabilistic choice and demonic non-determinism is challenging in itself. SECSI is also exploring the possibility of including angelic non-determinism (e.g., specified but not yet implemented behavior from honest agents), and chaotic non-determinism. Finally, these models are explored both from the point of view of transition systems, and model-checking, even in the non-discrete case, and from the point of view of the semantics of programming languages, in particular of Moggi's monadic lambda-calculus.
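The intended mix of choices can be illustrated on a toy game tree (our own hypothetical encoding, far simpler than the domain-theoretical models actually studied): honest probabilistic branching averages, demonic attacker branching maximizes, so a worst-case attack probability falls out of a single bottom-up evaluation:

```python
def worst_case_attack_prob(node):
    """Evaluate a tree mixing probabilistic and demonic choice.
    Nodes are ('prob', [(p, subtree), ...]), ('dem', [subtree, ...]),
    or a leaf: 1.0 if the attack succeeded, 0.0 otherwise."""
    if isinstance(node, (int, float)):
        return float(node)
    kind, branches = node
    if kind == 'prob':       # honest agents: average over coin flips
        return sum(p * worst_case_attack_prob(sub) for p, sub in branches)
    if kind == 'dem':        # attacker: picks the branch worst for us
        return max(worst_case_attack_prob(sub) for sub in branches)
    raise ValueError(kind)

# An attacker guessing the outcome of a (possibly biased) coin: his best
# strategy wins with probability max(p, 1 - p), so any bias is an advantage.
def guess_game(p):
    return ('dem', [('prob', [(p, 1.0), (1 - p, 0.0)]),
                    ('prob', [(p, 0.0), (1 - p, 1.0)])])

assert worst_case_attack_prob(guess_game(0.5)) == 0.5   # fair coin: no edge
assert worst_case_attack_prob(guess_game(0.6)) == 0.6   # biased coin leaks
```

The hard part, which this finite sketch hides, is giving such mixed systems a compositional semantics (and a notion of equivalence) in the infinite, non-discrete case.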

The main originality in this line of work used to be the theory of *convex games* and *belief functions* , which originated in economic circles in the 1950s and in statistics in the 1960s. This evolved into the use of *continuous previsions* , similar to a notion invented in finance by Walley. Most of the required fundamental theoretical results are now established, and practical applications should come by in 2008, e.g., adapting the semantics and results on observational equivalence for the probabilistic applied pi-calculus of .

The thrust here is on
*more properties*, and
*more realism*.

The application domains of SECSI cover a large part of computer security.

Cryptographic protocols are used in more and more domains today, including smart card protocols, enterprise servers, railroad network architectures, secured distributed graphic user interfaces, mobile telephony, on-line banking, on-line merchant sites, pay-per-view video, etc. The SECSI project is not tied to any specific domain as far as cryptographic protocols are concerned. Our industrial partners in this domain are Trusted Logic S.A., France Télécom R&D, and CRIL Technology.

Analyzing cryptographic protocols per se is fine, but a more realistic approach consists in analyzing actual code implementing specific roles of cryptographic protocols, such as `ssh` or `slogin`, which implement the SSL/TLS protocols and are used on every personal computer running Unix today. SECSI pioneered the domain . We collaborate with EADS Innovation Works on analyzing multi-threaded programs.

The SECSI project started in 2002 with a relatively large software basis: tools to parse, translate, and verify cryptographic protocols which are part of the RNTL project EVA (including *CPV*, *CPV2*, *Securify*), a static analysis tool (*CSur*), and an intrusion detection tool (*logWeaver*). These programs were started before SECSI was created.

The SPORE Web page was new in 2002. It is a public and open repository of cryptographic protocols. Its purpose is to collect information on cryptographic protocols, their design, proofs, attacks, at the international level.

2003 and 2004 brought new developments. In intrusion detection, a completely new project has started, which benefited from the lessons learned in the DICO project: faster, more versatile, the ORCHIDS intrusion detection system promises to become the most powerful intrusion detection system around.

In 2005, the development of ORCHIDS reached maturity. ORCHIDS works reliably in practice, and has been used in production on the local network of LSV, ENS Cachan. Several additional sensors have been added, including one based on comparing the statistical entropy of network packets to detect corruption attacks on cryptographic protocols. A tool paper on ORCHIDS was presented at the CAV'2005 international conference, Edinburgh, Scotland .

In 2006-07, a new prototype, NetQi, was initiated to test ideas on predicting network faults and attacks. This consists of two parts. One collects data from a network, and infers dependencies between services, between services and local files, and between local files, for example of the form “if A fails then B may fail”. This uses N-gram based statistical techniques. The other exploits the dependency graphs thus obtained to detect scenarios that would violate some properties in an expressive game logic involving temporal constraints .

The CSur project consisted in developing a static analysis tool able to detect leakage of confidential data from programs written in C. Its design and development covered the period 2002-2004. The main challenge was to properly integrate Dolev-Yao style cryptographic protocol analysis with pointer alias analysis. Once development was over, a paper was published, which explains the techniques used. (A journal version was submitted in June 2005. No news since then.)

The `h1` tool suite was created in 2004 to support the discovery of security proofs, to output corresponding formal proofs in the Coq proof assistant, and also to provide a suite of tools allowing one to manipulate tree automata automatically .

Finally, the PROUVÉ parser library is the analogue, for the PROUVÉ specification language, of the above-mentioned tools of the RNTL project EVA.

The initial purpose of the `h1` tool is to decide Nielson, Nielson and Seidl's decidable class , as well as to provide an automated abstraction engine that converts any clause set to one in the class .

The main application of `h1` is to verify sets of clauses representing cryptographic protocols. It was shown by the author at the CSF'08 conference how `h1mc`, the model-checker of the suite, could be used to produce *Coq proofs of security* in an automated way.

Since then, the journal version lists additional case studies, and makes a thorough analysis of the algorithmic details behind `h1mc`.

The Auditd sensor was implemented as a part of the ORCHIDS intrusion detection system. Auditd catches system events in Linux 2.6 kernels, giving ORCHIDS the ability to detect attacks on such kernels. For instance, ORCHIDS is now able to detect a whole family of violent DoS (denial of service) attacks on Linux 2.6 kernels. ORCHIDS was also integrated into a hypervisor-based platform (Xen 3), which makes it able to run in a protected VM (virtual machine), while its sensors (auditd) run in other VMs and report events to ORCHIDS. This architecture gives ORCHIDS the ability to supervise the whole platform and to detect attacks on other virtual machines. This work was done in collaboration with Bertin Technologies in the setting of the PFC, System@tic project.

`mkP11` is a tool that generates a formal model, in a multiset rewriting logic, of an RSA PKCS#11-compatible key management API. Such APIs are found on smartcards and USB security tokens, for example. Each device is configured slightly differently in terms of possible operations. A tool called APITool, developed at the University of Venice, extracts configuration information from such a device by a pre-defined reverse-engineering process. The `mkP11` tool compiles a formal model based on this information. The model constructed is suitable for the SAT-based security protocol model checker SATMC. If SATMC finds an attack, `mkP11` converts the output back into a form suitable for APITool to execute it directly on the token.

`mkP11` is described in a paper currently under review for an international conference. Commercial entities including a major international bank have expressed interest in purchasing the software, in combination with APITool. An NDA has been signed covering continuation of development in collaboration with the University of Venice.

The intruder deduction problem is to decide if an intruder can compute a certain message T from a certain set of messages M. The static equivalence problem is to decide if an intruder can distinguish between two sequences of messages M_{1} and M_{2}. Messages are modeled as terms and the cryptographic primitives are modeled as function symbols. The properties of the cryptographic primitives are modeled by an equational theory.
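The difference between the two problems can be illustrated with a toy Python sketch (hypothetical frames and a hand-picked test; real procedures such as the one in KISS search for distinguishing tests symbolically, over arbitrary recipes): two frames may make exactly the same messages deducible, yet be told apart by an equality test between recipes, which is what static equivalence captures:

```python
def evaluate(recipe, frame):
    """Evaluate an intruder recipe in a frame.  Recipes are frame
    variables ('x1', ...) or symbolic decryptions ('dec', c, k);
    messages are atoms or ('enc', plaintext, key) terms."""
    if isinstance(recipe, str):
        return frame[recipe]
    if recipe[0] == 'dec':
        c = evaluate(recipe[1], frame)
        k = evaluate(recipe[2], frame)
        if isinstance(c, tuple) and c[0] == 'enc' and c[2] == k:
            return c[1]
        return ('fail',)
    raise ValueError(recipe)

def distinguishes(r1, r2, frame1, frame2):
    """A pair of recipes is a distinguishing test iff the equality
    r1 = r2 holds in one frame but not in the other."""
    return (evaluate(r1, frame1) == evaluate(r2, frame1)) != \
           (evaluate(r1, frame2) == evaluate(r2, frame2))

# In both frames the intruder deduces the same messages (two opaque
# ciphertexts), yet the frames are not statically equivalent: the test
# x1 = x2 succeeds only in the first one.
f1 = {'x1': ('enc', 'yes', 'k'), 'x2': ('enc', 'yes', 'k')}
f2 = {'x1': ('enc', 'yes', 'k'), 'x2': ('enc', 'no', 'k')}
assert distinguishes('x1', 'x2', f1, f2)
```

The decision problem is to determine whether any such distinguishing pair of recipes exists, modulo the equational theory of the primitives.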

KISS (Knowledge in Security Protocols) is a tool that solves the intruder deduction problem and the static equivalence problem for a certain class of convergent equational theories. In particular, KISS is known to terminate in polynomial time for subterm convergent equational theories and for other equational theories useful in e-voting protocols such as blind signatures and trapdoor commitment.

The algorithm implemented in KISS is described in .

Most existing results focus on trace properties like secrecy or authentication. There are however several security properties which cannot be defined (or cannot be naturally defined) as trace properties and require the notion of indistinguishability. Typical examples are anonymity, privacy-related properties, or statements closer to security properties used in cryptography.

As explained above, static equivalence is a cornerstone to provide decision procedures for observational equivalence.

In , Ştefan Ciobâcă, Stéphanie Delaune and Steve Kremer propose another representation of deducible terms to overcome this limitation. The procedure terminates on a wide range of equational theories. In particular, they obtain a new decidability result for the theory of trapdoor bit commitment encountered when studying electronic voting protocols. The algorithm has been implemented in the KISS tool. This result also appears in the informal proceedings of the workshop Secret . A journal version of this work is currently under submission.

In , Stéphanie Delaune, in collaboration with Véronique Cortier (LORIA, France), shows that for a large class of protocols, observational equivalence actually coincides with trace equivalence, a notion simpler to reason with. They then reduce the decidability of trace equivalence to deciding symbolic equivalence, an equivalence relation introduced by M. Baudet . This yields the first decidability result for observational equivalence for a general class of equational theories.

The procedure proposed by Mathieu Baudet in for deciding symbolic equivalence is quite complex and cannot be implemented in its current state. In order to provide tool support to decide observational equivalence, Vincent Cheval, Hubert Comon-Lundh and Stéphanie Delaune are currently designing another procedure that is more amenable to automation. This was the main topic of the internship of Vincent Cheval . This work in progress has been presented at the SecCo workshop .

In , Rohit Chadha, Stéphanie Delaune and Steve Kremer propose an epistemic logic for the applied pi-calculus. This logic allows one to express reachability properties such as secrecy, but also equivalence-based security properties such as anonymity. They also study the relationship between the formalization of privacy in electronic voting in terms of an epistemic formula and the one proposed in in terms of observational equivalence.

To enable formal and automated analysis of security protocols, one has to abstract implementations of cryptographic primitives by terms in a given algebra. However, the algebra cannot be free, as cryptographic primitives have algebraic properties that are either relevant to their specification or can simply be observed in the implementations at hand. These properties are sometimes essential for the execution of the protocol, but they also open the possibility for an attack, as they give an intruder the means to deduce new information from the messages that he intercepts over the network.

In consequence, there has been much work over the last few years towards enriching the Dolev-Yao model, originally based on a free algebra, with algebraic properties, modelled by equational theories. In this context, we have been interested in general decision procedures for the insecurity of protocols that can be applied to classes of equational theories.

Current state-of-the-art tools and techniques have become efficient enough to analyze many protocols. However, these analyses are carried out in isolation, without necessarily taking into account other protocols which are executed in parallel. It is often assumed that participants share a key, abstracting away how this key has been distributed. It is therefore important to obtain composition results which allow protocols to be composed. For instance, such composition results aim at showing that if two protocols are secure individually, then their parallel composition preserves the security guarantees of the protocols, even if some keying material is shared, or if the same password is reused. Another example of composition is to show that if a key exchange protocol is secure, and if a protocol relying on a shared key guarantees a given property, then these protocols can be composed sequentially. This allows one to implement the shared-key assumption by any secure key exchange protocol.

Security APIs allow untrusted code to access sensitive resources in a secure way. The idea is to design an interface between a trusted component, such as a smart card or cryptographic security module, and the untrusted outside world, such that no matter what sequence of commands in the interface is called, and no matter what the parameters, certain 'good' properties will continue to hold, e.g. the secret long-term keys on the smartcard are never revealed. Designing such interfaces is very tricky, and several vulnerabilities in APIs in common use have come to light in recent years.

APIs can be analysed formally in a similar way to protocols, by defining an abstract cryptographic model and exploring reachable states in the model. Recent work in the SECSI team involved designing a formal model for APIs that follow the widely used RSA PKCS#11 standard. In a journal paper , Delaune, Kremer and Steel show security results on various proprietary extensions to the standard, obtained using the NuSMV model checker. In a conference paper with Sibylle Fröschle (University of Oldenburg) , Steel showed how to extend these results to an unbounded model (i.e. arbitrary numbers of fresh cryptographic keys generated by the device). In joint work with Matteo Bortolozzo, Giovanni Marchetto and Riccardo Focardi at the University of Venice, Steel showed that many of the attacks discovered in theoretical models do indeed work on real deployed devices. A conference paper describing this work is currently under review. In joint work with Keighren and Aspinall at the University of Edinburgh, Steel showed how information flow techniques may be adapted to the analysis of key management APIs . In a paper with Véronique Cortier (LORIA), Steel proposed a new key management API with proven security properties .

A major application area for security APIs is the cash machine network, where tamper-resistant hardware security modules protect customer PINs. In joint work with Matteo Centenaro, Riccardo Focardi and Flaminia Luccio (University of Venice), Steel showed how PIN processing APIs can be analysed by information flow techniques. A follow-up paper describes a practical scheme for improving PIN processing security without making wholesale changes to the current infrastructure.

Spurred by discussions with David Lubicz (DGA and U. Rennes I) and Nicolas Guillermin (DGA), we have made some first forays into algorithms that check the security of cryptographic hardware circuits, described in VHDL.

The aim is to help hardware designers check that the circuits they have designed do not leak the contents of specifically marked sensitive locations. We currently check this in a suitable variant of the Dolev-Yao (symbolic) approach. In particular, for now we do not consider computational proofs of security. We have not considered any hardware-specific attacks either (DPA, EMA, fault injection, physical attacks), and our version of VHDL is still a crude approximation of the real thing. However, the approach is simple enough to show good promise.

The current approach is reminiscent of the Goubault-Larrecq and Parrennes 2005 approach for analyzing cryptographic programs written in C, and proceeds by a translation to a decidable class of clauses. Although pointers are not a problem as they are in C, delays and asynchrony must be handled explicitly.

Routing is the process of selecting paths in a network along which to send network traffic. Routing is performed for many kinds of networks, for example it is a central issue in mobile ad hoc networks, where mobile wireless devices have to autonomously organize their infrastructure. Secure routing protocols use cryptographic mechanisms in order to prevent a malicious node from compromising the discovered route.

Mathilde Arnaud, Véronique Cortier and Stéphanie Delaune present a calculus for modeling and reasoning about security protocols, including in particular secured routing protocols. Their calculus extends standard symbolic models to take into account the characteristics of routing protocols and to model wireless communication in a more accurate way. They propose a decision procedure for analyzing routing protocols for a bounded number of sessions and for a fixed network topology.

Zero-knowledge proofs of knowledge (ZK-PoK) play an important role in many cryptographic applications. Direct anonymous attestation (DAA) and the identity mixer anonymous authentication system are among the first real-world applications using ZK-PoK as building blocks. But although they have been in use for many years now, it remains challenging to design and implement sound ZK-PoK. In fact, the security of various protocols found in the literature was flawed. For non-experts in the field it is often hard to design ZK-PoK, since a unified and easy-to-use theoretical framework for ZK-PoK is missing.

Well-structured transition systems (WSTS) are an important class of transition systems with infinitely many states, on which several verification problems remain decidable. Among WSTS one finds Petri nets and several extensions, lossy channel systems, certain abstractions of timed Petri nets, datanets, and certain process algebras.

The fundamental decidability result on WSTS, due to Finkel and Schnoebelen in a TCS paper of 1999, is that coverability is decidable on every WSTS (given a start state s and a goal state t, can we reach a state above t from s?). This works by a simple, set-theoretic algorithm working its way *backwards* from the goal state.
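For the special case of Petri nets, this backward algorithm is easy to implement: an upward-closed set of markings is represented by its finitely many minimal elements (which exist by Dickson's lemma), and predecessor bases are computed transition by transition. The following is our own minimal illustration of the idea, not code from the cited work.

```python
# Backward coverability for Petri nets (minimal illustrative sketch).
# A transition is a pair (pre, post) of tuples of token counts; an
# upward-closed set of markings is represented by its minimal elements.

def leq(m, n):
    return all(a <= b for a, b in zip(m, n))

def pred_basis(m, pre, post):
    # Minimal marking from which firing (pre, post) yields a marking above m:
    # it must enable the transition and cover m after firing.
    return tuple(max(p, c + p - q) for c, p, q in zip(m, pre, post))

def coverable(start, goal, transitions):
    basis = {tuple(goal)}             # minimal elements of the current set
    while True:
        new = {pred_basis(m, pre, post)
               for m in basis for pre, post in transitions}
        new = {m for m in new if not any(leq(b, m) for b in basis)}
        if not new:                   # fixpoint reached (Dickson's lemma)
            break
        basis |= new
        basis = {m for m in basis     # keep only minimal elements
                 if not any(leq(n, m) and n != m for n in basis)}
    return any(leq(m, start) for m in basis)

t = ((1, 0), (0, 2))                  # consume one token in p0, add two in p1
print(coverable((1, 0), (0, 2), [t]))   # True: fire t once
print(coverable((1, 0), (0, 3), [t]))   # False: p0 holds a single token
```

The set-theoretic algorithm of Finkel and Schnoebelen is exactly this loop, stated abstractly for any WSTS with a computable predecessor basis.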

However, some other questions, such as whether a given state is *bounded* (are there finitely many reachable states from it?), or *liveness*, cannot be handled this way. In the case of Petri nets, such questions can be solved by the Karp-Miller algorithm, which works its way *forwards* from the start state s.
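For Petri nets, the forward Karp-Miller construction can be sketched in a few lines: whenever a new marking strictly dominates an ancestor on the same branch, the strictly larger components are accelerated to ω, standing for "unboundedly many tokens", and the net is bounded iff no ω appears in the finished tree. Again, this is our own illustrative sketch, under the usual textbook presentation of the algorithm.

```python
# Karp-Miller coverability tree for Petri nets (illustrative sketch).
# Markings are tuples over the naturals extended with OMEGA.

OMEGA = float('inf')

def fire(m, pre, post):
    if all(a >= b for a, b in zip(m, pre)):
        return tuple(a - b + c for a, b, c in zip(m, pre, post))
    return None                       # transition not enabled

def karp_miller(start, transitions):
    labels = set()
    stack = [(tuple(start), ())]      # (marking, ancestors on this branch)
    while stack:
        m, path = stack.pop()
        # accelerate: a strictly dominated ancestor pumps places to OMEGA
        changed = True
        while changed:
            changed = False
            for a in path:
                if a != m and all(x <= y for x, y in zip(a, m)):
                    m2 = tuple(OMEGA if y > x else y for x, y in zip(a, m))
                    if m2 != m:
                        m, changed = m2, True
        if m in path:                 # repeated marking: prune this branch
            continue
        labels.add(m)
        for pre, post in transitions:
            succ = fire(m, pre, post)
            if succ is not None:
                stack.append((succ, path + (m,)))
    return labels

t1 = ((1, 0), (0, 2))                 # one token in p0 -> two tokens in p1
t2 = ((0, 1), (1, 1))                 # one token in p1 -> p0, keeping p1
print(any(OMEGA in m for m in karp_miller((1, 0), [t1])))       # False: bounded
print(any(OMEGA in m for m in karp_miller((1, 0), [t1, t2])))   # True: unbounded
```

The acceleration step is exactly what fails to terminate on other WSTS such as lossy channel systems, which motivates the completion and clover constructions described below.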

Until now, all attempts to generalize the Karp-Miller algorithm to WSTS other than Petri nets have either failed or produced ad hoc algorithms, specific to a given kind of WSTS. Moreover, for lossy channel systems, for example, no forward algorithm can actually terminate, contrary to the Karp-Miller algorithm. All this contributes to a lack of understanding of forward verification algorithms for WSTS in the verification community.

With Alain Finkel (LSV, ENS Cachan), we have proposed the first true generalization of the Karp-Miller algorithm to general WSTS. This rests, first, on a suitable definition of a *completion* of the state space, generalizing the extra ω components used in the Karp-Miller algorithm; second, on the design of a very short procedure that computes the so-called *clover* of a state in a complete WSTS, which is a finite representation of the set of states below any reachable state. The clover procedure specializes to a form of the Karp-Miller algorithm on Petri nets, but is conceptually much simpler, and works on any complete WSTS, in particular on those arising by completion from an ω²-WSTS. (All WSTS arising in practice are ω²-WSTS.)

Moreover, we characterize exactly the cases where the clover procedure terminates: these are the cases where the complete WSTS is clover-flattable, i.e., is the projection of a *flat* transition system, that is, one whose control is ensured by a finite automaton with no nested loop. It follows that the completion of every Petri net, for example, is clover-flattable.

These results rest on Jean Goubault-Larrecq's discovery of the properties of Noetherian spaces (LICS 2007), which arose as a by-product from his study of semantic models mixing non-determinism and probabilities.

We have developed a categorical model of Girard's geometry of interaction that generalizes the Girard-Danos-Regnier algebra of weights, in the guise of the so-called Danos-Regnier category of a linear inverse monoid M. The aim is to turn this into a categorical model of linear logic.

It was known that this could not be done by adding any equation to the usual presentations of the geometry of interaction. We have proved that this could not be achieved even by changing the underlying linear inverse monoid M altogether, e.g., by changing the existing generators and relations.

However, we have shown that the Danos-Regnier category was a categorical model of classical multiplicative linear logic, under mild conditions on M, and that coherence completions à la Hu-Joyal could be used to build categorical models of full (classical) linear logic from just models of (classical) multiplicative linear logic.

Thus we obtained the first categorical models of full classical linear logic based on the geometry of interaction.

We proved that there was a deep duality between angelic and demonic non-determinism, in various semantic models of non-determinism alone, of probabilistic choice, and of mixed non-deterministic and probabilistic choice . This rests on and extends the so-called de Groot duality on stably compact spaces, and was first explained in an invited talk at the international Domains IX workshop, entitled “A Tale of Two Dualities” (U. Sussex, Brighton, UK, September 24, 2008).

The AVOTÉ project (
http://

Electronic voting promises a convenient, efficient and secure facility for recording and tallying votes. However, the convenience of electronic elections comes with a risk of large-scale fraud, and their security has been seriously questioned. In this project we propose to use formal methods to analyze electronic voting protocols. More precisely, we structure the project around four work-packages.

Formalizing protocols and security properties. Electronic voting protocols have to satisfy a variety of security properties that are specific to electronic elections, such as eligibility, verifiability and different kinds of anonymity properties. In the literature these properties are generally stated intuitively and in natural language. Such informal definitions are at the origin of many security flaws. As a first step, the participants therefore propose to formalize the different security properties in a well-established language for protocol analysis.

Automated techniques for formal analysis. The participants propose to design algorithms to perform abstract analysis of a voting system against formally stated security properties. From preliminary work it has already become clear that privacy-preserving properties can be expressed as equivalences. Therefore, we will pay particular attention to automated techniques for deciding equivalences, such as static and observational equivalence in cryptographic pi-calculi. Static equivalence relies on an underlying equational theory axiomatizing the properties of the cryptographic functions (encryption, exclusive or, ...). Results exist for several interesting equational theories such as exclusive or, blind signatures and other associative and commutative functions, but many equational theories useful for electronic voting are still lacking. The participants will also investigate a more modular approach based on combination results. More importantly, the participants will develop algorithms for deciding observational equivalence: in particular, symbolic decision procedures for a bounded number of sessions, with an emphasis on equational theories relevant to electronic voting. These algorithms will be implemented in prototypes which are to be included in the AVISPA platform.

Computational aspects. There are two competing approaches to the verification of cryptographic protocols: the formal (also called Dolev-Yao) model and the complexity-theoretic model, also called the computational model, where the adversary can be any probabilistic polynomial-time algorithm. While the complexity-theoretic framework is more realistic and gives stronger security guarantees, the symbolic framework allows for a higher level of automation. Because of this, much effort has been spent in recent years on relating both frameworks, with the goal of getting the best of both worlds: see the ARA Formacrypt section. The participants plan to continue this effort and investigate soundness results for cryptographic primitives related to electronic voting. Moreover, most of the existing results only hold for trace properties, which do not cover most properties of electronic elections. The participants of AVOTÉ plan to establish soundness results for these properties.

Case studies. The members of AVOTÉ will validate all of the results on several case studies from the literature, notably a real-life case study on an electronic voting protocol designed at the Université Catholique de Louvain. This protocol was trialled during the election of the university president in 2009. However, even though the fundamental security requirements appear to be met, no formal analysis of this protocol has been performed.

The Formacrypt project (
http://

Most efforts in cryptographic protocol verification use either the computational approach, in which messages are bitstrings, or the formal approach, in which messages are terms. The computational approach is more realistic but more difficult to automate. The goal of the Formacrypt project is to bridge the gap between these two approaches.

Several works have already begun linking these approaches, but they all have limitations. They generally put too strong security requirements on cryptographic primitives, and they do not allow one to compute the probability of an attack explicitly. The Formacrypt project offers three approaches to overcome these limitations.

In the direct approach, the goal is to design and implement a computationally sound, automated protocol prover. This prover, called CryptoVerif, builds computational proofs presented as sequences of so-called games: the first game corresponds to the real protocol, the next games are obtained by transformations so that the difference of probability between consecutive games is negligible, and the probability of success of an attack in the last game is obvious. The probability of success of an attack in the initial game can then be bounded.

The purpose of the intermediate approach is to design a computationally sound logic, by adapting and extending an existing modal logic (the Protocol Composition Logic), originally sound in the formal model. The definition of a new semantics for this logic and the addition of new predicates, specific to the computational model, were necessary.

In the modular approach, which was specifically explored by SECSI, the idea is to extend theorems that prove the computational soundness of formal proofs of protocols. This allows one to reuse existing tools. These extensions concern both security properties (fairness, secrecy of keys, etc.) and cryptographic primitives (symmetric encryption, hash functions, etc.). Additionally, weaker security properties are considered, for public-key encryption (resistance to chosen plaintext attacks) and for signatures (for electronic voting, for instance). This also involved studying the computational soundness of formal models based on equational theories, which represent more precisely the properties of cryptographic primitives. Finally, the computational soundness of formal models for guessing attacks (for weak secrets, such as passwords) will be investigated, too.

The REDPILL project is a DIGITEO project, started in September 2009. The partners are SECSI and Bertin Technologies. The goal of the project is the detection of malware on virtualized platforms.

The PFC project (for: “PlateForme de Confiance”) is one of the projects of the System@tic Paris Region French cluster in complex systems design and management, see
http://

The goal of the project is the design and validation of secure and safe embedded applications, particularly aimed at upper administration, police and customs forces. Within this project, SECSI collaborates in particular with Bertin Technologies on effective intrusion prevention in hypervisor-based computer systems using ORCHIDS. Hedi Benzina joined the project in November 2008 as a temporary engineer.

Hedi Benzina has started a PhD thesis in October 2009, under the direction of Jean Goubault-Larrecq, and is funded by the Digiteo DIM project “RedPill: Malware Detection on Virtualized Architectures”, 2009-2012.

Jean Goubault-Larrecq made a critical evaluation of the Spidware security solution, based on Jeremy Briffaut's PIGA interposition tool, on behalf of Advitech Partners. Spidware is a startup company founded by researchers at ENSI Bourges and LIFO. Jean Goubault-Larrecq wrote a detailed, confidential report on the technical strengths and weaknesses of this product.

Jean Goubault-Larrecq is scientific coordinator of the ANR programme blanc project CPP (confiance, preuves, probabilités, 2009-2012). See the Wiki
http://

From the standpoint of SECSI, this project leverages the results obtained during the ARC ProNoBiS (2006-2007) and before on semantic models of mixed non-deterministic and probabilistic choice, and applies them to the design of static analyzers for floating-point programs, specifically airplane engine controllers. (The need comes from Dassault Aviation, and Hispano-Suiza plane engines—now Safran. They are both associated partners to the project.)

The whole project revolves around the automated evaluation of uncertainty, whether probabilistic or non-deterministic. This uncertainty arises because static analyzers must inherently work on approximate values, but also because the environmental values (pressure, temperature, speed) are known only up to some precision, or fluctuate around some central value; and finally because of round-off errors in floating-point computations.

This project is a focused collaborative project, supported by CNRS and the Japan Science and Technology Agency. The main goals are similar to those of the Formacrypt project described above: the aim is to produce security proofs at a symbolic level, while deriving precise computational assumptions under which the proofs can be transferred to the computational level.

The idea is to bring together, in this focused research area, both cryptographers and specialists of formal methods, and both Japanese and French researchers. The activities include an annual meeting (the first one was organized in Japan, in April 2009) and visits on both sides. Hubert Comon-Lundh has been visiting the Research Center for Information Security for two years (partly supported by INRIA). Other visits from the French side include those of S. Kremer and S. Bursuc, for instance.

On the results side, there is a joint paper (by H. Comon-Lundh, Y. Kawamoto and H. Sakurada) that appeared in the JSIAM Letters (May 2009). This paper is about anonymity proofs for ring signatures in an unbounded network. In this work, H. Comon-Lundh brought an expertise in formal methods and concurrency, and the Japanese side an expertise in cryptographic primitives related to digital signatures.

This is typically the goal of the project: produce such collaborative results coming from two countries and two different research communities.

Hubert Comon-Lundh organized (with Y. Ito) the French-Japanese workshop on computational and symbolic proofs of security (CosyProofs 09) in Atagawa Heights, April 2009
http://

Stéphanie Delaune was a member of the “comité de sélection” for a “Maître de Conférences” position at the University of Lille.

Jean Goubault-Larrecq was a member of the AERES evaluation committee of LIENS, ENS Paris, January 12. He participated in the mid-term evaluation of the ANR SeSur 2007 programme in September, as vice-president of the evaluation committee. He participated in the continued evaluation of the SEC&SI competition (“Système d'Exploitation Cloisonné et Sécurisé pour l'Internaute”, ARPEGE programme; 2008-2010).

Jean Goubault-Larrecq was a member of the jury of the Gilles Kahn PhD thesis prize, awarded by the SPECIF association and under the patronage of the French Academy of Sciences.

Steve Kremer was a member of the “comité de sélection” for a “Maître de Conférences” position at Ensimag/Verimag, associated to a CEA chair.

Steve Kremer co-organized the 7th International Workshop on Security Issues in Concurrency (SecCo'09), Bologna, Italy, co-located with CONCUR'09.

Graham Steel co-organised the 3rd International Workshop on Analysis of Security APIs (ASA-3), a satellite of CSF 2009 in Long Island, NY, USA, July 2009.

He was interviewed for Wired magazine,
http://

Hedi Benzina taught part of the TPs (lab sessions) of the course “Projet programmation réseau” for the MPRI (Master Parisien de Recherche en Informatique), master level 1. Total amount: 21h.

Sergiu Bursuc held exercise sessions for MPRI (Master Parisien de Recherche en Informatique) master level 1 courses of Advanced complexity (6h) and Tree automata techniques and applications (6h).

Ştefan Ciobâcă held the TPs (programming project) of the course Programmation 1.2 (ENS Cachan, first year=level L3) and part of the TDs (exercise sessions) for the course Algorithmique Avancée (ENS Cachan, first year=level L3) during the academic year 2008/2009. He is holding the TPs (programming project) of the course Programmation 1.2 (ENS Cachan, first year=level L3) and the TDs (exercise sessions) for the course Tree Automata and Applications (MPRI, level M1) during the academic year 2009/2010.

Hubert Comon-Lundh is teaching the logic course at the Bachelor level (L3) in ENS Cachan, the course on security protocols at the master level (M2, MPRI) and the logic course at the master level (M1) for the “agrégation de mathématiques”.

Jean Goubault-Larrecq gave the following courses: advanced complexity (ENS Cachan and ENS Paris, second year=level M1, 39h eq. TD), logic and computer science (i.e., lambda-calculus; ENS Cachan and ENS Paris, first year=level L3, 39h. eq. TD), automated deduction (MPRI, level M2, 18h eq. TD), complexity and logic (ENS Cachan, first year=level L3, 22h eq. TD), programming (ENS Cachan, first year=level L3, 36h eq. TD). He also participated to rehearsals of lessons of “agrégation”, ENS Cachan, 3rd year, 27h. eq. TD.

Steve Kremer gave part of a course on formal verification of security protocols in the course “Méthodes de vérification de sécurité” (verification methods for security) at the “Master Sécurité des Systèmes Informatiques”, second year, University Paris XII. Total amount: 9h (TD eq.).

Hubert Comon-Lundh supervised Sergiu Bursuc, a 3rd year PhD student working on the verification of security protocols. Since August 2007, S. Bursuc has been co-supervised by S. Delaune.

Hubert Comon-Lundh and Stéphanie Delaune co-supervised Vincent Cheval, a master's student working on the verification of equivalence-based security properties. He started a PhD in Fall 2009.

Stéphanie Delaune and Jean Goubault-Larrecq supervised Mathilde Arnaud (co-advisor Véronique Cortier, LORIA) who started her PhD in Fall 2008 on verification of ad-hoc routing security protocols.

Stéphanie Delaune and Graham Steel co-supervised Morten Dahl (a 5-month intern from the University of Aalborg) on the project 'Analysing Privacy Properties of VANET Protocols'.

Steve Kremer and Ralf Treinen supervised Antoine Mercier who started his PhD in Fall 2006 on the automatic verification of group protocols and defended his PhD in Dec. 2009.

Steve Kremer and Jean Goubault-Larrecq supervised Ştefan Ciobâcă (co-advisor Véronique Cortier, LORIA) who started his PhD in Fall 2008 on the automatic verification of equivalence properties.

Graham Steel co-supervised Gavin Keighren (PhD student, Edinburgh), provisional thesis title: Information Flow techniques for API Analysis. Submission expected October 2010.

Hubert Comon-Lundh participated in the following PhD/habilitation thesis committees

Olivier Gauwin, Lille, Sept. 2009 (president of the committee)

Véronique Cortier, Nancy, Nov. 2009, habilitation thesis

Thomas Genet, Rennes, Nov. 2009, habilitation thesis

Sergiu Bursuc, Cachan, Dec. 2009

Antoine Mercier, Cachan, Dec. 2009

Stéphanie Delaune participated in the jury of Sergiu Bursuc, as one of his thesis advisors.

Jean Goubault-Larrecq was examiner of Romain Beauxis' PhD thesis (LIX, May 4). He was rapporteur of Marc de Falco's PhD thesis (Institut de Mathématiques, Luminy, May 28) and of Jean-Baptiste Voron's PhD thesis (Paris 6, December 9). He was examiner at Véronique Cortier's habilitation defense (HDR, Nancy, November 18).

Steve Kremer participated in the jury of Antoine Mercier, as his thesis advisor.

Hubert Comon-Lundh participated in the following program committees of international conferences:

Conf. on Implementation and Application of Automata (CIAA), Sydney, 2009

Foundations of Software Technology and Theoretical Computer Science (FSTTCS), Kanpur, 2009.

Asian Computing Science Conference (ASIAN), Seoul, 2009

Foundations of Software Science and Computational Structures (FOSSACS), 2010

ACM ASIA Computer and Communications Security (ASIACCS), 2010

Hubert Comon-Lundh participated in the program committees of the following workshops:

French-Japanese workshop on computational security (chairman), Atagawa, April 2009

Workshop on Security and Rewriting (co-chair), Port Jefferson, July 2009

Workshop on Formal and Computational Cryptography (FCC), Port Jefferson, July 2009

Stéphanie Delaune was a member of the program committee of the 4th International Workshop on Security and Rewriting Techniques (SecReT 2009), the 7th International Workshop on Security Issues in Concurrency (SecCo 2009), and the Workshop on Foundations of Computer Security (FCS 2009).

Jean Goubault-Larrecq participated in the program committee of RTA 2009, and will be co-editor (with Ralf Treinen) of a special issue of the journal LMCS on selected papers from the conference.

Steve Kremer was program co-chair (with Michele Boreale) of the 7th International Workshop on Security Issues in Concurrency (SecCo'09), and member of the program committees for the 4th Benelux Workshop on Information and System Security (WISSEC'09), the 13th Annual Asian Computing Science Conference (Asian'09) and the Second international conference on E-voting and Identity (VOTE-ID'09).

Joe-Kai Tsay was a program committee member of the 12th Information Security Conference (ISC 2009).