Section: New Results

Security protocols

Analysis of Equivalence Properties

Participants: Vincent Cheval, Véronique Cortier, Ivan Gazeau, Steve Kremer, Itsaka Rakotonirina, Christophe Ringeissen.

Automatic tools based on symbolic models have been successful in analyzing security protocols. These tools are particularly well adapted for trace properties (e.g. secrecy or authentication). A wide range of security properties, however, such as anonymity in electronic voting and auctions, or unlinkability in RFID and mobile phone protocols, are naturally expressed in terms of indistinguishability, which is not a trace property. Indistinguishability is naturally formalized as an observational or trace equivalence in cryptographic process calculi, such as the applied pi calculus. While several decision procedures have already been proposed for verifying equivalence properties, the resulting tools are often rather limited and lack efficiency.

Our results center on the development of several complementary tools for verifying equivalence properties. These tools complement one another in terms of expressivity, precision and efficiency.

  • The Akiss tool provides good expressivity: it supports a large number of cryptographic primitives (including XOR, extremely popular in low-energy devices such as RFID tags) and protocols with else branches. It verifies equivalence for a bounded number of protocol sessions. The tool is precise for a class of determinate processes, and can approximate equivalence for other protocols. It however suffers from efficiency problems when the number of sessions increases, even though the computation can be partially distributed over several cores. To overcome these efficiency problems, Gazeau and Kremer have completely revisited the theory underlying Akiss: rather than enumerating the possible traces, the new version reasons directly about partially ordered traces. A new implementation is in progress and the first results are extremely promising.

  • The DEEPSEC tool is a recent tool that allows for user-defined cryptographic primitives that can be modelled as a subterm convergent rewrite system (slightly more restrictive than Akiss), and supports the whole applied pi calculus, except that the number of sessions must be bounded. It is precise, in that it decides equivalence without any approximation, and efficient (slightly less so than SAT-Equiv) for the class of determinate processes, where partial-order reductions apply. To improve efficiency for non-determinate processes, Cheval, Kremer and Rakotonirina [21] developed new optimisation techniques, based on a new, stronger equivalence for which partial-order reductions are sound even for non-determinate processes, as well as on new symmetry reductions. They demonstrate that these techniques provide a significant (several orders of magnitude) speed-up in practice, thus increasing the size of the protocols that can be analysed fully automatically. Even though the new equivalence is stronger, it is coarse enough to avoid false attacks on most practical examples.

  • The SAT-Equiv tool relies on a “small-attack property”: if there is an attack against trace equivalence, then there is a well-typed attack, that is, an attack where the messages follow some a priori given structure. This dramatically reduces the search space. We have recently extended [11] this approach to a class of equational theories that encompasses all standard cryptographic primitives (including e.g. randomized encryption) as well as theories that receive less attention from automatic tools, such as threshold decryption. This result will allow us to further extend SAT-Equiv, but it can also be used more generally to characterize the form of an attack, independently of the tool considered.
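The trace-enumeration blowup that Akiss's new partially ordered representation avoids can be illustrated with a small combinatorial computation (a generic Python sketch, not code from the tool; the function name `interleavings` is ours):

```python
from math import comb, factorial

# Number of interleavings (traces) of k parallel, independent actions is k!;
# a partial-order representation keeps a single object instead.
for k in (2, 4, 8):
    print(k, factorial(k))           # 2 2, 4 24, 8 40320

# Interleavings of independent sessions of given lengths: product of binomials.
# Three sessions of 5 actions each already yield 756756 distinct traces.
def interleavings(*lengths):
    total, result = 0, 1
    for n in lengths:
        total += n
        result *= comb(total, n)
    return result

print(interleavings(5, 5, 5))        # 756756
```

Adding a single session multiplies the trace count by another binomial factor, which is why per-trace enumeration quickly becomes the bottleneck as sessions increase.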
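The restriction DEEPSEC places on primitives can be made concrete. The sketch below (our own hypothetical term encoding, not DEEPSEC's input format) checks the defining syntactic condition on a rule of a subterm convergent rewrite system, namely that the right-hand side is a strict subterm of the left-hand side or a ground term; convergence of the whole system must still be established separately:

```python
# Terms: a variable is a string like "x"; a function application is a tuple
# ("f", arg1, ...). This encoding is purely illustrative.

def is_var(t):
    return isinstance(t, str)

def subterms(t):
    yield t
    if not is_var(t):
        for arg in t[1:]:
            yield from subterms(arg)

def is_ground(t):
    return all(not is_var(s) for s in subterms(t))

def is_subterm_rule(lhs, rhs):
    """Syntactic condition for a rule l -> r of a subterm convergent system:
    r is a strict subterm of l, or a ground term."""
    return rhs in list(subterms(lhs))[1:] or is_ground(rhs)

# dec(enc(x, k), k) -> x : symmetric decryption
print(is_subterm_rule(("dec", ("enc", "x", "k"), "k"), "x"))                 # True
# check(sign(x, k), pk(k)) -> ok : ground right-hand side
print(is_subterm_rule(("check", ("sign", "x", "k"), ("pk", "k")), ("ok",)))  # True
```

Standard destructor rules such as decryption, projection, and signature verification all fit this shape, which is what makes the class expressive enough for many real protocols.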
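The “well-typed attack” idea behind SAT-Equiv can be sketched as a shape check that prunes attacker messages not following the a priori given structure. The encoding below (tagged atoms, constructor tuples) is our own, purely for illustration of the pruning principle, not SAT-Equiv's internals:

```python
# A "type" is the a-priori shape messages must follow; atoms carry a type tag.

def well_typed(msg, ty):
    if isinstance(ty, str):                       # atomic type: nonce, key, ...
        return isinstance(msg, tuple) and len(msg) == 2 and msg[0] == ty
    # constructor type: head must match, arguments recursively well-typed
    return (isinstance(msg, tuple) and len(msg) == len(ty)
            and msg[0] == ty[0]
            and all(well_typed(m, t) for m, t in zip(msg[1:], ty[1:])))

ty = ("enc", ("pair", "nonce", "agent"), "key")
ok = ("enc", ("pair", ("nonce", "n1"), ("agent", "a")), ("key", "kab"))
bad = ("enc", ("nonce", "n1"), ("key", "kab"))
print(well_typed(ok, ty), well_typed(bad, ty))    # True False
```

The small-attack property guarantees that discarding ill-typed candidates such as `bad` loses no attacks, which is what makes the restricted search space sound.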

From a more foundational point of view, in collaboration with Erbatur (LMU, Germany) and Marshall (Univ. of Mary Washington, USA), Ringeissen studies decision procedures for the intruder deduction and static equivalence problems in combinations of subterm convergent rewrite systems and syntactic theories for which a mutation principle can be applied to simplify equational proofs. Continuing work initially presented at UNIF'18, they have shown that a matching property suffices to solve both intruder deduction and static equivalence. This matching property is satisfied when using a matching algorithm known for syntactic theories [29]. A journal paper reporting this result is currently under review.

Decision Procedures for Equational Theories

Participants: Christophe Ringeissen, Michaël Rusinowitch.

Equational theories and unification procedures are widely used in protocol analyzers to model the capabilities of a (passive) intruder. In the context of protocol analysis, many equational theories of practical interest satisfy the finite variant property. This class of theories is indeed a class of syntactic theories admitting a terminating mutation-based unification algorithm, which generalizes the syntactic unification algorithm known for the empty theory. In collaboration with Erbatur (LMU, Germany) and Marshall (Univ. of Mary Washington, USA), Ringeissen has applied this unification algorithm to obtain new non-disjoint combination results for the unification problem [23], [32].
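For reference, the syntactic unification algorithm for the empty theory, which the mutation-based algorithm generalizes, can be sketched in a few lines (our own term encoding: strings for variables, tuples for function applications; real implementations use more efficient representations):

```python
# Syntactic (empty-theory) unification with occurs check.

def is_var(t):
    return isinstance(t, str)

def walk(t, subst):
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    t = walk(t, subst)
    if t == v:
        return True
    return not is_var(t) and any(occurs(v, a, subst) for a in t[1:])

def unify(s, t, subst=None):
    """Return a most general unifier as a dict, or None if none exists."""
    subst = dict(subst or {})
    s, t = walk(s, subst), walk(t, subst)
    if s == t:
        return subst
    if is_var(s):
        return None if occurs(s, t, subst) else {**subst, s: t}
    if is_var(t):
        return unify(t, s, subst)
    if s[0] != t[0] or len(s) != len(t):
        return None                                # symbol clash
    for a, b in zip(s[1:], t[1:]):
        subst = unify(a, b, subst)
        if subst is None:
            return None
    return subst

# enc(x, k) =? enc(n, y)  gives  {x -> n, y -> k}  (n, k constants)
print(unify(("enc", "x", ("k",)), ("enc", ("n",), "y")))
```

Mutation-based procedures for syntactic theories replace the decomposition step (the symbol-clash case above) with theory-specific mutation rules, while keeping the overall shape of the algorithm.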

In collaboration with Anantharaman (LIFO, Orléans), Hibbs (SUNY Albany & Google, USA), and Narendran (SUNY Albany, USA), Rusinowitch has studied the unification problem in list theories. Decision procedures for various list theories have been investigated in the literature, with applications to automated verification. In [17], it has been shown that the unifiability problem for some list theories with a reverse operator is NP-complete, and a unifiability algorithm is given for the case where these theories are extended with a length operator on lists.

Among theories with the finite variant property, the class of theories presented by subterm convergent rewrite systems is particularly remarkable because it additionally satisfies a locality property. For this class of theories, it is thus possible to obtain a satisfiability procedure based on a reduction to the empty theory, via instantiation with the finitely many terms occurring in the input problem. As an alternative to locality, Ringeissen has investigated a politeness property, in collaboration with Chocron (Insikt Intelligence, Spain) and Fontaine (Veridis project-team). This approach has led to new non-disjoint combination results for the satisfiability problem modulo data-structure theories extended with bridging functions such as the length operator on lists [10], [26].

Recast of ProVerif

Participants: Vincent Cheval, Véronique Cortier.

Motivated by the addition of global states to ProVerif, we have started a major revision of this popular tool. The revision goes well beyond global states and is conducted in collaboration with Bruno Blanchet, the original and main developer of ProVerif. One of the first main changes is the addition to ProVerif of the notions of “lemma” and “axiom”, which can be used either to encode additional properties (axioms) or to help ProVerif prove the desired properties (lemmas). It is now possible to specify lemmas that significantly reduce the number of clauses considered in ProVerif's saturation procedure. These lemmas must of course themselves be proved by ProVerif, possibly by induction, thanks to careful handling of the order of literals in the saturation procedure. The new approach provides more flexibility in cases where ProVerif previously did not terminate or yielded false attacks (e.g. in the presence of global states).

Moreover, even when ProVerif is able to prove security, the tool suffers from efficiency issues when applied to complex industrial protocols (up to one month of running time for the analysis of the NoiseExplorer protocol). One reason is the subsumption procedure: a clause is not added if it is subsumed by another one (that is, if a more general clause already exists). This is crucial to avoid non-termination. We have started a major rewrite of the subsumption procedure, taking advantage of recent progress in the automated deduction area. Another reason is the translation of processes into Horn clauses: for each conditional in the process, ProVerif generates a Horn clause for each possible outcome of this conditional. On complex protocols with many interleaved conditionals, ProVerif thus faces an exponential blowup in the number of generated clauses. We have improved the generation of Horn clauses by avoiding the exploration of branches that would directly be subsumed by other conditional branches. The first experimental results show significant speed-ups on many examples: on average, ProVerif is now 5 to 10 times faster than its current release, with some examples showing a 50- to 200-fold speed-up.
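The subsumption test underlying this procedure can be sketched as follows: a clause C subsumes a clause D if some substitution maps every literal of C onto a literal of D. The naive backtracking version below is only illustrative (our own encoding: clauses as lists of literal tuples, variables as strings); ProVerif's new procedure relies on far more refined data structures from automated deduction:

```python
def is_var(t):
    return isinstance(t, str)

def match(pattern, term, subst):
    """One-way matching: extend subst so that pattern*subst == term, or None."""
    if is_var(pattern):
        if pattern in subst:
            return subst if subst[pattern] == term else None
        return {**subst, pattern: term}
    if is_var(term) or pattern[0] != term[0] or len(pattern) != len(term):
        return None
    for p, t in zip(pattern[1:], term[1:]):
        subst = match(p, t, subst)
        if subst is None:
            return None
    return subst

def subsumes(c, d, subst=None):
    """C subsumes D if one substitution maps every literal of C onto D."""
    subst = subst or {}
    if not c:
        return True
    first, rest = c[0], c[1:]
    return any(s is not None and subsumes(rest, d, s)
               for s in (match(first, lit, subst) for lit in d))

# att(x) subsumes att(enc(a, k)): the more general clause makes it redundant.
print(subsumes([("att", "x")], [("att", ("enc", ("a",), ("k",)))]))   # True
```

Since every candidate clause is tested against the whole clause set, even modest constant-factor improvements to this test translate into large end-to-end speed-ups during saturation.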

Verification of Protocols with Global States

Participants: Jannik Dreier, Lucca Hirschi.

The TAMARIN prover is a state-of-the-art verification tool for cryptographic protocols in the symbolic model. Dreier, in collaboration with Hirschi, Sasse (ETH Zurich), and Radomirovic (Dundee), improved the underlying theory and the tool to handle an equational theory modeling XOR operations. Exclusive-or (XOR) operations are common in cryptographic protocols, in particular in RFID and electronic payment protocols. Despite these numerous applications, tool support for the verification of protocols using XOR has been limited, due to the inherent complexity of faithful models of XOR. This extension makes TAMARIN the first tool to simultaneously support this large set of equational theories, protocols with global mutable state, an unbounded number of sessions, and complex security properties including observational equivalence. They demonstrated the effectiveness of the approach by analyzing several protocols that rely on XOR, in particular multiple RFID protocols, where they could both identify attacks and provide proofs. These results were presented at CSF'18; an extended version was accepted by the Journal of Computer Security [12].
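Much of what makes XOR hard for tools is reasoning modulo its equations: associativity, commutativity, the unit 0, and nilpotence (x ⊕ x = 0). For ground terms the induced normalization is simple, as the sketch below shows (our own encoding; TAMARIN of course handles terms with variables, which is where the real complexity lies):

```python
from collections import Counter

# Normalize a ⊕-combination of ground atoms modulo AC, x ⊕ x = 0, x ⊕ 0 = x:
# keep exactly the atoms that occur an odd number of times.
def xor_normalize(atoms):
    counts = Counter(atoms)
    kept = sorted(a for a, n in counts.items() if n % 2 == 1)
    return kept or ["zero"]

# k ⊕ n ⊕ k reduces to n: the self-cancellation at the core of many
# RFID challenge-response protocols.
print(xor_normalize(["k", "n", "k"]))        # ['n']
print(xor_normalize(["a", "a", "b", "b"]))   # ['zero']
```

The self-cancellation shown in the first example is exactly what lets an attacker strip a key from an XOR-blinded value when a protocol reuses it, which is the pattern behind several of the attacks found.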

Symbolic Methods in Computational Cryptography Proofs

Participants: Charlie Jacomme, Steve Kremer.

Code-based game-playing is a popular methodology for proving the security of cryptographic constructions and side-channel countermeasures. This methodology treats cryptographic proofs as an instance of relational program verification (between probabilistic programs), and decomposes the latter into a series of elementary relational program verification steps. Barthe (MPI on Security and Privacy, Bochum), Grégoire (Inria SAM), Jacomme, Kremer and Strub (LIX, École Polytechnique) developed principled methods for proving such elementary steps for probabilistic programs that operate over finite fields and related algebraic structures. They focus on three essential properties: program equivalence, information flow, and uniformity. They give characterizations of these properties based on deducibility and other notions from symbolic cryptography, and use, and sometimes improve, tools from symbolic cryptography to obtain decision procedures or sound proof methods for all three properties. Finally, they evaluate their approach using examples drawn from provable security and from side-channel analysis; for the latter, they focus on the masking countermeasure against differential power analysis. A partial implementation of this approach is integrated in EasyCrypt, a proof assistant for provable security, and in MaskVerif, a fully automated prover for masked implementations. This work was presented at CSF [18].
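Uniformity, one of the three properties studied, can be illustrated on the one-time pad: for a uniform key k, the output x ⊕ k is uniformly distributed whatever the message x. The brute-force check below only illustrates the property on a toy domain; the point of the symbolic methods above is precisely to decide such properties without enumeration:

```python
from collections import Counter

# One-time pad over 3-bit strings: check that for every fixed message x,
# the map k -> x ^ k hits each of the 8 outputs exactly once when the
# key k ranges uniformly over GF(2)^3.
N = 8  # 2**3 possible bitstrings

def output_distribution(x):
    return Counter(x ^ k for k in range(N))

uniform = all(output_distribution(x) == Counter(range(N)) for x in range(N))
print(uniform)   # True
```

Each map k ↦ x ⊕ k is a bijection, so a uniform key yields a uniform output; the symbolic characterizations capture this kind of argument algebraically rather than by exhaustive counting.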

Analysis of Deployed Protocols

Participants: Sergiu Bursuc, Lucca Hirschi, Steve Kremer.

New Privacy Threat on 3G, 4G, and Upcoming 5G AKA Protocols

Mobile communications are used by more than two-thirds of the world population, who expect security and privacy guarantees. The 3rd Generation Partnership Project (3GPP), responsible for the worldwide standardization of mobile communication, has designed and mandated the use of the AKA protocol to protect subscribers' mobile services. Even though privacy was a requirement, numerous subscriber-location attacks have been demonstrated against AKA, some of which have been fixed or mitigated in the enhanced AKA protocol designed for 5G.

We found and reported [9] a new privacy attack against all variants of the AKA protocol, including 5G AKA, that breaches subscriber privacy more severely than known location-privacy attacks do. Our attack exploits a new logical vulnerability we uncovered that would require dedicated fixes. We demonstrate the practical feasibility of our attack using low-cost, widely available setups. Finally, we conduct a security analysis of the vulnerability and discuss countermeasures to remedy our attack.

Our attack has since been considered a key issue in 5G [38] by 3GPP. Various vendors (Qualcomm, Gemalto, China Mobile, Thales, Nokia, ZTE, and Huawei) have proposed countermeasures, which are currently under discussion.

Contingent Payments

Bursuc and Kremer study protocols that rely on a public ledger infrastructure, concentrating on protocols for zero-knowledge contingent payment, whose security properties combine diverse notions of fairness and privacy. They argue that rigorous models are required to capture the ledger semantics, the protocol-ledger interaction, the cryptographic primitives and, ultimately, the security properties one would like to achieve. Their focus is on a particular level of abstraction, where network messages are represented by a term algebra, protocol execution by state transition systems (e.g. multiset rewrite rules), and where the properties of interest can be analyzed with automated verification tools. They propose models for: (1) the rules guiding the ledger execution, taking the coin functionality of public ledgers such as Bitcoin as an example; (2) the security properties expected from ledger-based zero-knowledge contingent payment protocols; (3) two different security protocols that aim at achieving these properties relying on different ledger infrastructures; (4) reductions that allow simpler term algebras for homomorphic cryptographic schemes. Altogether, these models allowed them to derive a first automated verification of ledger-based zero-knowledge contingent payment using the Tamarin prover. Furthermore, the models help clarify certain underlying assumptions, as well as security and efficiency trade-offs, that should be taken into account when deploying protocols on the blockchain. This work was presented at ESORICS [20].
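The multiset rewriting view of the ledger can be made concrete with a minimal sketch (our own generic encoding, not Tamarin's rule syntax): a state is a multiset of facts, and a rule fires by consuming its left-hand-side facts and producing its right-hand-side facts.

```python
from collections import Counter

# State: a multiset of facts (tuples). Rule: (lhs, rhs) pair of multisets.
def applicable(state, lhs):
    return all(state[f] >= n for f, n in lhs.items())

def apply_rule(state, lhs, rhs):
    """Fire a multiset rewrite rule: consume lhs facts, produce rhs facts."""
    assert applicable(state, lhs)
    return (state - lhs) + rhs

# A toy "coin" ledger: Alice holds two coins and pays one to Bob.
state = Counter({("Coin", "alice"): 2})
pay = (Counter({("Coin", "alice"): 1}), Counter({("Coin", "bob"): 1}))
state = apply_rule(state, *pay)
print(state[("Coin", "alice")], state[("Coin", "bob")])   # 1 1
```

The guard in `applicable` is what enforces the ledger invariant that a coin must exist before it can be spent; the actual models additionally track protocol state and cryptographic messages as facts in the same multiset.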