Grace combines expertise and deep knowledge in algorithmic number theory and algebraic geometry to build and analyse public-key cryptosystems and to design new error-correcting codes, with real-world concerns such as cybersecurity and blockchains in mind: software and hardware implementations, secure implementations in constrained environments, countermeasures against side-channel attacks, and white-box cryptography.

The foundations of Grace therefore lie in algorithmic number theory (fundamental algorithms for primality testing and factorization), number fields, the arithmetic geometry of curves, algebraic geometry, and the theory of algebraic codes.

Arithmetic Geometry is the meeting point of algebraic geometry and number theory: the study of geometric objects defined over arithmetic number systems. In our case, the most important objects are curves and their Jacobians over finite fields; these are fundamental to our applications in both coding theory and cryptology. Jacobians of curves are excellent candidates for cryptographic groups when constructing efficient instances of public-key cryptosystems, of which Diffie–Hellman key exchange is an instructive example.

Coding theory originated with the idea of using redundancy in messages to protect them against noise and errors. While the last decade of the 20th century saw the success of so-called iterative decoding methods, many new ideas have since emerged in the realm of algebraic coding, foremost among them list decoding and (zero-knowledge or not) proofs of computation.

Part of the team's activities are oriented towards post-quantum cryptography, based either on elliptic curves (isogenies) or on codes. The team also studies cryptography relevant to the blockchain arena.

The group is strongly invested in cybersecurity: software security, secure hardware implementations, privacy, etc.

Algorithmic Number Theory is concerned with replacing special cases with general algorithms to solve problems in number theory. In the Grace project, it appears in three main threads.

We use computer algebra in many ways. Research in cryptology has motivated a renewed interest in algorithmic number theory in recent decades, but the fundamental problems remain of interest in their own right. Indeed, while the application of algorithmic number theory to cryptanalysis is epitomized by the use of factorization to break RSA public keys, many other problems are relevant to various areas of computer science. Roughly speaking, the problems of the cryptological world are of bounded size, whereas algorithmic number theory is also concerned with asymptotic results.

Theme: Arithmetic Geometry: Curves and their Jacobians

Arithmetic geometry is the meeting point of algebraic geometry and number theory: that is, the study of geometric objects defined over arithmetic number systems (such as the integers and finite fields). The fundamental objects for our applications in both coding theory and cryptology are curves and their Jacobians over finite fields.

An algebraic plane curve is the set of solutions of a polynomial equation in two variables.
(Not every curve is planar: we may have more variables, and more
defining equations, but from an algorithmic point of view,
we can always reduce to the plane setting.)
The genus of a curve measures its geometric complexity; to a curve of genus g we attach its Jacobian, a g-dimensional algebraic group.

The simplest curves with nontrivial Jacobians are
curves of genus 1,
known as elliptic curves;
they are typically defined by equations of the form y^2 = x^3 + ax + b.
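To make the group structure concrete, here is a minimal sketch of the chord-and-tangent group law on a toy curve over a small prime field. The parameters p, a, b are our own illustrative choices, not a secure curve:

```python
# Chord-and-tangent group law on a toy elliptic curve y^2 = x^3 + ax + b
# over GF(p). Toy parameters chosen for illustration, not a secure curve.
p, a, b = 97, 2, 3
O = None  # the point at infinity: the identity element of the group

def on_curve(P):
    if P is O:
        return True
    x, y = P
    return (y * y - x * x * x - a * x - b) % p == 0

def add(P, Q):
    """Add two affine points using the chord-and-tangent rule."""
    if P is O:
        return Q
    if Q is O:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O                                         # Q = -P, so P + Q = O
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p         # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

# find some point by brute force and check that the group law closes
P = next((x, y) for x in range(p) for y in range(p) if on_curve((x, y)))
assert on_curve(add(P, P)) and add(P, O) == P
```

For a curve of genus g > 1 the points no longer form a group themselves; the analogous role is played by the Jacobian.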

Theme: Curve-Based Cryptology

Jacobians of curves are excellent candidates for cryptographic groups when constructing efficient instances of public-key cryptosystems. Diffie–Hellman key exchange is an instructive example.

Suppose Alice and Bob want to establish a secure communication
channel. Essentially, this means establishing a common secret
key, which they will then use for encryption and decryption.
Some decades ago, they would have exchanged this key in person, or
through some trusted intermediary; in the modern, networked world,
this is typically impossible, and in any case completely unscalable.
Alice and Bob may be anonymous parties who want to do e-business, for
example, in which case they cannot securely meet, and they have no way
to be sure of each other's identities. Diffie–Hellman key exchange
solves this problem. First, Alice and Bob publicly agree on a
cryptographic group G and a generator g of (a large subgroup of) G. Alice chooses a secret a and sends g^a to Bob; Bob chooses a secret b and sends g^b to Alice. Each can then compute the shared secret g^(ab), while an eavesdropper, who sees only g^a and g^b, must essentially solve the Discrete Logarithm Problem (DLP) to recover it.

This simple protocol has been in use, with only minor modifications,
since the 1970s. The challenge is to find groups in which the group law can be computed efficiently, while the DLP remains intractable.

The classic example of a group suitable for the Diffie–Hellman protocol
is the multiplicative group of a finite field. However, the DLP in this group can be attacked by subexponential index calculus algorithms, forcing relatively large key sizes.
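As an illustration, here is textbook Diffie–Hellman in the multiplicative group of a small prime field. The prime and generator are toy values of our choosing; real deployments use standardized groups with primes of 2048+ bits:

```python
import secrets

# Textbook Diffie-Hellman in the multiplicative group of GF(p).
# Toy parameters of our choosing: p = 2039 is a safe prime (p = 2q + 1 with
# q = 1019 prime), so g = 2 generates a large subgroup. Real deployments
# use standardized groups with p of 2048+ bits.
p, g = 2039, 2

def keypair():
    sk = secrets.randbelow(p - 2) + 1    # secret exponent in [1, p - 2]
    return sk, pow(g, sk, p)             # public value g^sk mod p

alice_sk, alice_pk = keypair()
bob_sk, bob_pk = keypair()

# each party combines its own secret with the other's public value
alice_shared = pow(bob_pk, alice_sk, p)  # (g^b)^a
bob_shared = pow(alice_pk, bob_sk, p)    # (g^a)^b
assert alice_shared == bob_shared        # both now hold g^(ab) mod p
```

An eavesdropper sees only g^a and g^b; recovering the shared value g^(ab) is exactly the problem that must be hard for the protocol to be secure.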

This is where Jacobians of algebraic curves come into their own.
First, elliptic curves and Jacobians of genus 2 curves do not have a
subexponential index calculus algorithm: in particular, from the point
of view of the DLP, a generic elliptic curve is currently as
strong as a generic group of the same size. Second, they provide
some diversity: we have many degrees of freedom in the choice of
curves over a fixed base field.

Theme: Coding theory

Coding theory originated with the idea of using redundancy in messages to protect against noise and errors. The last decade of the 20th century saw the success of so-called iterative decoding methods, which enable transmission rates very close to the Shannon capacity: the best achievable rate for reliable transmission over a given channel. The consensus in the community is that this capacity is more easily reached with these iterative, probabilistic methods than with algebraic codes (such as Reed–Solomon codes).

However, algebraic coding is useful in settings other than the Shannon one. Indeed, the Shannon setting is an average-case setting, and promises only a vanishing error probability. In contrast, the algebraic Hamming approach is a worst-case approach: under combinatorial restrictions on the noise, decoding succeeds with zero failures even when the noise is adversarial.

These considerations were renewed by the topic of list decoding, after the breakthrough of Guruswami and Sudan at the end of the nineties. List decoding relaxes the uniqueness requirement of decoding, allowing a small list of candidates to be returned instead of a single codeword. In the adversarial case, list decoding can approach the Shannon capacity, with zero failure and small lists. The method of Guruswami and Sudan enabled list decoding of most of the main algebraic codes: Reed–Solomon codes, Algebraic–Geometry (AG) codes, and new related constructions known as capacity-achieving list-decodable codes. These results open the way to applications against adversarial channels, which correspond to worst-case settings in classical computer science language.

Another avenue of our studies is AG codes over various geometric objects. Although Reed–Solomon codes are optimal for a given length and dimension, they are very limited in their length, which cannot exceed the size of the alphabet. AG codes circumvent this limitation, using the theory of algebraic curves over finite fields to construct long codes over a fixed alphabet. The striking result of Tsfasman–Vladut–Zink showed that codes better than random codes can be built this way for medium to large alphabets. Disregarding the asymptotic aspects and considering only finite lengths, AG codes can be used either to build longer codes over the same alphabet, or codes of the same length over a smaller alphabet (and thus with faster underlying arithmetic).
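The length limitation can be seen directly in the definition of a Reed–Solomon code: a codeword is the evaluation of the message polynomial at n distinct field elements, so n cannot exceed the field size. A toy sketch (parameters of our choosing, not a production codec), including erasure recovery by Lagrange interpolation:

```python
# Toy Reed-Solomon code over GF(p): a message of k symbols gives a
# polynomial of degree < k, and the codeword is its evaluation at n
# distinct field elements -- hence the length limitation n <= p that
# AG codes circumvent. Toy parameters of our choosing, not a real codec.
p, k, n = 13, 3, 8

def rs_encode(msg):
    assert len(msg) == k and n <= p      # length cannot exceed alphabet size
    return [sum(c * pow(x, i, p) for i, c in enumerate(msg)) % p
            for x in range(n)]

def interpolate(points):
    """Lagrange interpolation mod p: recover the message (= coefficients)
    from any k of the codeword's (position, value) pairs."""
    coeffs = [0] * k
    for j, (xj, yj) in enumerate(points):
        basis, denom = [1], 1            # basis <- product of (X - xm), m != j
        for m, (xm, _) in enumerate(points):
            if m != j:
                denom = denom * (xj - xm) % p
                basis = [(hi - xm * lo) % p
                         for hi, lo in zip([0] + basis, basis + [0])]
        scale = yj * pow(denom, -1, p) % p
        for i, c in enumerate(basis):
            coeffs[i] = (coeffs[i] + scale * c) % p
    return coeffs

msg = [5, 1, 7]
codeword = rs_encode(msg)
# any k coordinates suffice to recover the message (erasure decoding)
assert interpolate([(0, codeword[0]), (3, codeword[3]), (6, codeword[6])]) == msg
```

An AG code replaces the evaluation points by the rational points of a curve, of which there can be many more than field elements, which is exactly how the length barrier is broken.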

From a broader point of view, wherever Reed–Solomon codes are used, we can substitute AG codes with some benefits: either beating random constructions, or beating Reed–Solomon codes, which have bounded length for a given alphabet.

Another area of algebraic coding theory with which we have more recently been concerned is that of Locally Decodable Codes. First introduced as theoretical objects, these codes are now beginning to find practical applications, most notably in cloud-based remote storage systems.

Theme: Cryptography

A huge amount of work is being put into developing an efficient quantum computer. Even if the advent of such a computer may still be decades away, it is urgent to deploy post-quantum cryptography (PQC), i.e. quantum-safe solutions, on our current devices. Indeed, an attacker could store encrypted sessions today and wait until a quantum computer is available to decrypt them.
In this context, the National Institute of Standards and Technology (NIST) launched in 2017 a call for the standardization of public-key PQC schemes (key exchanges and signatures). Among the mathematical objects used to design post-quantum primitives, one finds error-correcting codes, Euclidean lattices, and isogenies.

The NIST standardization process is now in its final step, and most of the selected solutions are based on codes and lattices. These results tend to show that, in the near future, codes and lattices will form the foundation of our digital security. While isogenies are less represented, they remain of deep interest, since they appear to be the post-quantum solution offering the smallest key sizes. The purpose of our research program is to bring these post-quantum solutions closer together, in order to improve their efficiency and diversity, and to increase our trust in them.

We are interested in developing interactions between cryptography and cybersecurity. In particular, we carry out research in embedded security (side-channel and fault attacks), software security (finding vulnerabilities efficiently), and privacy (the security of Tor).

The huge hype around blockchains has attracted the attention of many companies to advanced cryptographic protocols. While basic blockchain designs rely, on the cryptographic side, on very standard primitives such as signatures and hash functions, more elaborate cryptographic techniques could alleviate some shortcomings of blockchains, such as poor bandwidth and lack of privacy.

Team Grace is investigating two topics in these areas: secure multiparty computation and verifiable computation.

Secure multiparty computation enables several participants to compute a common function of data they each secretly own, without any participant revealing their data to the others. This area has seen great progress in recent years, and the cryptographic protocols are now mature enough for practical use. This topic is new to project-team Grace, and we will investigate it in the context of blockchains. Daniel Augot is involved in blockchains from the point of view of cryptography for better blockchains, mainly for improving privacy. A PhD student has been enrolled at IRT System-X to study practical use cases of secure multiparty computation in the context of blockchains.

The topic of verifiable computation consists in verifying heavy computations done by a remote computer, using a lightweight computer which is unable to do the computation itself. The remote computer, called the prover, is allowed to provide a proof alongside the result of the computation. This proof must be very short and fast to verify. It can also be made zero-knowledge, in which case the prover hides some inputs to the computation, and yet proves the result is correct.

There are two competing propositions which provide a mathematical and algorithmic background for these proof techniques: one based on a line of research dating back to the celebrated PCP theorem (from algorithmic complexity theory, using error-correcting codes), and one based on the discrete logarithm problem and pairing-based protocols (algorithmic number theory and elliptic curves over finite fields). Daniel Augot is advising Sarah Bordage on the first topic, also known as STARKs (Scalable Transparent ARguments of Knowledge), and François Morain is advising Youssef El Housni on the second topic, known as SNARKs (Succinct Non-interactive ARguments of Knowledge).

These proofs allow computation to be moved off-chain, pushing the burden onto off-chain servers, which then post results on-chain, accompanied by short, easy-to-verify proofs. This is one of the promising paths towards scalability. Moreover, making these proofs zero-knowledge provides privacy.

Daniel Augot, together with Julien Prat (economist, ENSAE), is also
co-leading a Polytechnique teaching and research "chair" called Blockchain and B2B Platforms, funded by
Capgemini, Caisse des Dépôts and Nomadic Labs, for blockchains in
industry, B2B platforms, supply chains, etc.

The team is concerned with several aspects of the reliability and security of cloud storage, addressed mainly with tools from coding theory. On the privacy side, we build protocols for so-called Private Information Retrieval (PIR), which enable a user to query a remote database for an entry without revealing the query. For instance, a user could query a service for stock quotes without revealing which company they are interested in. On the availability side, we study protocols for proofs of retrievability, which give a user assurance that a huge file is still available on a remote server, using a low-bandwidth protocol that does not require downloading the whole file. For instance, in a peer-to-peer distributed storage system where nodes are rewarded for storing data, nodes can be audited with proof-of-retrievability protocols to make sure they indeed hold the data.
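The PIR idea can be illustrated by the classic two-server scheme of Chor, Goldreich, Kushilevitz and Sudan, assuming two non-colluding servers holding replicas of the database. The user sends each server a random-looking set of indices; XORing the two answers yields the desired record, while each server individually sees only a uniformly random subset. A toy sketch with a database of our own invention:

```python
import secrets

# Toy 2-server information-theoretic PIR (after Chor-Goldreich-Kushilevitz-
# Sudan): the user wants entry i of a replicated database without either
# (non-colluding) server learning i. Each server sees a uniformly random
# subset of indices, so individually it learns nothing about i.
db = [b"alpha", b"bravo", b"charlie", b"delta"]  # replicated on both servers
n = len(db)

def query(i):
    s1 = {j for j in range(n) if secrets.randbits(1)}  # uniform random subset
    s2 = s1 ^ {i}                                      # differs exactly at i
    return s1, s2

def answer(subset):
    """Server side: XOR together the requested entries (padded to equal length)."""
    width = max(map(len, db))
    acc = bytes(width)
    for j in subset:
        rec = db[j].ljust(width, b"\0")
        acc = bytes(x ^ y for x, y in zip(acc, rec))
    return acc

def reconstruct(a1, a2):
    # XOR over the symmetric difference of the two subsets = entry i alone
    return bytes(x ^ y for x, y in zip(a1, a2)).rstrip(b"\0")

s1, s2 = query(2)
assert reconstruct(answer(s1), answer(s2)) == b"charlie"
```

The communication here is linear in the database size; the locally decodable codes mentioned below are precisely what allows PIR with much lower communication.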

We investigate these problems with algebraic coding theory for the effective construction of protocols. In this respect, we mainly use locally decodable codes, and in particular high-rate lifted codes.

Maxime Roméas is a PhD student of the team. (PhD grant from IP Paris/Ecole Polytechnique for a 3-year doctorate, Oct 2019-Sept 2022). The subject of his thesis is "The Constructive Cryptography paradigm applied to Interactive Cryptographic Proofs".

The Constructive Cryptography framework, introduced by Maurer in 2011, redefines basic cryptographic primitives and protocols starting from discrete systems of three types (resources, converters, and distinguishers). This not only permits their effective construction, but also lightens and sharpens their security proofs. One strength of this model is its composability. The purpose of the PhD is to apply this model to rephrase existing interactive cryptographic proofs so as to assert their genuine security, as well as to design new proofs. The main concern here is security and privacy in distributed storage settings. Another axis of the PhD is to augment the CC model by, e.g., introducing new functionalities for a so-called Server Memory Resource.

Besides the Hamming metric, other metrics have been considered for building error-correcting codes. In particular, rank-metric codes, and more specifically the so-called Fqm-linear codes, have found their way into code-based cryptography because they allow a more compact description, and therefore smaller keys, for comparable security. Gabidulin codes, which are the rank-metric analogues of Reed–Solomon codes, come with a strong algebraic structure and efficient algorithms for unique decoding up to half the minimum distance. However, contrary to their Hamming-metric counterparts, they lack a list-decoding algorithm, and decoding beyond this bound is therefore considered a hard problem. Based on this hard problem, two somewhat dual encryption schemes with short keys were recently proposed: LIGA and RAMESSES.
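To recall the metric itself: a word of length n over GF(q^m) is expanded into an m x n matrix over GF(q), and its rank weight is the rank of that matrix. A small sketch for q = 2, using our own toy representation of field elements as m-bit integers:

```python
# In the rank metric, a word of length n over GF(2^m) is expanded into an
# m x n binary matrix (each coordinate becomes a column of m bits), and the
# weight of the word is the rank of that matrix over GF(2). Toy sketch:
# field elements are represented directly as m-bit integers.
def rank_weight(word):
    """GF(2) rank of the bit-columns of `word` (a list of m-bit integers),
    computed with a greedy XOR basis."""
    basis = []
    for v in word:
        for b in basis:
            v = min(v, v ^ b)        # reduce v against the current basis
        if v:
            basis.append(v)          # v is independent of the basis so far
    return len(basis)

# rank weight never exceeds Hamming weight: here Hamming weight 3, rank 1
assert rank_weight([3, 3, 3]) == 1
assert rank_weight([1, 2, 4]) == 3   # three independent columns
```

Since the rank weight of a word is at most its Hamming weight, an error of small rank can corrupt many coordinates at once, which is the source of both the compactness and the distinct hardness assumptions of rank-metric cryptography.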

In 21, we analyse the security of these two cryptosystems, and show that in both cases the ciphertext can be seen as a codeword of a bigger code, corrupted by a small error. We then extend a decoding algorithm for Gabidulin codes to any code containing a Gabidulin code, at the cost of a decrease in the decoding radius; this was enough to recover the plaintext from the ciphertext and the public data. We furthermore propose an implementation of our algorithm, and of the attack on LIGA, in SageMath.

We give a quantum reduction from finding short codewords in a random linear code to decoding in the Hamming metric. This is the first time such a reduction (classical or quantum) has been obtained. Our reduction adapts to linear codes Stehlé–Steinfeld–Tanaka–Xagawa's re-interpretation of Regev's quantum reduction from finding short lattice vectors to solving the Closest Vector Problem. The Hamming metric is a much coarser metric than the Euclidean metric, and this adaptation required several new ingredients. For instance, to obtain a meaningful reduction in the Hamming metric, it is necessary to choose a very large decoding radius, which in many cases goes beyond the radius where decoding is unique. Another crucial step in the analysis of the reduction is the choice of the errors that are fed to the decoding algorithm. For lattices, errors are usually sampled according to a Gaussian distribution. However, it turns out that the Bernoulli distribution (the analogue for codes of the Gaussian) is too spread out and cannot be used for the reduction with codes. Instead, we choose the uniform distribution over errors of a fixed weight, bring in orthogonal-polynomial tools to perform the analysis, and add an amplitude amplification step to obtain the aforementioned result.

The result is presented in the preprint 44.

In 14, we have proposed an adaptation of the algorithmic reduction theory of lattices to binary codes. This includes the celebrated LLL algorithm (Lenstra, Lenstra, Lovász, 1982), as well as adaptations of associated algorithms such as the Nearest Plane Algorithm of Babai (1986). Interestingly, the adaptation of LLL to binary codes can be interpreted as an algorithmic version of the bound of Griesmer (1960) on the minimal distance of a code. Using these algorithms, we demonstrate, both with a heuristic analysis and in practice, a small polynomial speed-up over the Information-Set Decoding algorithm of Lee and Brickell (1988) for random binary codes. This appears to be the first such speed-up that is not based on a time-memory trade-off, and should be read as a very preliminary example of the potential of a reduction theory for codes, for example in cryptanalysis.

This work 37 presented the first full implementation of Wave, a post-quantum code-based signature scheme. We define Wavelet, a concrete Wave scheme at the 128-bit classical security level (or NIST post-quantum security Level 1), equipped with a fast verification algorithm targeting embedded devices. Wavelet offers 930-byte signatures, with a public key of 3161 kB. We include implementation details using AVX instructions and on the ARM Cortex-M4, including a solution for dealing with Wavelet's large public keys, which do not fit in the SRAM of a typical embedded device. Our verification algorithm is approximately 4.65 times faster than the original; it verifies in 1 087 538 cycles using AVX instructions, or 13 172 ticks on an ARM Cortex-M4.

Bringing practical post-quantum security to low-end IoT devices is a pressing challenge. In 38, we evaluate a range of pre- and post-quantum secure signature schemes in the context of SUIT software updates (specified by the IETF), on three popular, off-the-shelf microcontroller boards (ARM Cortex-M4, ESP32, and RISC-V) that are representative of the 32-bit landscape. We show that upgrading to post-quantum security is practical now, and reflect on the best choices for various use cases. This work has been selected for presentation at Real World Crypto 2022.

LAC is a Ring Learning With Errors based cryptosystem that was proposed to the NIST call for post-quantum standardization and passed the first round of the submission process. It did not advance to the third round, but it has been selected as the Chinese standard for key exchange. The particularity of LAC is to use an error-correcting code, ensuring a high security level with small key and ciphertext sizes. The LAC team proposes a CPA-secure cryptosystem, LAC-CPA, and a CCA-secure one, LAC-CCA, obtained by applying the Fujisaki–Okamoto transformation to LAC-CPA.

Together with Aurélien Greuet (IDEMIA), we study in 57 the security of the LAC Key Exchange (KE) mechanism, built on LAC-CPA, in a misuse context: when the same secret key is reused for several key exchanges and an active adversary has access to a mismatch oracle. This oracle indicates whether a mismatch occurred at the end of the KE protocol. In this context, we show that an attacker needs at most 8 queries to the oracle to retrieve one coefficient of a static secret key. This result has been experimentally confirmed using the reference and optimized implementations of LAC. Since our attack targets the CPA version in a misuse context, the authenticated KE protocol, based on the CCA version, is not impacted. However, this research provides a tight estimation of LAC's resilience against this type of attack.

Together with Cyprien Delpech de Saint Guilhem (KU Leuven), Tako Boris Fouotsa (Università degli Studi Roma Tre), Péter Kutas (University of Birmingham), Christophe Petit (Université Libre de Bruxelles), Javier Silva (Universitat Pompeu Fabra) and Benjamin Wesolowski (Institut de Mathématiques de Bordeaux), Luca De Feo and Antonin Leroux have introduced a new post-quantum public-key encryption scheme that makes constructive use of the torsion-point attacks against the SIDH key exchange. The publication includes an implementation of this new construction in C. Another contribution of this work is the "uber-isogeny assumption", which aims to generalize computational assumptions encountered in various schemes in the literature.

Together with Daniel J. Bernstein, Fabio Campos, Tung Chou, Tanja Lange, Michael Meyer, and Jana Sotáková, this work 7 presents the fastest implementation to date of commutative supersingular isogeny-based key exchange (CSIDH) with basic side-channel protection. This new speed record is partly due to a redesign of the private key space, while maintaining compatibility with existing CSIDH software. This work was published in TCHES 2021.

CSIDH (Commutative Supersingular Isogeny Diffie–Hellman) is a post-quantum non-interactive key exchange (NIKE) algorithm based on the action of a certain ideal class group on the set of supersingular elliptic curves defined over a prime field. In 10, we show that CSIDH is just one protocol in a family of more general group actions parameterized by positive squarefree integers, with CSIDH corresponding to the simplest case.

The classical equivalence of the Computational Diffie–Hellman and Discrete Logarithm Problems is a long-standing problem at the foundations of group-based public-key cryptography. Moving to the post-quantum paradigm of group actions, where CSIDH takes the place of Diffie–Hellman, it is important to understand the relationship between the analogues of the DLP and CDHP. With Steven Galbraith, Lorenz Panny, and Frederik Vercauteren, we show in 16 that there is a polynomial-time quantum equivalence between these problems.

Suppose a user of a small device needs a powerful computer to perform a heavy computation on their behalf, because the computation cannot be performed on the device itself. After completing the computation, the powerful computer reports a result. Suppose now that the user does not fully trust the remote computer to perform the computation correctly or to behave honestly. How can the user be assured that the correct result has been returned, given that they cannot redo the computation?

The topic of verifiable computation deals with this issue. Essentially, it is a cryptographic protocol in which the prover (i.e. the remote computer) convinces the verifier (i.e. the user) that a computation is correct. The protocol may be interactive, in which case there are one or more rounds of interaction between the prover and the verifier, or non-interactive, in which case the prover simply sends a proof that the computation is correct.

These protocols have zero-knowledge variants, where the scenario is different: a service performs a computation on data, part of which must remain private (for instance, statistics on citizens' incomes). The service can then prove the correctness of the result without revealing the data (which has to be committed to anyway).

The two main avenues for building these protocols are the setting of discrete logarithms (and pairings) on elliptic curves, and a coding-theoretic setting (originating in the PCP theorem). Both variants admit a zero-knowledge version, and the core of the research lies more in provable computation than in the zero-knowledge aspect, which comes rather easily in comparison.

In the coding-theoretic setting, these protocols have been popularized, in particular in the blockchain area, under the name of (ZK-)STARKs (Scalable Transparent ARguments of Knowledge), introduced in 2018. In theoretical computer science, these proofs are derived from protocols called IOPs (Interactive Oracle Proofs), which combine IPs (Interactive Proofs) and PCPs (Probabilistically Checkable Proofs) so as to get the best of both worlds and make PCPs practical.

At the core of these protocols lies the following coding problem: how to decide, with high confidence, that a very long ambient word is close to a given code, while looking at only very few of its coordinates.
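A toy illustration of this proximity problem (our own simplification, not the actual FRI protocol): to test whether a long word is close to the evaluations of some degree < k polynomial, interpolate it on k sampled positions and spot-check a few more. A word far from every codeword fails a spot check with noticeable probability, while genuine codewords always pass:

```python
import random

# Toy proximity test (an illustration of the problem, not the FRI protocol):
# given a word w of length n over GF(p), decide whether it is close to the
# Reed-Solomon code of evaluations of degree < k polynomials. We interpolate
# w on k sampled positions and spot-check a few more; a word far from every
# codeword fails a spot check with noticeable probability.
p, n, k = 101, 101, 5                    # toy parameters of our choosing

def poly_eval(coeffs, x):
    return sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p

def lagrange_at(samples, x):
    """Evaluate at x the unique degree < len(samples) polynomial
    through the given (position, value) samples, mod p."""
    total = 0
    for j, (xj, yj) in enumerate(samples):
        num = den = 1
        for m, (xm, _) in enumerate(samples):
            if m != j:
                num = num * (x - xm) % p
                den = den * (xj - xm) % p
        total = (total + yj * num * pow(den, -1, p)) % p
    return total

def proximity_test(w, trials=20, rng=random):
    base = [(x, w[x]) for x in rng.sample(range(n), k)]
    return all(w[x] == lagrange_at(base, x)
               for x in rng.sample(range(n), trials))

codeword = [poly_eval([1, 2, 3, 4, 5], x) for x in range(n)]
assert proximity_test(codeword)          # genuine codewords always pass
```

The real protocols do much better: FRI-style folding reaches the same kind of guarantee with only polylogarithmically many queries, which is what makes the proofs short.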

These protocols were originally designed for the simplest algebraic codes, Reed–Solomon codes. Daniel Augot and Sarah Bordage provided a generalization of these protocols to multivariate codes, i.e. products of Reed–Solomon codes and Reed–Muller codes. The performance does not degrade badly with respect to the basic Reed–Solomon case 36. It remains to assess the relevance of these codes for building proof systems, and to compare with the literature, where products of Reed–Solomon codes have been studied for more than twenty years.

A very important issue is to have a smaller alphabet, and this can be done using algebraic-geometry codes. This was done by Sarah Bordage and Jade Nardi 40, using curves with a solvable automorphism group, which enables the construction of codes that are foldable in a way similar to how Reed–Solomon codes are folded in the "FRI" protocol 52. Their protocol has very good performance, akin to the Reed–Solomon case.

Verifiable computation can also be built using the theory of elliptic curves, the hardness of the discrete logarithm problem, and pairings, as introduced in 58 and made practical in 60. These proofs are much shorter than those provided by STARKs, at a higher cost for the prover. Furthermore, these systems are not post-quantum, and there are important issues in the setup of the proof system, where a trusted third party is required.

The verifiable computation problem leads to several new questions in elliptic curve cryptography, since the required operations depart from the standard ones used, for instance, in signature algorithms.

A very interesting topic is the notion of "proofs of proofs". Essentially, verifying a proof is itself a computation, so a proof that a proof has been verified can be given. The same idea applies to verifying hundreds of proofs: a single proof can attest that hundreds of proofs have been checked.

This is very powerful in the elliptic curve setting, because the size of a proof is constant (a few hundred bytes, depending only on the security parameter, not on the computation). This means that the hundreds of statements above admit a very short proof. In the blockchain world, this translates into a very short proof that many off-chain transactions are correct.

To achieve this goal, an elliptic curve is needed for proving computations done over another elliptic curve. The problem is an arithmetic mismatch: the statement to be proved is defined over the scalar field of the first curve, while verifying its proof requires arithmetic over a different field, so recursion calls for carefully matched pairs of curves.
In collaboration with Aurore Guillevic, Youssef El Housni provided curves which are very efficient for this recursion 56, 45. These curves beat the competition, and an implementation has been provided. Other blockchain players (Celo, ConsenSys) have also used these curves in their implementations of verifiable computation and zero-knowledge proofs.

In collaboration with Matthieu Rambaud (Télécom Paris), Daniel Augot is advising Angelo Saadeh. The issue addressed is the following: two parties each privately hold distinct slices of common data, and wish to compute a logistic regression on the whole data set, without either party revealing its data to the other.

Computing a common output from the inputs of several participants as above is done in cryptography using secure multiparty computation (MPC), as introduced by Yao 61 and recently made practical, with several implementations. Yet, as classically observed in MPC, the actual result, once learned, may leak information about the secret inputs. The same problem occurs here, where the model may leak information about the data.

Thus it is natural to investigate countermeasures that limit what the learned model reveals about the private inputs.

MPC enables several participants to obtain the common result of a computation on each one's data, while not revealing any participant's data to the others, and without any trusted third party. This seems quite related to the blockchain philosophy, where decentralization and trustless environments are at the core of the claimed properties.
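The simplest MPC building block is additive secret sharing over Z_p: each input is split into random shares that sum to it, so any proper subset of the shares is uniformly random and reveals nothing. A toy sketch in the semi-honest model, with an example of our own invention (summing incomes without revealing them):

```python
import secrets

# Additive secret sharing over Z_p, the simplest MPC building block
# (semi-honest model, toy parameters of our choosing). Each input is split
# into shares that sum to it; any proper subset of shares is uniformly
# random, so it reveals nothing about the input.
p = 2**61 - 1
n_parties = 3

def share(secret):
    parts = [secrets.randbelow(p) for _ in range(n_parties - 1)]
    parts.append((secret - sum(parts)) % p)
    return parts

def secure_sum(inputs):
    """Sum the inputs so that only the per-party partial sums are revealed."""
    all_shares = [share(x) for x in inputs]
    # party i locally adds the i-th share of every input...
    partials = [sum(s[i] for s in all_shares) % p for i in range(n_parties)]
    # ...and only these partial sums are published and combined
    return sum(partials) % p

incomes = [52000, 61000, 48000]
assert secure_sum(incomes) == sum(incomes)   # correct total, inputs hidden
```

Sums come almost for free; secure multiplications (and hence functions like logistic regression) are where the real protocol machinery is needed.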

Actually, the relationship is not so clear, since MPC deals with privacy and secret data, while blockchains typically imply transparency and public data. A PhD, funded by System-X, studies these possible interactions, and a model is under design. The idea is that blockchains can be used to allocate jobs to "workers", reward them for doing so, and notarize the results in the ledger. MPC would complement this with "MPC workers" which, under the security models of MPC, could run jobs on private data submitted by clients (this could be called MPC-as-a-service). In 19, we showed that an implementation of these ideas is practical, based on the work of Benhamouda et al. 59, with improvements with respect to the Hyperledger Fabric blockchain platform, into which we integrated the SCALE-MAMBA MPC library.

We revisit the issue of efficient and secure Proofs of Retrievability (PoR). To do so, we build upon the work of Maurer et al. on Constructive Cryptography (CC) to give a clearer and more usable security framework for PoR protocols. We propose a scheme based on Locally Correctable Codes (LCC), and assert its security by giving an explicit formula for the probability that an adversary fools the client who outsourced their data to the remote server. Our scheme has reduced storage overhead and communication complexity compared to a previous scheme of Lavauzelle and Levy-dit-Vehel, which was also based on LCCs. We achieve better parameters by introducing a new definition and construction of a so-called "authentic Server Memory Resource" (aSMR) in the CC context. Notably, our aSMR can be used as an intermediate step in all code-based protocols for outsourced storage. We also model LCCs in a composable framework. In doing so, we show that the exact failure probability of the local decoder depends not only on the number of corrupted symbols, but also on their locations. For the important class of lifted Reed–Solomon codes, we prove that this failure probability can be computed in time polynomial in the length of the lifted code. A paper on this work has been submitted to STACS 2022.

Cornacchia's algorithm is an important building block of CM elliptic curve cryptography. Since it shares many properties with fast integer gcd algorithms, we worked on a fast version of this tool. A paper is to be submitted to ISSAC 2022, and the code is to be made available on GitLab.
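As background, Cornacchia's algorithm finds (x, y) with x^2 + d*y^2 = p for a prime p, by running a truncated Euclidean algorithm on p and a square root of -d mod p; the gcd-like structure is what the fast version exploits. A minimal sketch (the modular square root is brute-forced here, where a real implementation would use Tonelli–Shanks):

```python
from math import isqrt

# Toy version of Cornacchia's algorithm: find (x, y) with x^2 + d*y^2 = p
# for a prime p and 0 < d < p, by running a truncated Euclidean algorithm
# on p and a square root of -d mod p. The square root is brute-forced here;
# a real implementation would use Tonelli-Shanks.
def cornacchia(d, p):
    t = next((r for r in range(1, p) if (r * r + d) % p == 0), None)
    if t is None:
        return None                  # -d is not a square mod p: no solution
    if 2 * t < p:
        t = p - t                    # normalize so that p/2 < t < p
    a, b = p, t
    while b * b >= p:                # truncated Euclidean algorithm
        a, b = b, a % b
    s, rem = divmod(p - b * b, d)
    y = isqrt(s)
    if rem == 0 and y * y == s:
        return b, y                  # b^2 + d*y^2 == p
    return None

assert cornacchia(1, 13) == (3, 2)   # 9 + 4 = 13
assert cornacchia(2, 17) == (3, 2)   # 9 + 2*4 = 17
```

In CM constructions, such representations of p determine the curve orders available, which is why a fast Cornacchia matters.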

One of the most powerful factoring algorithms is ECM, which uses elliptic curves. To improve it, families of curves are traditionally built over the rationals. In this work, number fields are used instead, to treat special numbers. See the preliminary results in 48.

Reed–Muller codes for the rank metric do not exist over finite fields, due to the simple cyclic structure of the Galois group of finite field extensions. However, when the Galois group is not cyclic, for instance for some extension fields of the rational numbers, the language of skew polynomials makes it possible to build rank-metric analogues of classical Reed–Muller codes [6], with a minimum rank distance similar to that of classical Reed–Muller codes.
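As a toy illustration of the rank metric itself (not of the skew-polynomial construction above): a word of length n over GF(2^m) is viewed as an m×n matrix over GF(2), and the rank distance between two words is the GF(2)-rank of their difference, which is always at most their Hamming distance.

```python
def gf2_rank(rows):
    """Rank over GF(2) of a matrix whose rows are given as integer bitmasks."""
    rank = 0
    rows = list(rows)
    while rows:
        pivot = max(rows)                 # row with the highest leading bit
        rows.remove(pivot)
        if pivot == 0:
            break
        rank += 1
        top = pivot.bit_length()
        # clear the leading bit of every row sharing the pivot's leading bit
        rows = [r ^ pivot if r.bit_length() == top else r for r in rows]
    return rank

def to_matrix(word, m):
    """m x n matrix over GF(2) whose column j is the bit-expansion of symbol j."""
    return [sum(((s >> i) & 1) << j for j, s in enumerate(word))
            for i in range(m)]

def rank_distance(u, v, m):
    """Rank distance between two words over GF(2^m), symbols given as ints < 2^m."""
    return gf2_rank(to_matrix([a ^ b for a, b in zip(u, v)], m))

# the word (001, 010, 100, 011) over GF(2^3) differs from zero in 4
# positions (Hamming distance 4) but its rank distance to zero is only 3
assert rank_distance([1, 2, 4, 3], [0, 0, 0, 0], 3) == 3
```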

Groups of unknown order are a classic setting for asymmetric cryptosystems, RSA being the most famous example. In recent times, unknown-order groups have returned to prominence as a setting for new, advanced cryptosystems including accumulators and VDFs (Verifiable Delay Functions). In these applications, trustlessness becomes critical: not even the constructor of the group should know its order. In [15] (joint work with Samuel Dobson and Steven Galbraith), we re-evaluate the security of ideal class groups, the most popular source of trustless unknown-order groups, and show that generally accepted parameters do not meet their claimed security levels. We also propose a more efficient alternative: Jacobians of genus-3 hyperelliptic curves.
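A small sketch of why the order must remain unknown in VDF-style applications: evaluating x^(2^T) mod N is believed to require T sequential squarings, but anyone who knows the group order can shortcut the whole computation with a single exponentiation. The RSA-group primes below are made-up toy parameters.

```python
p, q = 10007, 10009       # made-up toy primes; in a real trustless setup,
N = p * q                 # nobody may know this factorization

x, T = 5, 10_000

# evaluator: T sequential squarings to obtain x^(2^T) mod N
y = x
for _ in range(T):
    y = y * y % N

# anyone who knows the group order phi(N) = (p-1)(q-1) skips the delay:
# reduce the exponent 2^T modulo phi(N) first, then exponentiate once
phi = (p - 1) * (q - 1)
assert y == pow(x, pow(2, T, phi), N)
```

This is why class groups (and, per the work above, genus-3 Jacobians) are attractive: their order is hard to compute even for the party who generated them.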

SPARTA is a European H2020 competence network.

Website: sparta.eu

We have participated in the HAII-T Program (High-Assurance Intelligent Infrastructure Toolkit), developing high-performance, high-assurance cryptographic software for IoT devices, especially in the context of the free, open-source RIOT operating system.
While SPARTA will end in early 2022, this work is continuing in the framework of the Inria Défi RIOT-FP, which aims to provide proven, future-proof—and, in particular, post-quantum—security for RIOT-OS and its users.

ANR CIAO (Cryptography, Isogenies, and Abelian varieties Overwhelming) is a JCJC 2019 project led by Damien Robert (Inria EP LFANT). This project, which started in October 2019, examines applications of higher-dimensional abelian varieties in isogeny-based cryptography.

ANR CBCRYPT (Code-Based Cryptography) is a project that started in October 2017, led by Jean-Pierre Tillich (Inria, EP Cosmiq). It focusses on the design and the security analysis of code-based primitives, in the context of the current NIST post-quantum competition.

ANR COLA (An interface between COde and LAttice-based cryptography) is an ANR JCJC project that started in October 2021, led by Thomas Debris-Alazard. It focusses on bringing post-quantum solutions based on codes and lattices closer together, in order to improve our trust in cryptanalysis and to open new perspectives in terms of design.

BARRACUDA is a collaborative ANR project accepted in 2021 and led by
A. Couvreur.

Website : barracuda.inria.fr

The project gathers specialists of coding theory and cryptology on the one hand, and specialists of number theory and algebraic geometry on the other. Its objectives concern problems arising from modern cryptography that require advanced algebraic objects and techniques: for instance, mathematical problems with applications to distributed storage, multi-party computation, or zero-knowledge proofs for protocols.
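As an example of the kind of algebra-based tool involved in multi-party computation and distributed storage, here is a toy Shamir secret sharing over a small prime field (all parameters below are made up for illustration): the secret is the constant term of a random degree-(t−1) polynomial, and any t of the n shares recover it by Lagrange interpolation at 0.

```python
import random

P = 2**13 - 1                       # toy field size (8191, a Mersenne prime)

def share(secret, t, n):
    """Split secret into n shares; any t of them suffice to reconstruct."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    f = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

random.seed(0)
shares = share(1234, t=3, n=5)
assert reconstruct(shares[:3]) == 1234      # any 3 of the 5 shares suffice
assert reconstruct(shares[2:]) == 1234
```

Fewer than t shares reveal nothing about the secret, which is what makes such polynomial-based schemes a building block for secure computation among mutually distrustful parties.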

SANGRIA is a collaborative ANR project accepted in 2021.

Website : lip6.fr/Damien.Vergnaud/projects/sangria/

The main scientific challenges of the SANGRIA project (Secure distributed computAtioN: cryptoGRaphy, combinatorIcs and computer Algebra) are (1) to construct specific protocols that take practical constraints into account and prove them secure, and (2) to implement them and to significantly improve the efficiency of existing protocols. The project combines research from cryptography, combinatorics and computer algebra. It is expected to impact central problems in secure distributed computation, while enriching the general landscape of cryptography.

MobiS5 is a collaborative ANR project accepted in 2018.

Website : mobis5.limos.fr/

MobiS5 aims to foresee and counter the threats posed to 5G architectures by the architectural modifications suggested in TR 22.861–22.864. Concretely, we will provide a provably secure cryptographic toolbox for 5G networks, validated formally and experimentally, responding to the needs of 5G architectures at three levels:

* Challenge 1: security in the network infrastructure and end points, including core-network security and attack detection and prevention;
* Challenge 2: cryptographic primitives and protocols, notably a selection of basic primitives, an authenticated key-exchange protocol, tools to compute on encrypted data, and post-quantum cryptographic countermeasures;
* Challenge 3: mobile applications, specifically the use case of a secure server that aids or processes outsourced computation, and the example of a smart home.
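As a minimal baseline for the authenticated key-exchange protocols targeted by Challenge 2, here is textbook (classical, unauthenticated) Diffie–Hellman over a toy prime field; real protocols add authentication and use much larger, standardized groups or elliptic curves, and the parameters below are made up for the demo.

```python
import random

p = 2**13 - 1        # toy prime modulus (8191); far too small in practice
g = 17               # made-up base element for the demo

random.seed(42)
a = random.randrange(2, p - 1)        # Alice's secret exponent
b = random.randrange(2, p - 1)        # Bob's secret exponent
A = pow(g, a, p)                      # Alice -> Bob (public)
B = pow(g, b, p)                      # Bob -> Alice (public)

# both sides derive the same shared key g^(a*b) mod p
assert pow(B, a, p) == pow(A, b, p)
```

Because plain Diffie–Hellman has no authentication, a network attacker can sit in the middle, which is exactly the gap an authenticated key-exchange protocol closes.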

CryptiQ is a collaborative ANR project accepted in 2018.

The goal of the CryptiQ project is to anticipate major changes due to quantum computing, by considering three plausible scenarios, from the closest to the furthest foreseeable future, depending on the means of the adversary and the honest parties. In the first scenario, the honest execution of protocols remains classical while the adversary may have oracle access to a quantum computer. This is so-called post-quantum cryptography, the best-known setting. In the second scenario (quantum-enhanced classical cryptography), we allow honest parties access to quantum technologies in order to achieve enhanced properties, but we restrict this access to those quantum technologies that are currently available (or that can be built in the near term). The adversary is still allowed to use any quantum technology. Finally, in the third scenario (cryptography in a quantum world), we allow an adversary the most general quantum operations, and we consider that anybody can have access to both quantum communication and computation.