## Section: New Results

### Proof-theoretical investigations

Participants : Federico Aschieri, Pierre Boutillier, Pierre-Louis Curien, Hugo Herbelin, Danko Ilik, Guillaume Munch-Maccagnoni, Pierre-Marie Pédrot, Alexis Saurin, Arnaud Spiwack, Noam Zeilberger.

#### Sequent calculus and computational duality

*Thunks and duality.*
Guillaume Munch-Maccagnoni investigated a notion dual to the thunks of the call-by-value lambda-calculus, one which makes it possible to simulate call-by-value in call-by-name. He began investigating how this structure arises in many models of the call-by-name lambda-calculus, how it might explain various phenomena such as storage operators, and how it might relate to features of actual programming languages.
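As background, the familiar (primal) direction of this construction — thunks embedding call-by-name evaluation into a call-by-value language — can be sketched as follows. This is a minimal illustration in Python (a call-by-value language); the helper names `delay`, `force`, `const_zero` and `loop` are ours, not from the work described.

```python
# Thunks embed call-by-name into a call-by-value language: wrap the
# argument in a zero-argument function, and force it only when needed.
def delay(f):
    return f          # a thunk is just a nullary function

def force(t):
    return t()        # demanding the value runs the suspended body

def const_zero(_t):
    return 0          # ignores its (thunked) argument entirely

def loop():
    while True:       # would diverge if evaluated eagerly
        pass

# Under plain call-by-value this application would loop forever;
# thunked, the argument is simply discarded.
assert const_zero(delay(lambda: loop())) == 0
assert force(delay(lambda: 6 * 7)) == 42
```

The dual notion studied here goes the other way, simulating call-by-value inside call-by-name.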

*Categorical semantics.*
Guillaume Munch-Maccagnoni started a collaboration with Marcelo Fiore with the aim of understanding the structures behind sequent calculus (system L in particular) and polarisation (as found in Girard's classical logic). The goal is to draw links with algebraic and/or computational structures in the unifying language of category theory. This work is also in collaboration with Pierre-Louis Curien.

Noam Zeilberger has continued to work with Paul-André Melliès (PPS) on developing a categorical framework for better understanding contexts and inference rules in proof theory and type theory, with the aim of achieving an integration with the theory of side-effects in programming languages. He presented some results from this work in March at the European Workshop on Computational Effects.

*Polarised Peano arithmetic.*
Guillaume Munch-Maccagnoni extended polarised classical logic and polarised classical realisability to the predicate calculus and to Peano arithmetic. This decomposes and simplifies technical artefacts found in call-by-name classical realisability, and sheds new light on witness extraction from proofs of $\Sigma $ formulae.

*(Co)Inductive Types in Sequent Calculus.*
Hugo Herbelin and Jeffrey Sarnat continued their work towards a sequent calculus presentation of a simply-typed fragment of CIC that has inductive and coinductive types, as formalized using recursion operators and a guard condition. Some progress was made on the formalization of the guard condition and normalization proof, but both remained unfinished as of the end of Sarnat's postdoc in June.

*Classical call-by-need and the duality of computation.*
In a collaboration with Zena Ariola (intensified by a three-week visit of Alexis Saurin to Oregon in early 2011), Zena Ariola, Hugo Herbelin and Alexis Saurin presented the call-by-need strategy in the framework of *the duality of computation*, that is, a sequent-calculus approach to call-by-need. They extended call-by-need from minimal logic to classical logic, which allowed control operators to be integrated smoothly, resulting in particular in a call-by-need $\lambda \mu $-calculus. Moreover, the duality principles involved in this framework unveiled a new calculus, dual to the usual call-by-need, which is distinct from call-by-name, call-by-value and the usual call-by-need. These results were presented at TLCA 2011.
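The operational content of call-by-need — call-by-name plus sharing — can be illustrated with a memoising thunk. This is only a sketch of the evaluation discipline, not of the sequent-calculus presentation; the helper `memo_thunk` is our own name.

```python
# Call-by-need = call-by-name + sharing: a memoised thunk is evaluated
# at most once, however often its value is demanded.
def memo_thunk(f):
    cell = {}
    def force():
        if "v" not in cell:      # first demand: compute and store
            cell["v"] = f()
        return cell["v"]         # later demands: reuse the stored value
    return force

count = 0
def expensive():
    global count
    count += 1                   # records how many times we really ran
    return 21

arg = memo_thunk(expensive)
assert arg() + arg() == 42       # two demands on the same suspension
assert count == 1                # ...but only one actual evaluation
```

Under call-by-name, `count` would be 2 here; the memoised cell is exactly the sharing that distinguishes call-by-need.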

Pursuing this collaboration, the three authors above, together with Paul Downen and Keiko Nakata, studied abstract machines for this classical call-by-need calculus.

#### Linear dependent types

Arnaud Spiwack started investigating dependently typed variants of linear sequent calculus based on Curien and Herbelin's $\mu \tilde{\mu}$. The goal is to study what kind of set theory arises from such a linear type theory when one follows the tradition of intuitionistic type theory of defining a set as a type equipped with a relation (a construction also known as a *setoid*). Sets defined in this manner in Coq give rise to a quasitopos (as proved in Arnaud Spiwack's PhD thesis), which makes for a reasonable approximation of usual mathematics; “linear sets”, however, should be quite different and may support some unorthodox mathematics. An appropriate theory of linear sets seems to require a fairly rich linear type theory, in particular one that supports so-called *strong elimination* (the type-theoretic counterpart of induction). So far, while extending $\mu \tilde{\mu}$ to dependently typed linear logic has been achieved, strong elimination has proved harder to formulate.

Pierre-Marie Pédrot just started his PhD on this same general topic. While linear logic appeared as an operational decomposition of intuitionistic logic, dependent types are conversely an essential enrichment of the latter, as they make it possible to formalize important parts of mathematics constructively. Even though it is now widely believed that the two should be combined, it seems that nobody has seriously attempted to do so. The subject is quite vast, and each side may enhance the understanding of the other. On the linear side, linear logic still lacks a satisfying syntax; worse, for want of richer types it tends to describe computational behaviour only, rather than truly formalizing mathematics. On the dependent side, usual dependent type systems are based on PTS which, being plain enrichments of the basic lambda-calculus, are intrinsically call-by-name structures. Hence a linear decomposition inspired by polarisation techniques may permit a better analysis of the inner, yet-to-be-discovered gears of PTS. One may even hope to include non-intuitionistic effects therein. Furthermore, practical systems in use today (Coq, for example) come bundled with additional constructs, such as inductive types, whose understanding with respect to models is still highly incomplete. One could expect linear logic to shed new light on these issues. The thesis stems from preliminary work of Pierre-Marie Pédrot (during his M2 internship) inspired by geometry-of-interaction (GoI) models and other closely related results of Girard, which suggest a natural way to integrate dependency into them.

#### Proving with side-effects

*Axiom of dependent choice.*
Hugo Herbelin showed that classical arithmetic in finite types
extended with strong elimination of existential quantification proves
the axiom of dependent choice. Combining classical logic and choice
without inconsistency is made possible, first, by constraining strong
elimination of existential quantification to proofs that are
essentially intuitionistic and, second, by turning countable universal
quantification into an infinite conjunction of classical proofs
evaluated along a call-by-need evaluation strategy, so as to extract
from them intuitionistic content that complies with the intuitionistic
constraint put on strong elimination of existential quantification.
This work was presented at the TYPES conference.

*Memory assignment, forcing and delimited control.*
Hugo Herbelin investigated how to extend his work on
intuitionistically proving Markov's
principle [35] and the work of Danko
Ilik on intuitionistically proving the double negation shift
(i.e. $\forall x\,\neg \neg A\to \neg \neg \forall x\,A$) [38] to other
kinds of effects. In particular, memory assignment is related to
Cohen's forcing, as emphasized by
Krivine [40] and by the
observation that Cohen's translation of a formula $P$ into $\forall y\le x\,\exists z\le y\,P(z)$ is similar to a state-passing-style
transformation of the type $P$ into $S\to S\times P$.
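The state-passing-style transformation $P \mapsto S \to S \times P$ can be made concrete with a few combinators. This is a generic illustration in Python of state-passing style itself, not of the forcing connection; the names `ret`, `bind`, `get` and `put` are conventional monadic vocabulary, chosen by us.

```python
# State-passing style: a value of type P becomes a function S -> S x P,
# threading the state explicitly through the computation.
def ret(x):
    return lambda s: (s, x)            # pure value: state unchanged

def bind(m, f):
    def run(s):
        s1, x = m(s)                   # run m, obtaining new state s1
        return f(x)(s1)                # continue with f, threading s1
    return run

get = lambda s: (s, s)                 # read the current state

def put(s_new):
    return lambda _s: (s_new, ())      # overwrite the state

# Memory assignment in state-passing style: read, increment, read again.
prog = bind(get, lambda x: bind(put(x + 1), lambda _: get))
assert prog(41) == (42, 42)
```

Each step consumes a state `s` and returns a pair `(s', value)`, which is exactly the shape $S \to S \times P$ mentioned above.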

Hugo Herbelin then designed a logical formalism with memory assignment
that makes it possible to *prove* in direct style any statement
provable using the forcing method, in the same way that logic extended
with control operators supports direct-style classical reasoning.
Thanks to the use of delimiters over “small” formulas, similar to the
notion of ${\Sigma}_{1}^{0}$-formulas in arithmetic, the whole framework
remains intuitionistic, in the sense that it satisfies the disjunction
and existence properties.

Two typical applications of proving with side-effects are global-memory proofs of the axiom of countable choice and an enumeration-free proof of Gödel's completeness theorem.

The main ideas of this research program have been presented at the Geocal-Lac meeting of the GDR IM.

#### Delimited control

*Delimited control and infinitary/stream calculi*

During his summer internship, Paul Downen studied with Alexis Saurin some infinitary $\lambda $-calculi and infinitary rewriting, in particular a proposal by Ketema, Blom, Aoto and Simonsen which allows transfinitely deep terms to be considered. The proposed calculus presented several defects on which Paul Downen's work focused. Some of these defects were corrected, but the work is still ongoing.

In a collaboration with Marco Gaboardi and Koji Nakazawa, Alexis Saurin has been studying how to turn the $\Lambda \mu $-calculus into a truly stream-based calculus. This involved enlarging the syntactic category of streams, defining a type system and comparing the calculus with other proposals for computing on streams.
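For readers unfamiliar with stream calculi, the basic object involved — an infinite sequence presented as a head together with a delayed tail — can be sketched as follows. This is only the standard coinductive-stream idea in Python; the helper names (`cons`, `hd`, `tl`, `smap`, `nth`) are ours and do not come from the $\Lambda \mu $ work itself.

```python
# A stream is a pair: its head, and a thunk producing its tail.
def cons(x, tail_thunk):
    return (x, tail_thunk)

def hd(s):
    return s[0]

def tl(s):
    return s[1]()                      # forcing the tail unfolds one step

def nats(n):                           # the infinite stream n, n+1, n+2, ...
    return cons(n, lambda: nats(n + 1))

def smap(f, s):                        # map over a stream, lazily
    return cons(f(hd(s)), lambda: smap(f, tl(s)))

def nth(s, n):                         # observe finitely much of a stream
    while n > 0:
        s, n = tl(s), n - 1
    return hd(s)

assert nth(smap(lambda x: x * x, nats(0)), 4) == 16
```

Only finitely many elements are ever computed: each `tl` forces exactly one more step of the infinite structure.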

Alexis Saurin also developed previous results on $\Lambda \mu $-calculus in a paper currently under final revision for publication in TCS.

*PTS and delimited control*
Following Danvy and Filinski's simply-typed system for a $\lambda $-calculus with
delimited control, Hugo Herbelin and Pierre Boutillier have defined a set of rules for
pure type systems with a control operator. The work relies on the CPS translation
used to encode delimited control in standard pure type systems and involves extra type
annotations. Nevertheless, it seems to be more general than previous attempts to
build classical PTS [20]. It was presented at the
workshop TPDC (Theory and Practice of Delimited Continuations) in Novi Sad, and an article is in preparation.
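The CPS encoding underlying the Danvy–Filinski operators can be sketched in a few lines: a computation of type $A$ with answer type $R$ becomes a function $(A \to R) \to R$, `reset` delimits the continuation and `shift` captures it. This is only an untyped illustration of the standard encoding, not the PTS rules themselves; the combinator names follow common usage.

```python
# CPS encoding of Danvy-Filinski reset/shift.
# A computation of type A with answer type R is a function (A -> R) -> R.
def ret(x):
    return lambda k: k(x)

def bind(m, f):
    return lambda k: m(lambda x: f(x)(k))

def reset(m):
    # Delimit: run m with the identity continuation, so captures stop here.
    return lambda k: k(m(lambda x: x))

def shift(f):
    # Capture the continuation up to the nearest reset, reified as k,
    # and run f with it (also delimited by the identity continuation).
    return lambda k: f(k)(lambda x: x)

# reset ((shift k. k 1 + k 2) * 10) = (1*10) + (2*10) = 30
prog = reset(bind(shift(lambda k: ret(k(1) + k(2))),
                  lambda x: ret(x * 10)))
assert prog(lambda x: x) == 30
```

The captured continuation `k` (here, "multiply by 10") is an ordinary function, so it can be invoked several times — the behaviour that makes delimited control strictly more expressive than exceptions.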

#### Interactive Realizability

Thanks to the Curry-Howard correspondence for classical logic, it is possible to extract programs from classical proofs. These programs use control operators to implement backtracking and processes of intelligent learning by trial and error. Unfortunately, such programs are often hard to write, difficult to understand and *very inefficient*: every time a program backtracks, it forgets far too much information. This state of affairs is due to a poor understanding of, and poor control over, the backtracking mechanism that interprets classical proofs. In order to write more efficient programs, it is necessary to describe exactly: a) what the programs learn, and b) how the knowledge of the programs varies during execution.

A first step towards this goal is the theory of Interactive Realizability, a semantics for intuitionistic Arithmetic with excluded middle over semi-decidable predicates. It is based on a notion of state, which describes the knowledge of programs coming from a classical proof, and explains how the knowledge evolves during computation.

Federico Aschieri is working in two directions. First, he is extending this realizability semantics to second-order intuitionistic Arithmetic with the same excluded middle over semi-decidable predicates. He has also discovered a new state-passing-style transformation, which makes it possible to implement in System F efficient programs that backtrack at the right point and forget nothing when backtracking. He is also investigating an interesting relation with forcing semantics: it seems that his transformation is a very direct, new constructive formulation of forcing.
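The flavour of state-based learning can be conveyed by a toy example: a realizer answers relative to a finite state of knowledge, and a wrong answer does not abort the run but enlarges the state. Everything below (the predicate `f`, the names `oracle`, `use`, `run`) is our own illustrative construction, not the actual semantics.

```python
# Illustrative sketch of the interactive-realizability idea: answers are
# relative to a state of knowledge, and failures make the state grow.
def f(n):                 # a hypothetical semi-decidable predicate
    return n == 5

def oracle(state):
    # Claim a witness of (exists n, f n) if one is known, else "none yet".
    return state.get("witness")

def use(state, probe):
    w = oracle(state)
    if w is not None:
        return ("ok", w)               # current knowledge suffices
    if f(probe):                       # the claim "no witness" was wrong:
        return ("learn", {"witness": probe})   # learn, extending the state
    return ("ok", None)                # the claim stands at this probe

def run(state, probe):
    while True:
        tag, v = use(state, probe)
        if tag == "ok":
            return v
        state = v                      # retry with the enlarged state

assert run({}, 5) == 5                 # one learning step, then success
assert run({}, 3) is None              # no counterexample found at 3
```

The key point mirrored here is that backtracking does not discard what was learned: the enlarged state survives the retry.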

Secondly, in collaboration with Berardi, he is extending Interactive Realizability to first-order Arithmetic with full excluded middle. This work promises to provide a significantly finer description of the learning mechanism that interprets classical proofs.

#### Substitutions and isomorphisms

Pierre-Louis Curien extended his collaboration with Martin Hofmann (Univ. of Munich) and Richard Garner (Macquarie University, Sydney), started in 2010, to the point where the picture sought and announced in last year's report turned out to be a bit less idyllic than expected. Let us recall the question addressed. We wanted to compare precisely two ways of giving a categorical interpretation of Martin-Löf type theory, both overcoming the following mismatch: the syntax has exact substitutions, while the categorical interpretation, in terms of pullbacks or fibrations, “implements” substitutions only up to isomorphism. One can either change the model (strictification) [36], or modify the syntax (by introducing explicit substitutions and, more importantly, explicit coercions between types that are now only isomorphic) [2]. These approaches turn out to be related through adjunctions in a suitable 2-categorical framework that has a conceptual interest of its own. But these adjunctions do not fit entirely together, as we found out in early 2011: one is base-dependent and the other is not, so we cannot directly put them side by side to obtain the nice conceptual picture we had hoped for. Still, our initial goal of expressing each interpretation in terms of the other can be attained, but this remains to be worked out in detail.

#### Miscellanea

During his three-month visit to Beijing, Pierre-Louis Curien worked on the relations between rewriting theory and the theory of Gröbner bases and of other bases, such as the Janet bases and involutive bases introduced in computer algebra. These comparisons shed some light on the classification of various completion techniques for rewriting systems (completion turns a rewriting system into an equivalent locally confluent one). This is work in progress.