The general orientation of our team is described by the short name given to it:
Special Functions, that is, particular mathematical functions that have
established names due to their importance in mathematical analysis, physics, and
other application domains. Indeed, our ambition is to study special functions with
the computer, by the combined means of computer algebra and formal methods.

Computer-algebra systems have been advertised for decades as software
for “doing mathematics by computer” [86]. For
instance, computer-algebra libraries can uniformly generate a corpus
of mathematical properties about special functions, so as to display
them on an interactive website. This possibility was recently demonstrated by the
computer-algebra component of the team [40]. Such
automated generation significantly increases the reliability of the
mathematical corpus, in comparison with the content of existing static
authoritative handbooks. The importance of the validity of these
contents can be measured by the very wide audience that such handbooks
have had, to the point that a book
like [37] remains one of the most cited
mathematical publications ever and has motivated the 10-year-long
project of writing its successor [76].
However, can the mathematics produced “by computer” be considered as
true mathematics? More specifically, whereas it is nowadays
well established that the computer helps in discovering and observing
new mathematical phenomena, can the mathematical statements produced
with the aid of the computer, and the mathematical results computed by
it, be accepted as valid mathematics, that is, as having the status of
mathematical proofs?
Beyond the reported weaknesses or
controversial design choices of mainstream computer-algebra systems,
the issue is more of an epistemological nature. It will not find its
solution even in the advent of the ultimate computer-algebra system:
the social process of peer reviewing simply falls short of evaluating
the results produced by computers, as reported by
Th. Hales [64] after the publication of his proof of
the Kepler Conjecture about sphere packing.

A natural answer to this deadlock is to move to an alternative kind of mathematical software and to use a proof assistant to check the correctness of the desired properties or formulas. The success of large-scale formalization projects, like the Four-Color Theorem of graph theory [59], the above-mentioned Kepler Conjecture [64], and the Odd Order Theorem of group theory [1], has increased the understanding of the appropriate software-engineering methods for this peculiar kind of programming. For computer algebra, this legitimates a move to proof assistants now.

The Dynamic Dictionary of Mathematical Functions
(DDMF) [40] is
an online computer-generated handbook of mathematical functions that
aims to serve as a reference for a broad range of applications.
This software was developed by the computer-algebra component of the
team as a project of the MSR–Inria Joint Centre. It builds on
Algolib, a library for the computer-algebra system Maple whose development
started 20 years ago in the project-team Algorithms. As suggested by the constant
questioning of certainty by new potential users, DDMF deserves a
formal guarantee of the correctness of its content, at a level that proof
assistants can provide. Fortunately, the maturity of
the special-functions algorithms in Algolib makes DDMF a stepping stone
for such a formalization: it provides a well-understood and unified
algorithmic treatment, without which a formal certification would
simply be unreachable.

The formal-proofs component of the team emanates from another project
of the MSR–Inria Joint Centre, namely the Mathematical Components
project (MathComp).
Since 2006, the MathComp group has endeavoured to develop
computer-checked libraries of formalized mathematics, using the
Coq proof assistant [83]. The methodological
aim of the project was to understand the design methods leading to
successful large-scale formalizations. The work culminated in 2012 with the
completion of a formal proof of the Odd Order Theorem, resulting in
the largest corpus of algebraic theories ever machine-checked with a
proof assistant, together with a whole methodology
to effectively combine these components in order to tackle complex
formalizations. In particular, these libraries provide many of the
algebraic objects needed to reason about special functions and their
properties, like rational numbers, iterated sums, polynomials, and a
rich hierarchy of algebraic structures.

The present team benefits from these recent advances to explore the formal certification of the results collected in DDMF. The aim of this project is to concentrate the formalization effort on this delimited area, building on DDMF and the Algolib library, as well as on the Coq system [83] and on the libraries developed by the MathComp project.

The following few opinions on computer algebra are, we believe, typical of computer-algebra users' doubts and difficulties when using computer-algebra systems:

As explained by the expert views above, complaints by computer-algebra users are often due to their misunderstanding of what a computer-algebra system is, namely a purely syntactic tool for calculations, which the user must complement with a semantics. Still, the robustness and consistency of computer-algebra systems are not ensured as of today, and, whatever Zeilberger may provocatively say in his Opinion 94 [88], a firmer logical foundation is necessary. Indeed, the fact is that many “bugs” in a computer-algebra system cannot be fixed by the usual debugging method of tracking down the faulty lines in the code. This is somehow “by design”: assumptions that too often remain implicit are really needed by the design of symbolic algorithms, and cannot easily be expressed in the programming languages used in computer algebra. A similar certification initiative has already been undertaken, successfully, in the domain of numerical computing [66], [43]. It is natural to undertake a similar approach for computer algebra.

Some of the mathematical objects that interest our team are still totally untouched by formalization. When implementing them and their theory inside a proof assistant, we have to deal with the pervasive discrepancy between the published literature and the actual implementation of computer-algebra algorithms. Interestingly, this forces us to clarify our computer-algebraic view on them, and possibly makes us discover holes lurking in published (human) proofs. We are therefore convinced that the close interaction of researchers from both fields, which is what we strive to maintain in this team, is a strong asset.

For a concrete example, the core of Zeilberger's creative telescoping manipulates rational functions up to simplifications. In summation applications, checking that these simplifications do not hide problematic divisions by 0 is most often left to the reader. In the same vein, in the case of integrals, the published algorithms do not check the convergence of all integrals, especially in intermediate calculations. Such checks are again left to the readers. In general, we expect to revisit the existing algorithms to ensure that they are meaningful for genuine mathematical sequences or functions, and not only for algebraic idealizations.
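To make the division-by-zero caveat concrete, here is a small Python sketch (an illustration, not part of the team's software) that checks a textbook WZ-style telescoping identity for F(n,k) = binom(n,k)/2^n. The rational certificate R(n,k) = -k/(2(n-k+1)) has a pole at k = n+1, right inside the summation range; the check therefore uses the cleared closed form G = R·F, in which the problematic division has been simplified away.

```python
from fractions import Fraction
from math import comb

def binom(n, k):
    # binomial coefficient, extended by 0 outside the Pascal triangle
    return comb(n, k) if 0 <= k <= n else 0

def F(n, k):
    return Fraction(binom(n, k), 2**n)

def G(n, k):
    # G = R * F with certificate R(n,k) = -k / (2(n-k+1)); written in
    # closed form to avoid the division by 0 at k = n + 1
    return Fraction(-binom(n, k - 1), 2**(n + 1))

# Check the telescoping identity F(n+1,k) - F(n,k) = G(n,k+1) - G(n,k).
for n in range(20):
    for k in range(n + 2):
        assert F(n + 1, k) - F(n, k) == G(n, k + 1) - G(n, k)

# Summing over k makes the right-hand side telescope to 0, which shows
# that sum_k binom(n,k)/2^n is independent of n (namely equal to 1).
assert sum(F(10, k) for k in range(11)) == 1
```

Checking such identities on finitely many points is of course only a sanity check; the point of the algorithms discussed here is to establish them symbolically, for all n and k at once.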

Another big challenge in this project originates in
the scientific difference between computer algebra and formal proofs.
Computer algebra seeks speed of calculation on concrete
instances of algebraic data structures (polynomials, matrices,
etc.). For their part, formal proofs manipulate
symbolic expressions in terms of abstract variables
understood to represent generic elements of algebraic data
structures. In view of this, a continuous challenge is
to develop the right hybrid mindset, able to
effectively manage concrete and abstract values simultaneously,
alternately computing and proving with them.

Applications in combinatorics and mathematical physics frequently involve equations of such high orders and such large sizes that computing or even storing all their coefficients is impossible on existing computers. Making this tractable is an extraordinary challenge. The approach we believe in is to design algorithms of good—ideally quasi-optimal—complexity in order to extract precisely the required data from the equations, while avoiding the computationally intractable task of completely expanding them into an explicit representation.

Typical applications with expected high impact are the automatic discovery and algorithmic proof of results in combinatorics and mathematical physics for which human proofs are currently unattainable.

The implementation of certified symbolic computations on special functions in the Coq proof assistant requires both investigating new formalization techniques and renewing the traditional computer-algebra viewpoint on these standard objects. Large mathematical objects typical of computer algebra occur during formalization, which also requires us to improve the efficiency and ergonomics of Coq. In order to feed this interdisciplinary activity with new motivating problems, we additionally pursue a research activity oriented towards experimental mathematics in application domains that involve special functions. We expect these applications to pose new algorithmic challenges to computer algebra, which in turn will deserve a formal-certification effort. Finally, DDMF is the motivation and the showcase of our progress on the certification of these computations. While striving to provide a formal guarantee of the correctness of the information it displays, we remain keen on enriching its mathematical content by developing new computer-algebra algorithms.

Our formalization effort consists in organizing a cooperation between a computer-algebra system and a proof assistant. The computer-algebra system is used to produce algebraic data efficiently, which are later processed by the proof assistant. The success of this cooperation relies on the design of appropriate libraries of formalized mathematics, including certified implementations of certain computer-algebra algorithms. In the other direction, we expect that scrutinizing the implementation and the output of computer-algebra algorithms will shed new light on their semantics and on their correctness proofs, and help clarify their documentation.

The appropriate framework for the study of efficient algorithms for
special functions is algebraic.
Representing algebraic theories as Coq formal libraries
benefits from the methodology that emerged from the success of
ambitious projects like the formal proof of a major classification
result in finite-group theory (the Odd Order
Theorem) [58].

Yet, a number of the objects we need to formalize in the present context have never been investigated using any interactive proof assistant, despite being considered commonplace in computer algebra. For instance, there is to our knowledge no available formalization of the theory of non-commutative rings, of the algorithmic theory of special-functions closures, or of the asymptotic study of special functions. We expect our future formal libraries to prove broadly reusable in later formalizations of seemingly unrelated theories.

Another peculiarity of the mathematical objects we are going to manipulate
with the Coq system is their size. In order to provide a formal guarantee
on the data displayed by DDMF, two related axes of research have to be
pursued.
First, efficient algorithms dealing with these large objects have
to be programmed and run in Coq.
Recent evolutions of the Coq system to improve the efficiency of
its internal computations [38], [41] make this objective
reachable. Still, how to combine the aforementioned formalization
methodology with these cutting-edge evolutions of Coq remains
one of the prospective aspects of our project.
A second need is to help users interactively
manipulate large expressions occurring in their conjectures, an objective
for which little has been done so far. To address this need,
we work on improving the ergonomics of the system
in two ways: first, enhancing the reactivity of Coq in its interaction
with the user; second, designing and implementing extensions of its
interface to ease our formalization activity. We expect the outcome of
these lines of research to be useful to a wider audience, interested in
manipulating large formulas on topics possibly unrelated to special functions.

Our algorithm certifications inside Coq aim to simulate
well-identified components of our Maple packages, possibly by
reproducing them in Coq. It would however not be judicious to
re-implement them inside Coq in a systematic way. Indeed, for a number of these
components, the output of the algorithm is more easily checked than
found, as is the case, for instance, when solving a linear system.
Rather, we delegate the discovery of the solutions to an
external, untrusted oracle like Maple. Trusted computations inside
Coq then formally validate the correctness of the a priori
untrusted output. More often than not, this validation consists in
implementing and executing normalization procedures inside
Coq. A challenge of this automation is to make sure that these procedures scale
while remaining efficient, which requires Coq versions of
non-trivial computer-algebra algorithms. A first, archetypal example we expect to
work on is a non-commutative generalization of the normalization
procedure for elements of rings [63].
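The “easier to check than to find” pattern can be sketched in a few lines of Python (an illustration of the architecture only, not of the actual Maple/Coq code): an untrusted routine plays the role of the external oracle, while the trusted part merely re-checks the answer by a matrix–vector product over exact rationals.

```python
from fractions import Fraction

def untrusted_solve(A, b):
    """Oracle: plain Gaussian elimination, standing in for an external
    computer-algebra system such as Maple.  Its output is NOT trusted."""
    n = len(A)
    M = [[Fraction(x) for x in row] + [Fraction(y)]
         for row, y in zip(A, b)]
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        M[col] = [x / M[col][col] for x in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                M[r] = [x - M[r][col] * y for x, y in zip(M[r], M[col])]
    return [M[r][n] for r in range(n)]

def check(A, b, x):
    """Trusted step: verifying A·x = b is a mere matrix-vector product,
    much simpler than the elimination that produced x."""
    return all(sum(Fraction(aij) * xj for aij, xj in zip(row, x)) == bi
               for row, bi in zip(A, b))

A = [[2, 1], [1, 3]]
b = [5, 10]
x = untrusted_solve(A, b)
assert check(A, b, x)            # the oracle's answer is validated
assert not check(A, b, [1, 1])   # a wrong candidate is rejected
```

In the formal setting, only `check` needs to be implemented and run inside the proof assistant; the efficiency of the oracle is enjoyed without having to trust it.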

Generally speaking, we design algorithms for manipulating special functions symbolically, whether univariate or with parameters, and for extracting algorithmically any kind of algebraic and analytic information from them, notably asymptotic properties. Beyond this, the heart of our research is concerned with parametrised definite summations and integrations. These very expressive operations have far-ranging applications, for instance, to the computation of integral transforms (Laplace, Fourier) or to the solution of combinatorial problems expressed via integrals (coefficient extractions, diagonals). The algorithms that we design for them need to operate at the level of linear functional systems, both differential and of recurrence type. In all cases, we strive to design our algorithms with the constant goal of good theoretical complexity, and we observe that they are also fast in practice.
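A classic, concrete instance of a diagonal (illustrated here in Python; the example is textbook material, not taken from the team's software) is the diagonal of the bivariate rational series 1/(1 - x - y): its diagonal coefficients are the central binomial coefficients, and, although the series itself is merely rational, the diagonal satisfies a linear recurrence, i.e., it is D-finite.

```python
from math import comb

# Coefficients of 1/(1 - x - y): c[i][j] = binom(i+j, i), generated from
# the defining relation c(i,j) = c(i-1,j) + c(i,j-1).
N = 12
c = [[0] * (N + 1) for _ in range(N + 1)]
c[0][0] = 1
for i in range(N + 1):
    for j in range(N + 1):
        if i or j:
            c[i][j] = (c[i-1][j] if i else 0) + (c[i][j-1] if j else 0)

diag = [c[n][n] for n in range(N + 1)]
assert diag == [comb(2*n, n) for n in range(N + 1)]  # central binomials

# The diagonal is D-finite: it satisfies the first-order recurrence
#   (n + 1) d(n+1) = 2 (2n + 1) d(n).
for n in range(N):
    assert (n + 1) * diag[n + 1] == 2 * (2*n + 1) * diag[n]
```

The algorithms alluded to in the text compute such recurrences directly from the rational (or algebraic, or D-finite) input, without expanding the series.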

Our long-term goal is to design fast algorithms for a general method
for special-function integration (creative telescoping), and
make them applicable to general special-function inputs. Still, our
strategy is to proceed with simpler, more specific classes first
(rational functions, then algebraic functions, hyperexponential
functions, D-finite functions, non-D-finite functions; two variables,
then many variables); likewise, we isolate analytic questions by
first considering types of integration with a more purely algebraic
flavor (constant terms, algebraic residues, diagonals arising in
combinatorics). In particular, we expect to extend our recent
approach [46] to more general classes
(algebraic functions with nested radicals, for example): the idea is to speed up
calculations by making use of an analogue of Hermite reduction that avoids
considering certificates.
Homologous problems for summation will be addressed as well.

As a consequence of our complexity-driven approach to algorithm design, the algorithms mentioned in the previous paragraph are of good complexity. Therefore, they naturally help us deal with applications that involve equations of high orders and large sizes.

With regard to combinatorics, we expect to advance the algorithmic classification of combinatorial classes like walks and urns. Here, the goal is to determine whether enumerative generating functions are rational, algebraic, or D-finite, for example. Physical problems whose modelling involves special-function integrals comprise the study of models of statistical mechanics, like the Ising model for ferromagnetism, or questions related to Hamiltonian systems.

Number theory is another promising domain of applications. Here, we attempt an experimental approach to the automated certification of integrality of the coefficients of mirror maps for Calabi–Yau manifolds. This could also involve the discovery of new Calabi–Yau operators and the certification of the existing ones. We also plan to algorithmically discover and certify new recurrences yielding good approximants needed in irrationality proofs.

It is to be noted that in all of these application domains, we have so far used general algorithms, as was done in earlier works of ours [45], [49], [48]. To push the scale of applications further, we plan to consider in each case the specifics of the application domain to tailor our algorithms.

In continuation of our past project of an encyclopedia at
http://

Computer algebra manipulates symbolic representations of exact mathematical objects in a computer, in order to perform computations and operations like simplifying expressions and solving equations for “closed-form expressions”. The manipulations are often fundamentally of an algebraic nature, even when the ultimate goal is analytic. Efficiency is a particular issue in computer algebra, owing to the extreme swell of the intermediate values during calculations.

Our view on the domain is that research on the algorithmic manipulation of special functions is anchored between two paradigms:

It aims at four kinds of algorithmic goals:

This interacts with three domains of research:

This view is made explicit in the present section.

Numerous special functions satisfy linear differential and/or
recurrence equations. Under a mild technical condition, the existence
of such equations induces a finiteness property that makes the main
properties of the functions decidable. We thus speak of
D-finite functions. For example, 60% of the chapters in the
handbook [37] describe D-finite functions.
In addition, the class is closed under a rich set of algebraic operations.
This makes linear functional equations just the right data structure
to encode and manipulate special functions. The power of this
representation was observed in the early
1990s [87], leading to the design of many
algorithms in computer algebra.
Both on the theoretical and algorithmic sides, the study of D-finite
functions shares much with neighbouring mathematical domains:
differential algebra,
D-module theory,
differential Galois theory,
as well as their counterparts for recurrence equations.
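As a minimal illustration of the “equations as data structure” idea (a sketch in Python with exact rational arithmetic, not taken from Algolib or DDMF), the D-finite function cos is entirely encoded by the finite data y'' + y = 0, y(0) = 1, y'(0) = 0: translating the equation onto power-series coefficients yields a linear recurrence from which arbitrarily many exact Taylor coefficients unfold.

```python
from fractions import Fraction
from math import factorial

# The ODE y'' + y = 0 with y(0) = 1, y'(0) = 0 translates, on Taylor
# coefficients c(n), into the recurrence (n+2)(n+1) c(n+2) = -c(n):
# a finite encoding from which the whole expansion is computable.
N = 20
c = [Fraction(1), Fraction(0)] + [Fraction(0)] * N
for n in range(N):
    c[n + 2] = -c[n] / ((n + 2) * (n + 1))

# Cross-check against the closed form c(2m) = (-1)^m / (2m)!.
for n in range(0, N, 2):
    assert c[n] == Fraction((-1) ** (n // 2), factorial(n))
    assert c[n + 1] == 0
```

The same unfolding works verbatim for any D-finite function, including those, like Airy functions or Bessel functions, whose coefficients have no such simple closed form.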

Differential/recurrence equations that define special functions can be
recombined [87] to define: additions and
products of special functions; compositions of special functions;
integrals and sums involving special functions. Zeilberger's fast
algorithm for obtaining recurrences satisfied by parametrised binomial
sums was developed as early as the beginning of the 1990s [89].
It is the basis of all modern definite summation and integration
algorithms. The theory was made fully rigorous and algorithmic in
later works, mostly by a group at RISC (Linz, Austria) and by members
of the team [77], [85], [52], [50], [51], [71].
The past ÉPI Algorithms contributed several implementations
(gfun [80], Mgfun [52]).

Encoding special functions by their defining linear functional equations postpones some of the difficulty of the problems to a delayed solving of equations. But at the same time, solving (for special classes of functions) is a sub-task of many algorithms on special functions, especially so when solving in terms of polynomial or rational functions. A lot of work was done in this direction in the 1990s; since the 2000s, solving differential and recurrence equations in terms of special functions has also been investigated more intensively.

A major conceptual and algorithmic difference exists for numerical
calculations between data structures that fit on a machine word and
data structures of arbitrary length, that is, multi-precision
arithmetic. When multi-precision floating-point numbers became
available, early works on the evaluation of special functions merely
promised that “most” digits in the output were correct, and
proceeded by heuristically increasing precision during intermediate
calculations, without intended rigour. The original theory
has evolved in a
twofold way since the 1990s:
by making computable all constants hidden in asymptotic
approximations, it became possible to guarantee a prescribed
absolute precision; by employing state-of-the-art algorithms on
polynomials, matrices, etc, it became possible to have evaluation
algorithms in a time complexity that is linear in the output size, with a
constant that is not more than a few units.
On the implementation side, several original works
exist, one of which (NumGfun [75]) is
used in our DDMF.
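The contrast between heuristic and guaranteed precision can be illustrated on the simplest possible example (a Python sketch, unrelated to the actual multi-precision implementations cited above): evaluating e = exp(1) from its Taylor series with an explicit, computable tail bound, so that the output carries a proved absolute-error guarantee rather than a hope that “most digits are correct”.

```python
from fractions import Fraction
import math

def exp1_to(d):
    """Approximate e = exp(1) with a guaranteed absolute error <= 10^(-d),
    using exact rational partial sums and the computable tail bound
    sum_{k>n} 1/k! <= 2/(n+1)!, instead of a heuristic working precision."""
    target = Fraction(1, 10 ** d)
    s = term = Fraction(1)      # partial sum and current term 1/n!
    n = 0
    while 2 * term / (n + 1) > target:   # 2/(n+1)! bounds the remaining tail
        n += 1
        term /= n
        s += term
    return s

q = exp1_to(15)
assert abs(float(q) - math.e) < 1e-13   # double precision sanity check
```

The guaranteed algorithms discussed in the text do exactly this, but for general D-finite functions and with quasi-linear complexity in the number of digits.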

“Differential approximation”, or “guessing”, is an operation that produces an ODE likely to be satisfied by a given approximate series expansion of an unknown function. This has been used at least since the 1970s and is a keystone in spectacular applications in experimental mathematics [49]. All this is based on subtle algorithms for Hermite–Padé approximants [39]. Moreover, guessing can at times be complemented by proven quantitative results that turn the heuristics into an algorithm [47]. This is a promising algorithmic approach that deserves more attention than it has received so far.
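The flavor of guessing can be conveyed by a deliberately tiny Python example (real guessers handle linear equations with polynomial coefficients via Hermite–Padé approximants; this sketch only fits a constant-coefficient recurrence of order 2): a candidate recurrence is solved for from the first terms of a sequence, then validated on all remaining terms, yielding a guess, not a proof.

```python
from fractions import Fraction

def guess_order2(terms):
    """Guess a(n+2) = p*a(n+1) + q*a(n) from the first four terms, then
    validate the guess on all remaining terms.  A toy instance of
    'guessing': the result is a plausible candidate, not a theorem."""
    a = [Fraction(t) for t in terms]
    # two linear equations in the unknowns p, q, from n = 0 and n = 1
    det = a[1] * a[1] - a[2] * a[0]
    if det == 0:
        return None                      # degenerate data, no unique guess
    p = (a[2] * a[1] - a[3] * a[0]) / det
    q = (a[1] * a[3] - a[2] * a[2]) / det
    # the guessed recurrence must hold on every available term
    if all(a[n + 2] == p * a[n + 1] + q * a[n] for n in range(len(a) - 2)):
        return p, q
    return None

fib = [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
assert guess_order2(fib) == (1, 1)       # recovers Fibonacci's recurrence
```

The quantitative results mentioned above are what can, in favorable cases, convert such a validated guess into a certified equation.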

The main concern of computer algebra has long been to prove the feasibility of a given problem, that is, to show the existence of an algorithmic solution for it. However, with the advent of faster and faster computers, complexity results have ceased to be of theoretical interest only. Nowadays, a large body of work in computer algebra is interested in developing fast algorithms, with time complexity as close as possible to linear in their output size. After most of the more pervasive objects like integers, polynomials, and matrices had been endowed with fast algorithms for the main operations on them [90], the community, including ourselves, started to turn its attention to differential and recurrence objects in the 2000s. The subject is still not as developed as in the commutative case, and a major challenge remains to understand the combinatorics behind summation and integration. On the methodological side, several paradigms occur repeatedly in fast algorithms: “divide and conquer” to balance calculations, “evaluation and interpolation” to avoid intermediate swell of data, etc. [44].
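The “divide and conquer to balance calculations” paradigm can be shown on the simplest of examples (a generic Python sketch, not team code): computing a factorial by a balanced product tree instead of a left-to-right loop. With fast integer multiplication, multiplying two operands of comparable sizes is far cheaper overall than repeatedly multiplying a huge accumulator by a small factor.

```python
import math

def prod_range(lo, hi):
    """Product lo * (lo+1) * ... * hi by divide and conquer: the recursion
    keeps the two operands of every multiplication of comparable size,
    which is what lets fast integer multiplication pay off."""
    if lo > hi:
        return 1
    if lo == hi:
        return lo
    mid = (lo + hi) // 2
    return prod_range(lo, mid) * prod_range(mid + 1, hi)

def fact(n):
    return prod_range(1, n)

assert fact(1000) == math.factorial(1000)
```

The same balancing idea underlies binary splitting for recurrences, the workhorse of fast multi-precision evaluation of D-finite functions.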

Handbooks collecting mathematical properties aim at serving as reference, and therefore trusted, documents. The decision of several authors or maintainers of such knowledge bases to move from paper books [37], [76], [81] to websites and wikis allows for a more collaborative effort in proofreading. Another step toward further confidence is to manage to generate the content of an encyclopedia by computer-algebra programs, as is the case with the Wolfram Functions Site or DDMF. Yet, due to the lingering doubts about computer-algebra systems, some encyclopedias propose both cross-checking by different systems and handwritten companion paper proofs of their content. As of today, no encyclopedia is certified with formal proofs.

Several attempts have been made to extend existing computer-algebra systems with symbolic manipulations of logical formulas. Yet, these works are more about extending the expressivity of computer-algebra systems than about improving the standards of correctness and semantics of the systems. Conversely, several projects have addressed the communication of a proof system with a computer-algebra system, resulting in increased automation available in the proof system, at the price of uncertainty about the computations performed by this oracle.

More ambitious projects have tried to design a new computer-algebra system providing an environment where the user can both program efficiently and elaborate formal, machine-checked proofs of correctness, by calling a general-purpose proof assistant like the Coq system. This approach requires huge manpower and a daunting effort to re-implement a complete computer-algebra system, as well as the libraries of formal mathematics required by such formal proofs.

The move to machine-checked proofs of the mathematical correctness of the output of computer-algebra implementations demands a prior clarification about the often implicit assumptions on which the presumably correctly implemented algorithms rely. Interestingly, this preliminary work, which could be considered as independent from a formal certification project, is seldom precise or even available in the literature.

A number of authors have investigated ways to organize the communication of a chosen computer-algebra system with a chosen proof assistant in order to certify specific components of the computer-algebra system, experimenting with various combinations of systems and various formats for mathematical exchanges. Another line of research consists in the implementation and certification of computer-algebra algorithms inside the logic [84], [63], [72] or as a proof-automation strategy. Normalization algorithms are of special interest when they allow one to check results possibly obtained by an external computer-algebra oracle [56]. A discussion about the systematic separation of the search for a solution and the checking of the solution is already clearly outlined in [69].

Significant progress has been made in the certification of numerical applications by formal proofs. Libraries formalizing and implementing floating-point arithmetic as well as large numbers and arbitrary-precision arithmetic are available. These libraries are used to certify floating-point programs, implementations of mathematical functions and for applications like hybrid systems.

To be checked by a machine, a proof needs to be expressed in a constrained, relatively simple formal language. Proof assistants provide facilities to write proofs in such languages. But, as merely writing a proof down, even in a formal language, does not per se constitute a formal proof, proof assistants also provide a proof checker: a small and well-understood piece of software in charge of verifying the correctness of arbitrarily large proofs. The gap between the low-level formal language a machine can check and the sophistication of an average page of mathematics is conspicuous and unavoidable. Proof assistants try to bridge this gap by offering facilities, like notations or automation, to support convenient formalization methodologies. Indeed, many aspects, from the logical foundation to the user interface, play an important role in the feasibility of formalized mathematics inside a proof assistant.

While many logical foundations for mathematics have been proposed, studied, and implemented, type theory is the one that has been most successfully employed to formalize mathematics, with the notable exception of the Mizar system [73], which is based on set theory. In particular, the calculus of constructions (CoC) [54] and its extension with inductive types (CIC) [55] have been studied for more than 20 years and implemented by several independent tools (like Lego, Matita, and Agda). Its reference implementation, Coq [83], has been used for several large-scale formalization projects (formal certification of a compiler back-end; the Four-Color Theorem). Improving the type theory underlying the Coq system remains an active area of research. Other systems based on different type theories do exist and, whilst being more oriented toward software verification, have also been used to verify results of mainstream mathematics (the Prime Number Theorem; the Kepler Conjecture).

The most distinguishing feature of CoC is that computation is promoted to the status of rigorous logical argument. Moreover, in its extension CIC, we can recognize the key ingredients of a functional programming language like inductive types, pattern matching, and recursive functions. Indeed, one can program effectively inside tools based on CIC like Coq. This possibility has paved the way to many effective formalization techniques that were essential to the most impressive formalizations made in CIC.

Another milestone in the promotion of the computations-as-proofs feature of Coq has been the integration of compilation techniques in the system to speed up evaluation. Coq can now run realistic programs in the logic, and hence easily incorporates calculations into proofs that demand heavy computational steps.

Because of their different choice for the underlying logic, other proof assistants have to simulate computations outside the formal system, and indeed fewer attempts to formalize mathematical proofs involving heavy calculations have been made in these tools. The one notable exception, the Kepler Conjecture, whose formal proof was finished in 2014, required significant work to optimize the rewriting engine that simulates evaluation in Isabelle/HOL.

Programs run and proved correct inside the logic are especially useful for the conception of automated decision procedures. To this end, inductive types are used as an internal language for the description of mathematical objects by their syntax, thus enabling programs to reason and compute by case analysis and recursion on symbolic expressions.
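A miniature of this “syntax as inductive type” idea can be sketched in Python (the real technique lives inside the logic of a proof assistant; this is only an analogy): ring expressions are reified as syntax trees, and a normalization program maps them to a canonical form, so that equality of two expressions is decided purely by computation.

```python
# Reflection in miniature: expressions are reified as trees, and a
# program normalizes them to a canonical form (a dictionary mapping
# monomials to coefficients); two expressions are equal as polynomials
# exactly when their normal forms coincide.
from collections import Counter

def var(name):   return ('var', name)
def const(c):    return ('const', c)
def add(e1, e2): return ('add', e1, e2)
def mul(e1, e2): return ('mul', e1, e2)

def norm(e):
    """Normal form: {monomial (sorted tuple of variables): coefficient}."""
    tag = e[0]
    if tag == 'const':
        return {(): e[1]} if e[1] else {}
    if tag == 'var':
        return {(e[1],): 1}
    n1, n2 = norm(e[1]), norm(e[2])
    if tag == 'add':
        out = Counter(n1)
        out.update(n2)
        return {m: c for m, c in out.items() if c}
    # tag == 'mul': distribute, merging multiplied monomials
    out = Counter()
    for m1, c1 in n1.items():
        for m2, c2 in n2.items():
            out[tuple(sorted(m1 + m2))] += c1 * c2
    return {m: c for m, c in out.items() if c}

x, y = var('x'), var('y')
# (x + y)^2 = x^2 + 2xy + y^2, decided by mere computation
lhs = mul(add(x, y), add(x, y))
rhs = add(mul(x, x), add(mul(const(2), mul(x, y)), mul(y, y)))
assert norm(lhs) == norm(rhs)
```

In Coq, the analogue of `norm` is proved correct once and for all, after which each concrete equality is established by running it, rather than by a hand-built chain of rewriting steps.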

The output of complex and optimized programs external
to the proof assistant can also be stamped with a formal proof of
correctness when their result is easier to check than to
find. In that case, one can benefit from their efficiency
without compromising the level of confidence in their output, at the
price of writing and certifying a
checker inside the logic. This approach, which has been successfully
used in various contexts,
is very relevant to the present research project.

Representing abstract algebra in a proof assistant has long been studied. The libraries developed by the MathComp project for the proof of the Odd Order Theorem provide a rather comprehensive hierarchy of structures; however, they originally feature a large number of instances of structures that they need to organize. On the methodological side, this hierarchy is an incarnation of an original work [58] based on various mechanisms, primarily type inference, typically employed in the area of programming languages. A large amount of information that is implicit in handwritten proofs, and that must become explicit at formalization time, can be systematically recovered following this methodology.

Small-scale reflection [60]
is another methodology promoted by the MathComp project.
Its ultimate goal is to ease formal proofs by systematically
dealing with as many bureaucratic steps as possible
by automated computation.
For instance, as opposed to the style advocated by Coq's standard
library, decidable predicates are systematically represented
using computable boolean functions: comparison on integers
is expressed as a program, and stating that one number is smaller
than another amounts to stating that this comparison program returns
the boolean true, a fact the proof assistant can establish by mere computation.

The MathComp library was consistently designed after uniform principles of software engineering. These principles range from simple ones, like naming conventions, to more advanced ones, like generic programming, resulting in a robust and reusable collection of formal mathematical components. This large body of formalized mathematics covers a broad range of algebraic theories, including of course advanced topics of finite group theory, but also linear algebra, commutative algebra, Galois theory, and representation theory. We refer the interested reader to the online documentation of these libraries [82], which represent about 150,000 lines of code and include roughly 4,000 definitions and 13,000 theorems.

Topics not addressed by these libraries and that might be relevant to the present project include real analysis and differential equations. The most advanced formalization work on these domains is available in the HOL Light system [65], [67], [68], although some existing developments of interest [42], [74] are also available for Coq. Another aspect of the MathComp libraries that needs improvement, owing to the size of the data we manipulate, is the connection with efficient data structures and implementations, which is only starting to be explored.

The user of a proof assistant describes the proof they want to formalize in the system using a textual language. Depending on the peculiarities of the formal system and the application domain, different proof languages have been developed. Some proof assistants promote the use of a declarative language, whereas the Coq and Matita systems are more oriented toward a procedural style.

The development of the large, consistent body of MathComp libraries has prompted the need to design an alternative and coherent language extension for the Coq proof assistant 62, 61, reinforcing the robustness of proof scripts against the numerous changes induced by code refactoring and enhancing the support for the methodology of small-scale reflection.

The development of large libraries is quite a novelty for the Coq system. In particular, any long-term development process requires iterating many refactoring steps, for which very little support is provided by most proof assistants, with the notable exception of Mizar 79. For the Coq system, this is an active area of research.

Our expertise in computer algebra and complexity-driven design of algebraic algorithms has applications in various domains, including:

In 24, Alin Bostan and Ryuhei Mori (Tokyo
Institute of Technology, Japan) designed a simple and fast algorithm for
computing the N-th term of a linearly recurrent sequence.
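Assuming the algorithm in question is the Bostan–Mori index-halving scheme for extracting the N-th coefficient of a rational power series P(x)/Q(x), its structure can be sketched in Python as follows (naive quadratic polynomial products are used for clarity; the fast version relies on FFT-based multiplication):

```python
from fractions import Fraction

def polymul(a, b):
    # Naive quadratic product of coefficient lists (FFT in the fast version).
    r = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            r[i + j] += x * y
    return r

def coeff(P, Q, n):
    # n-th coefficient of the power series P(x)/Q(x), with Q(0) != 0,
    # by index halving: P(x)/Q(x) = P(x)Q(-x) / (Q(x)Q(-x)), where the
    # denominator Q(x)Q(-x) is an even polynomial V(x^2).
    while n > 0:
        Qm = [(-1) ** i * c for i, c in enumerate(Q)]  # Q(-x)
        U = polymul(P, Qm)                             # P(x) Q(-x)
        P = U[n % 2::2]      # keep the part whose parity matches n
        Q = polymul(Q, Qm)[::2]   # V(x), where V(x^2) = Q(x)Q(-x)
        n //= 2
    return Fraction(P[0], Q[0])
```

For instance, with P = [0, 1] and Q = [1, -1, -1], the rational generating function of the Fibonacci numbers, `coeff(P, Q, 10)` returns 55.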

In 1977, Strassen invented a famous baby-step/giant-step algorithm that computes the factorial N! modulo an integer in a number of arithmetic operations roughly proportional to the square root of N.
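The baby-step/giant-step structure can be sketched as follows (a minimal Python transcription of the block decomposition; the actual square-root-type complexity requires fast multipoint evaluation of the block polynomial, which this sketch replaces by direct loops):

```python
import math

def factorial_mod(N, p):
    # Split the product N! into about sqrt(N) blocks of sqrt(N) factors.
    s = math.isqrt(N)

    def f(x):
        # Block polynomial f(x) = (x+1)(x+2)...(x+s), evaluated directly here;
        # Strassen evaluates it at all giant steps at once by fast multipoint
        # evaluation, which is the source of the speedup.
        r = 1
        for i in range(1, s + 1):
            r = r * (x + i) % p
        return r

    result = 1
    for j in range(s):                   # giant steps: block j covers
        result = result * f(j * s) % p   # the factors (j*s+1) .. (j+1)*s
    for k in range(s * s + 1, N + 1):    # leftover factors beyond s^2
        result = result * k % p
    return result
```

For example, `factorial_mod(10, 101)` agrees with 10! mod 101.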

A previous article by Alin Bostan and colleagues last year described explicit expressions for the coefficients of the
order-linear arithmetic complexity, which is
faster than for arbitrary polynomials. The result is obtained as a
consequence of the amazing though seemingly unnoticed fact that these
subresultants are scalar multiples of Jacobi polynomials up to an affine
change of variables.

In 11, Alin Bostan together with Vladica Andrejić
(University of Belgrade, Serbia) and Milos Tatarevic (CoinList, Alameda, CA)
presented improved algorithms for computing the left factorial residues

If a linear differential operator with rational function coefficients is
reducible, its factors may have coefficients with numerators and denominators
of very high degree. When the base field is

With Peter Bürgisser (Technische Universität Berlin, Germany) and Felipe Cucker (City University of Hong Kong), Pierre Lairez has developed an algorithm to compute an approximate zero of a polynomial system given as a black-box evaluation function. Based on this, they study the average complexity of solving polynomial systems with low evaluation complexity.

Previous average complexity analyses of numerical algorithms to solve polynomial systems assume a dense model of random polynomials, far from the applications. In the work 31, Pierre Lairez and his colleagues deal with a model of random polynomials (random algebraic branching programs, ABP) indexed by the evaluation complexity and coming from a universal model of computation in algebraic complexity theory. They show that their algorithm performs

Given two numbers

Divide-and-conquer recurrences relate values of a given sequence at indices of the form
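A classical example of such a recurrence, given here purely as an illustration (it is not taken from the cited work), is the binary sum-of-digits sequence, which relates the values at indices 2n and 2n+1 to the value at index n:

```python
from functools import lru_cache

# s(n) = number of 1s in the binary expansion of n satisfies the
# divide-and-conquer recurrence s(2n) = s(n), s(2n+1) = s(n) + 1, s(0) = 0.
@lru_cache(maxsize=None)
def s(n):
    if n == 0:
        return 0
    return s(n // 2) + n % 2

print([s(n) for n in range(8)])  # [0, 1, 1, 2, 1, 2, 2, 3]
```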

With Antonio Jiménez-Pastor, who was visiting the team during his doctoral preparation, Alin Bostan, Frédéric Chyzak, and Pierre Lairez worked on a new software library designed to work with generating functions that count walks in the quarter plane. With this library for the SageMath system, they offer a cohesive package that brings together all the required procedures for manipulating these generating functions, as well as a unified interface to deal with them. They also display results computed by this package on a public webpage. They presented their work as an extended abstract at the conference ISSAC 14.

A power series is called differentially transcendent if it does not
satisfy any algebraic differential equation. In 2003, Martin Klazar proved in
an elementary but very clever way that the ordinary generating function of the
famous combinatorial Bell numbers, counting partitions of sets, is
differentially transcendent. In 27, Alin Bostan (of
the team), together with Lucia Di Vizio (CNRS, Université de Versailles) and
Kilian Raschel (CNRS, Université de Tours), showed that Klazar's result is
an instance of a general phenomenon that can be proven in a compact way using
difference Galois theory. In the paper, they present the main
principles of this theory in order to prove a general result of differential
transcendence, which they apply to many other (infinite classes of) examples of
generating functions, including as very special cases the ones considered
by Klazar. Most of their examples belong to Sheffer's class, well studied
notably in umbral calculus. They all bring concrete evidence in support of the
Pak-Yeliussizov conjecture, according to which a sequence whose ordinary
and exponential generating functions both satisfy nonlinear differential
equations with polynomial coefficients necessarily satisfies a linear
recurrence with polynomial coefficients.

In 16, Alin Bostan and Antonio
Jiménez-Pastor (U. Linz, Austria) proved that the exponential generating
function of labelled trees,

Frédéric Chyzak and Karen Yeats (University of Waterloo, Canada) have studied the enumeration by length of several walk models on the square lattice. In the work 32 published this year, they obtain bijections between walks in the upper half-plane returning to the

In 22, Alin Bostan together with Arnaud Carayol, Florent Koechlin and Cyril Nicaud (Univ. Marne-la-Vallée, France) investigated the connection between properties of formal languages and properties of their generating series, with a focus on the class of holonomic power series. They first proved a strong version of a conjecture by Castiglione and Massazza: weakly-unambiguous Parikh automata are equivalent to unambiguous two-way reversal-bounded counter machines, and their multivariate generating series are holonomic. They then showed that the converse is not true: they constructed a language whose generating series is algebraic (thus holonomic), but which is inherently weakly-ambiguous as a Parikh automaton language. Finally, they proved an effective decidability result for the inclusion problem for weakly-unambiguous Parikh automata, and provided an upper bound on its complexity.

In the past fifteen years, the enumeration of lattice walks with steps taken
in a prescribed set and confined to a given cone, especially the first
quadrant of the plane, has been intensely studied. As a result, the generating
functions of quadrant walks are now well-understood, provided the allowed
steps are small. In particular, having small steps is crucial for the
definition of a certain group of birational transformations of the plane. It
has been proved that this group is finite if and only if the corresponding
generating function is D-finite. This group is also the key to the uniform
solution of 19 of the 23 small step models possessing a finite group. In
contrast, almost nothing was known for walks with arbitrary steps.
In 12, Alin Bostan together with Mireille
Bousquet-Mélou (CNRS, Bordeaux) and Stephen Melczer (U. Pennsylvania,
Philadelphia, USA) extended the definition of the group, or rather of the
associated orbit, to this general case, and generalized the above uniform
solution of small step models. When this approach works, it invariably yields
a D-finite generating function. They applied it to many quadrant problems,
including some infinite families.
After developing the general theory, the authors of 12
considered the

Beaton, Owczarek and Xu (2019) studied generating functions of Kreweras walks
and of reverse Kreweras walks in the quarter plane, with interacting
boundaries. They proved that for the reverse Kreweras step set, the generating
function is always algebraic, and for the Kreweras step set, the generating
function is always D-finite. However, apart from the particular case where the
interactions are symmetric in

Alin Bostan contributed to an article by C. Boutillier (Sorbonne Université) and
K. Raschel (CNRS, Université de Tours)
19, devoted to the study of random walks on
isoradial graphs. Contrary to the lattice case, isoradial graphs are not
translation invariant, do not admit any group structure and are spatially
non-homogeneous. However, Boutillier and Raschel have been able to obtain
analogues of a celebrated result by Ney and Spitzer (1966) on the so-called
Martin kernel (ratio of Green functions started at different points).
Alin Bostan provided in the Appendix two different proofs of the fact that
some algebraic power series arising in this context have non-negative
coefficients.

A small subset of combinatorial sequences have coefficients that can be
represented as moments of a nonnegative measure on the nonnegative real axis;
such sequences are called Stieltjes moment sequences. They have a number
of useful properties, such as log-convexity, which in turn enables one to
rigorously bound their growth constant from below.
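As an illustration (the Catalan numbers, a classical Stieltjes moment sequence, serve here as an example not taken from the report), log-convexity makes the consecutive ratios nondecreasing, so each ratio is a rigorous lower bound on the growth constant:

```python
from math import comb

def catalan(n):
    # The Catalan numbers form a classical Stieltjes moment sequence.
    return comb(2 * n, n) // (n + 1)

seq = [catalan(n) for n in range(21)]

# Log-convexity: a(n-1) * a(n+1) >= a(n)^2 for all n.
assert all(seq[n - 1] * seq[n + 1] >= seq[n] ** 2 for n in range(1, 20))

# Hence the ratios a(n+1)/a(n) are nondecreasing, and each is a rigorous
# lower bound on the growth constant (which equals 4 for Catalan numbers).
bound = seq[20] / seq[19]   # 26/7, roughly 3.714
```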

In 15, Alin Bostan together with Andrew Elvey Price (Université de Bordeaux),
Anthony Guttmann (University of Melbourne), and Jean-Marie Maillard (Sorbonne Université), studied some classical sequences in
enumerative combinatorics, denoted

They showed that the densities for

As a bonus, they studied the challenging case of the

Alin Bostan contributed to F. Chapoton's article 20 by
writing an appendix, which allowed the author to complete his article. The
theme of 20 is the study of simplicial complexes in
algebraic combinatorics. A basic invariant is the

In 13, Alin Bostan together with Fernando Chamizo (Universidad Autónoma de Madrid and ICMAT, Spain) and Mikael Persson Sundqvist (Lund University, Sweden) gave three elementary proofs of a nice equality of definite integrals, recently proven by Ekhad, Zeilberger and Zudilin. The equality arises in the theory of bivariate hypergeometric functions, and has connections with irrationality proofs in number theory. They furthermore provided a generalization, together with an equally elementary proof, and discussed some consequences.

There are many viewpoints on algebraic power series, ranging from the abstract ring-theoretic notion of Henselization to the very explicit perspective as diagonals of certain rational functions. Denef and Lipshitz proved in 1987 that any algebraic power series in

In 30, Alin Bostan together with his PhD student Sergey Yurkevich proved that the diagonal of any finite product of algebraic functions of the form

is a generalized hypergeometric function, and they provided an explicit description of its parameters. The particular case

In collaboration with R. Iasnogorodski (SPCPA, Saint-Petersburg), Guy Fayolle analyzes conditions for the kernel of the walk to be singular or regular, as defined in 57. These conditions are independent of the step-set configuration. They also find the configurations for the kernel to be of genus 0, knowing that the genus is always at most 1.

In an ongoing work in collaboration with S. Franceschi (LMO, Paris-Saclay University) and K. Raschel (CNRS, Tours University), Guy Fayolle states a system of functional equations satisfied by the Laplace transform of the stationary distribution of a semimartingale reflecting Brownian motion (SRBM) in a two-dimensional non-convex cone. While the case of convex cones is now reasonably well studied, the framework of non-convex cones turns out to be more challenging, as shown by similar research carried out in a discrete setting. They show in particular that the problem can be reduced to a boundary value problem of Riemann–Hilbert–Carleman type on a hyperbola, for a two-dimensional vector of meromorphic functions. This seems to be quite an original result.

In the second edition of the book 57, original methods were proposed to
determine the invariant measure of random walks in the quarter plane with small jumps (size
1), the general solution being obtained via reduction to boundary value problems. Among other
things, an important quantity, the so-called group of the walk, allows one to deduce theoretical
features about the nature of the solutions. In particular, when the order of the group is
finite and the underlying algebraic curve is of genus 0 or 1, necessary and sufficient
conditions have been given for the solution to be rational, algebraic or holonomic. The book also presents applications to queueing theory (e.g., Jackson networks) and explicit solutions of functional equations for counting lattice walks.
Some partial extensions of 33 are under development.

TLA+ originally allowed recursive function definitions, but not recursive operator definitions, because it was not known how to define their semantics. They were added to the language in 2006, after a semantics was discovered for them. This year, Georges Gonthier, together with Leslie Lamport (Microsoft Research, USA), described that semantics in 34.