Section: New Results

Computability and Complexity

  • Genericity of semi-computable objects. One of the main goals of computability theory is to understand and classify the algorithmic content of infinite objects, which can be expressed as the difficulty of computing them or as their ability to help in solving problems. In establishing this classification one is often led to separate classes of algorithmic complexity, and the construction of counter-examples is usually a hard task that requires advanced techniques (among them the so-called priority method with finite injury). The difficulty in such a construction is that the constructed object must satisfy two types of requirements pulling in opposite directions: it should lack algorithmic content but at the same time should be constructible in some way. In other words, these objects live somewhere between generic objects (objects with no structure) and computable objects (the most constructible objects). While computability theory provides formal notions of genericity, these notions are always incompatible with computability.

    We introduce a new notion of genericity which has two advantages: it is close to plain genericity, and we prove that it is compatible with semi-computability (for a property, being semi-decidable is a semi-computability notion, while being decidable is a plain computability notion). The latter result has important consequences: many existing ad hoc constructions are subsumed by this result and thereby unified; new results can be obtained whenever the new notion of genericity captures the sought properties; and the result clarifies the role of topology in computability theory.

    This work is the sequel of the STACS 2013 paper [19] and has been submitted [26].
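    The distinction between semi-computability and plain computability that the result turns on can be illustrated in code. The following sketch (illustrative only, not taken from the paper) shows the standard shape of a semi-decision procedure: it halts exactly on positive instances, by unbounded search for a witness, whereas a decision procedure would also have to halt on negative ones.

```python
# Illustrative sketch: a semi-decision procedure.
# A property is semi-decidable when some procedure halts exactly on the
# positive instances; here we semi-decide "f(n) is True for some n >= 0"
# by unbounded search for a witness.

def semi_decide(f):
    """Halts (returning a witness) iff f(n) is True for some n >= 0."""
    n = 0
    while True:
        if f(n):
            return n      # positive instance: we halt with a witness
        n += 1            # negative instance: the search never terminates
```

On a positive instance such as `lambda n: n * n > 50` the search halts at the first witness; on a property that is never true it loops forever, which is precisely why semi-decidability does not imply decidability.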

  • Analytical properties of resource-bounded real functionals. In [14], Hugo Férée, Walid Gomaa and Mathieu Hoyrup extend the results of [52] to non-deterministic complexity. More precisely, they introduce the analytical concepts of essential point and sufficient set for norms over continuous functions and use them to characterize the class of norms that are computable in non-deterministic polynomial time.

  • Call-by-value, call-by-name and the vectorial behaviour of the algebraic λ-calculus. In this article published in LMCS (Logical Methods in Computer Science) [12], Ali Assaf, Alejandro Díaz-Caro, Simon Perdrix, Christine Tasson and Benoît Valiron examine the relationship between the algebraic lambda-calculus, a fragment of the differential lambda-calculus, and the linear-algebraic lambda-calculus, a candidate lambda-calculus for quantum computation. Both calculi are algebraic: each one is equipped with an additive and a scalar-multiplicative structure, and their set of terms is closed under linear combinations. However, the two languages were built using different approaches: the former is a call-by-name language whereas the latter is call-by-value; the former considers algebraic equalities whereas the latter approaches them through rewrite rules. In this paper, they analyse how these different approaches relate to one another. To this end, four canonical languages based on each of the possible choices are proposed: call-by-name versus call-by-value, algebraic equality versus algebraic rewriting. The various languages simulate each other. Due to the subtle interaction between beta-reduction and algebraic rewriting, additional hypotheses such as confluence or normalisation might be required to make the languages consistent.
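    The "closed under linear combinations" structure shared by the two calculi can be made concrete with a small, hypothetical sketch (the representation below is ours, not the paper's): a combination of terms is a mapping from term names to scalar coefficients, and the additive and scalar-multiplicative structure amounts to collecting like terms and distributing scalars.

```python
# Hypothetical sketch of the vector-space structure on terms: a linear
# combination of terms is represented as {term_name: coefficient}.
# Term names here are opaque placeholders, not actual lambda-terms.

def add(c1, c2):
    """Sum of two linear combinations, collecting like terms."""
    out = dict(c1)
    for t, a in c2.items():
        out[t] = out.get(t, 0) + a
    return {t: a for t, a in out.items() if a != 0}  # drop zero terms

def scale(alpha, c):
    """Scalar multiple alpha . c of a combination."""
    return {t: alpha * a for t, a in c.items() if alpha * a != 0}
```

For example, `scale(2, add({'x': 1, 'y': 3}, {'y': -3, 'z': 5}))` collapses the cancelling `y` terms and yields `2.x + 10.z`; the calculi's rewrite rules (in the linear-algebraic presentation) perform this kind of normalisation on actual terms.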

  • Real or Natural numbers interpretations and their effect on complexity. Guillaume Bonfante, Florian Deloup and Antoine Henrot [13] have shown how deep results in algebraic geometry may be read from a complexity perspective. They show that real numbers, although they are not well-founded, can be used as natural numbers are for program interpretations. The argument is based on the Positivstellensatz, a major result proved by Stengle.

  • Information carried by programs about the objects they compute. In computability theory and computable analysis, finite programs can compute infinite objects. Presenting a computable object via any program for it provides at least as much information as presenting the object itself, written on an infinite tape. What additional information do programs provide? We characterize this additional information to be any upper bound on the Kolmogorov complexity of the object, i.e., an upper bound on the size of a shortest program computing the object.

    This problem can be formalized using the two classical models of computation of Markov-computability [61] and Type-2 computability [74], which are the most famous and studied ways of computing with infinite objects. Many celebrated results comparing these models were developed in the 1950s (theorems by Rice, Rice-Shapiro, Kreisel-Lacombe-Shoenfield/Ceitin, and Friedberg), but a complete understanding of their precise relationship had never been obtained. Our results fill this void, identifying the exact relationship between the two models. In particular, this relationship enables us to obtain several results characterizing the computational and topological structure of Markov-semidecidable properties.

    This work, carried out in collaboration with Cristóbal Rojas (Santiago) during his visit as an Inria “Chercheur Invité”, has been accepted at STACS 2015 [20].
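    The "upper bound on Kolmogorov complexity" that a program carries can be illustrated very directly (this is a generic sketch of the standard fact K(x) ≤ |p| + O(1) for any program p printing x, not code from the paper): the length of any one program computing an object bounds, up to an interpreter-dependent constant, the length of a shortest such program.

```python
# Illustrative sketch: any program that outputs an object witnesses an
# upper bound on the object's Kolmogorov complexity, up to an additive
# constant for the interpreter. "Programs" here are Python source strings.

def kolmogorov_upper_bound(program: str) -> int:
    """Length in bytes of one program computing the object."""
    return len(program.encode("utf-8"))

# A highly compressible 2000-character string and a short program for it:
program = "print('ab' * 1000)"
bound = kolmogorov_upper_bound(program)  # far smaller than the output
```

The point of the characterization above is that this bound is exactly the extra information a Markov-style presentation (a program) carries over a Type-2 presentation (the infinite object on a tape).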

  • Causal Graph Dynamics. Causal Graph Dynamics extend Cellular Automata to arbitrary, bounded-degree, time-varying graphs. The whole graph evolves in discrete time steps, and this global evolution is required to have a number of physics-like symmetries: shift-invariance (it acts everywhere the same) and causality (information has a bounded speed of propagation). Pablo Arrighi, Emmanuel Jeandel, Simon Martiel (I3S, Univ. Nice-Sophia Antipolis), and Simon Perdrix are investigating the properties of this model. In particular, a work on the reversibility of causal graph dynamics was submitted in January 2015.
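    The two symmetries can be illustrated on the special case that Causal Graph Dynamics generalize: an ordinary cellular automaton on a fixed graph. The toy sketch below (our illustration; full causal graph dynamics additionally let the graph itself evolve) applies the same local rule at every cell (shift-invariance), each cell reading only its immediate neighbours (bounded speed of information).

```python
# Toy sketch: a synchronous cellular automaton on a fixed cycle, the
# special case generalized by causal graph dynamics (which also allow
# the graph itself to change over time).

def step(states, rule):
    """One synchronous step on a cycle; rule sees (left, self, right)."""
    n = len(states)
    return [rule(states[(i - 1) % n], states[i], states[(i + 1) % n])
            for i in range(n)]

xor_rule = lambda left, centre, right: left ^ right  # rule-90-like rule
```

One step on `[0, 0, 1, 0, 0]` spreads the lone 1 only to its two neighbours, a direct picture of the bounded speed of propagation that the causality condition demands in general.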

  • The Parameterized Complexity of Domination-type Problems and Application to Linear Codes. In this article presented at TAMC'14 (Theory and Applications of Models of Computation) [17], David Cattanéo and Simon Perdrix study the parameterized complexity of domination-type problems. (σ,ρ)-domination is a general and unifying framework introduced by Telle: given two sets of integers σ and ρ, a set D of vertices of a graph G is (σ,ρ)-dominating if for every v ∈ D, |N(v) ∩ D| ∈ σ, and for every v ∉ D, |N(v) ∩ D| ∈ ρ. The main result is that for any recursive sets σ and ρ, deciding whether there exists a (σ,ρ)-dominating set of size k, or of size at most k, are both problems in W[2]. This general statement is optimal in the sense that several particular instances of (σ,ρ)-domination are W[2]-complete (e.g., Dominating Set). This result is also extended to a class of domination-type problems which do not fall into the (σ,ρ)-domination framework, including Connected Dominating Set and the problem of the minimal distance of a linear code over a finite field.

    To prove the W[2]-membership of these domination-type problems, the authors extend the Turing way to parameterized complexity by introducing a new kind of non-deterministic Turing machine with the ability to perform `blind' transitions, i.e., transitions which do not depend on the content of the tapes.
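    The (σ,ρ)-domination condition itself is easy to state in code. The following hedged sketch checks the definition on a finite graph, with finite sets standing in for the recursive sets σ and ρ of the paper (all names are ours):

```python
# Sketch of the (sigma, rho)-domination condition: D is (sigma, rho)-
# dominating when every vertex in D has its number of neighbours inside D
# in sigma, and every vertex outside D has that number in rho.

def is_sigma_rho_dominating(adj, D, sigma, rho):
    """adj: dict vertex -> set of neighbours; D, sigma, rho: sets."""
    for v, neighbours in adj.items():
        count = len(neighbours & D)   # |N(v) ∩ D|
        if v in D and count not in sigma:
            return False
        if v not in D and count not in rho:
            return False
    return True
```

Classical Dominating Set is recovered by taking σ to allow any count and ρ to require at least one dominating neighbour; on the path a−b−c, the set {b} dominates while {a} does not, since c then has no neighbour in D.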

  • Quantum Circuits for the Unitary Permutation Problem. In this paper [18], presented at DCM'14 (Developments in Computational Models) and at the Workshop on Quantum Metrology, Interaction, and Causal Structure 2014 (invited talk), Stefano Facchini and Simon Perdrix consider the Unitary Permutation problem which consists, given n quantum gates U_1, …, U_n and a permutation σ of {1, …, n}, in applying the quantum gates in the order specified by σ, i.e., in performing U_σ(n) ∘ ⋯ ∘ U_σ(1).

    This problem was introduced and investigated in [40], where two models of computation are considered. The first is the (standard) model of query complexity: the complexity measure is the number of calls to any of the quantum gates U_i in a quantum circuit which solves the problem. The second is, roughly speaking, a model for higher-order quantum computation, where quantum gates can be treated as second-order objects. In both models the existing bounds are improved; in particular, the upper and lower bounds for the standard quantum circuit model are established by pointing out connections with the permutation-as-substring problem introduced by Karp.
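    The task being permuted can be stated in a few lines. The sketch below (our illustration of the problem statement only, with gates as explicit matrices and a 0-indexed permutation; the paper's interest lies in the number of black-box calls, not in this direct computation) applies U_σ(1) first and U_σ(n) last:

```python
import numpy as np

# Minimal sketch of the Unitary Permutation task: apply the given gates
# to a state in the order specified by the permutation sigma
# (0-indexed here), i.e. compute U_sigma(n) ... U_sigma(1) |psi>.

def apply_permuted(gates, sigma, state):
    """gates: list of unitary matrices; sigma: 0-indexed permutation."""
    for i in sigma:             # sigma(1) first, ..., sigma(n) last
        state = gates[i] @ state
    return state
```

Since the gates need not commute (e.g. the Pauli X and Z matrices), different permutations generally produce different states, which is what makes the order-selection problem non-trivial in both the circuit and the higher-order model.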