• Inria's research teams produce an annual Activity Report presenting their activities and results for the year. These reports cover the team members, the scientific program, the software developed by the team and the new results of the year. The report also describes grants, contracts and dissemination and teaching activities. Finally, the report gives the list of the year's publications.


Section: New Results

Graph Algorithms

Participants : Julien Bensmail, Jean-Claude Bermond, Nathann Cohen, David Coudert, Frédéric Giroire, Frédéric Havet, Fionn Mc Inerney, Nicolas Nisse, Stéphane Pérennes.

Coati is interested in the algorithmic aspects of Graph Theory. In general, we try to find the most efficient algorithms for various problems of Graph Theory and telecommunication networks. We use Graph Theory to model network problems, study their complexity, and then investigate the structural properties of graphs that make these problems hard or easy.

Complexity of graph problems

Parameterized complexity of polynomial optimization problems (FPT in P)

Parameterized complexity theory has enabled a refined classification of the difficulty of NP-hard optimization problems on graphs with respect to key structural properties, and so a better understanding of their true difficulty. More recently, hardness results for problems in P were established under reasonable complexity-theoretic assumptions such as the Strong Exponential Time Hypothesis (SETH), 3SUM and All-Pairs Shortest-Paths (APSP). Under these assumptions, many graph-theoretic problems do not admit truly subquadratic algorithms, nor even truly subcubic algorithms (Williams and Williams, FOCS 2010 [83] and Abboud et al., SODA 2015 [67]). A central technique used to tackle the difficulty of the above-mentioned problems is the design of fixed-parameter algorithms for polynomial-time problems with polynomial dependency on the fixed parameter (P-FPT). This technique was rigorously formalized by Giannopoulou et al. (IPEC 2015) [74], [75], and pursued by Abboud et al. (SODA 2016) [68], Husfeldt (IPEC 2016) [76] and Fomin et al. (SODA 2017) [73], using treewidth as the parameter. Applying this technique to clique-width, another important graph parameter, remained to be done.

In [45] we study several graph-theoretic problems for which hardness results exist, such as cycle problems (triangle detection, triangle counting, girth), distance problems (diameter, eccentricities, Gromov hyperbolicity, betweenness centrality) and maximum matching. We provide hardness results and fully polynomial FPT algorithms, using clique-width and some of its upper bounds as parameters (split-width, modular-width and $P_4$-sparseness). We believe that our most important result is an $\mathcal{O}(k^4 \cdot n + m)$-time algorithm for computing a maximum matching, where $k$ is either the modular-width or the $P_4$-sparseness. The latter generalizes many algorithms that have been introduced so far for specific subclasses such as cographs, $P_4$-lite graphs, $P_4$-extendible graphs and $P_4$-tidy graphs. Our algorithms are based on preprocessing methods using modular decomposition, split decomposition and primeval decomposition. Thus they can also be generalized to some graph classes with unbounded clique-width.
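For intuition, the cycle problems above can be made concrete on small graphs: the girth (length of a shortest cycle) can be computed by one BFS per edge, and triangle detection reduces to checking whether the girth equals 3. The sketch below (our own code and naming) only illustrates the problems being parameterized; it is not the clique-width-based algorithm of [45].

```python
from collections import deque

def girth(n, edges):
    """Return the length of a shortest cycle of the graph, or None if acyclic.
    Naive method: for each edge (u, v), a shortest cycle through it has
    length 1 + dist(u, v) in the graph with that edge removed."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    best = None
    for u, v in edges:
        adj[u].discard(v)
        adj[v].discard(u)
        dist = {u: 0}
        queue = deque([u])
        while queue:                       # BFS from u without edge (u, v)
            x = queue.popleft()
            for y in adj[x]:
                if y not in dist:
                    dist[y] = dist[x] + 1
                    queue.append(y)
        if v in dist and (best is None or dist[v] + 1 < best):
            best = dist[v] + 1
        adj[u].add(v)
        adj[v].add(u)
    return best

def has_triangle(n, edges):
    # Triangle detection as a special case of the girth computation.
    return girth(n, edges) == 3
```

This runs in $\mathcal{O}(m(n+m))$ time, which is exactly the kind of cost that P-FPT algorithms aim to beat on graphs of small clique-width.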

Revisiting Decomposition by Clique Separators

We study in [26] the complexity of decomposing a graph by means of clique separators. This common algorithmic tool, first introduced by Tarjan [79], makes it possible to cut a graph into smaller pieces, and so it can be applied as a preprocessing step in the computation of optimization problems. However, the best-known algorithms for computing such a decomposition have respective $\mathcal{O}(nm)$-time and $\mathcal{O}(n^{(3+\alpha)/2}) = o(n^{2.69})$-time complexity, with $\alpha < 2.3729$ being the exponent of matrix multiplication. Such running times are prohibitive for large graphs. In [26], we prove that for every graph $G$, a decomposition can be computed in $\mathcal{O}(T(G) + \min\{n^{\alpha}, \omega^2 n\})$-time, with $T(G)$ and $\omega$ being, respectively, the time needed to compute a minimal triangulation of $G$ and the clique number of $G$. In particular, this implies that every graph can be decomposed by clique separators in $\mathcal{O}(n^{\alpha} \log n)$-time. Based on prior work by Kratsch and Spinrad [77], we prove in addition that decomposing a graph by clique separators is at least as hard as triangle detection. Therefore, the existence of any $o(n^{\alpha})$-time algorithm for this problem would be a significant breakthrough in the field of algorithmics. Finally, our main result implies that planar graphs, bounded-treewidth graphs and bounded-degree graphs can be decomposed by clique separators in linear or quasi-linear time.
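The object being computed can be illustrated on tiny graphs by exhaustive search: try every vertex subset and keep one that is a clique whose removal disconnects the graph. This brute-force sketch (our own code and naming, exponential time) only shows what a clique separator is; the algorithms discussed in [26] are far more efficient.

```python
from itertools import combinations

def is_clique(adj, S):
    # Every pair of vertices in S must be adjacent (vacuously true if |S| < 2).
    return all(v in adj[u] for u, v in combinations(S, 2))

def is_connected(adj, vertices):
    vertices = set(vertices)
    start = next(iter(vertices))
    seen = {start}
    stack = [start]
    while stack:                           # DFS restricted to `vertices`
        x = stack.pop()
        for y in adj[x]:
            if y in vertices and y not in seen:
                seen.add(y)
                stack.append(y)
    return seen == vertices

def find_clique_separator(adj):
    """Return a clique whose removal disconnects the graph, or None."""
    V = list(adj)
    for size in range(len(V) - 1):
        for S in combinations(V, size):
            rest = set(V) - set(S)
            if rest and is_clique(adj, S) and not is_connected(adj, rest):
                return set(S)
    return None
```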

Distance-preserving elimination orderings in graphs

For every connected graph $G$, a subgraph $H$ of $G$ is isometric if the distance between any two vertices in $H$ is the same in $H$ as in $G$. A distance-preserving elimination ordering of $G$ is a total ordering of its vertex-set $V(G)$, denoted $(v_1, v_2, \ldots, v_n)$, such that every subgraph $G_i = G \setminus (v_1, v_2, \ldots, v_i)$ with $1 \le i \le n$ is isometric. This kind of ordering was introduced by Chepoi in his study of weakly modular graphs [71]. In [27], we prove that it is NP-complete to decide whether such an ordering exists for a given graph, even one of diameter at most 2. On the positive side, we prove that the problem of computing a distance-preserving ordering, when one exists, is fixed-parameter tractable in the treewidth. Lastly, we describe a heuristic for computing a distance-preserving ordering when one exists, which we compare to an exact exponential-time algorithm and to an ILP formulation of the problem.
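The definition can be verified directly on small graphs: after each removal, recompute all pairwise distances in the remaining subgraph and compare them to the distances in the original graph. A naive checker (our own code, not from [27]):

```python
from collections import deque

def bfs_dists(adj, vertices, src):
    """Distances from src in the subgraph induced by `vertices`."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        x = queue.popleft()
        for y in adj[x]:
            if y in vertices and y not in dist:
                dist[y] = dist[x] + 1
                queue.append(y)
    return dist

def is_distance_preserving(adj, order):
    """Check that removing the vertices of `order` one by one always
    leaves an isometric subgraph of the original graph."""
    V = set(adj)
    full = {v: bfs_dists(adj, V, v) for v in V}   # distances in G
    remaining = set(V)
    for v in order:
        remaining.discard(v)
        for u in remaining:
            d = bfs_dists(adj, remaining, u)
            for w in remaining:
                # Disconnection (a missing key) also violates isometry.
                if d.get(w) != full[u].get(w):
                    return False
    return True
```

For instance, every ordering of a path is distance-preserving, whereas on a 5-cycle removing any first vertex already stretches a distance.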

Complexity of computing strong pathbreadth

The strong pathbreadth of a given graph $G$ is the minimum $\rho$ such that $G$ admits a Robertson and Seymour path decomposition in which every bag is the complete $\rho$-neighbourhood of some vertex of $G$. In [29] (work done while G. Ducoffe was a member of Coati and published this year), we prove that deciding whether a given graph has strong pathbreadth at most one is NP-complete. This answers a conjecture of Leitert and Dragan [78] in the negative.

Improving matchings in trees, via bounded-length augmentations

In [13] we build on a classical result of Berge: a matching of any graph can be turned into a maximum matching by repeatedly augmenting alternating paths whose ends are not covered. In a recent work, Nisse, Salch and Weber considered the influence, on this process, of augmenting only paths of length at most $k$. Given a graph $G$, an initial matching $M \subseteq E(G)$ and an odd integer $k$, the problem is to find a longest sequence of augmenting paths of length at most $k$ that can be augmented sequentially from $M$. They proved that, when only paths of length at most $k=3$ can be augmented, such a longest sequence can be computed in polynomial time for any graph, while the same problem for any $k \ge 5$ is NP-hard. Although the latter result remains true for bipartite graphs, the complexity status of the same problem for trees was not known.

This work is dedicated to the complexity of this problem for trees. On the positive side, we first show that it can be solved in polynomial time for more classes of trees, namely bounded-degree trees (via a dynamic programming approach), caterpillars, and trees in which the nodes of degree at least 3 are sufficiently far apart. On the negative side, we show that, when only paths of length exactly $k$ can be augmented, the problem becomes NP-hard already for $k=3$, in the class of planar bipartite graphs with maximum degree 3 and arbitrarily large girth. We also show that the latter problem is NP-hard in trees when $k$ is part of the input.
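The elementary step of this process, finding one augmenting path with at most $k$ edges and flipping it, can be sketched by exhaustive search over alternating paths. This is our own exponential-time illustration for tiny instances, not the dynamic programming of [13]:

```python
def augment_short_path(adj, matching, k):
    """Find an augmenting path with at most k edges (k odd) for the given
    matching: an alternating path joining two unmatched vertices whose
    first and last edges are unmatched.  If one exists, return the
    matching obtained by flipping it, else None."""
    matched = {}
    for u, v in matching:
        matched[u] = v
        matched[v] = u

    def extend(path, need_matched):
        last = path[-1]
        # A path that just used an unmatched edge and reached an
        # unmatched vertex is augmenting (its length is odd).
        if need_matched and last not in matched:
            return list(path)
        if len(path) - 1 >= k:           # at most k edges allowed
            return None
        for y in adj[last]:
            if y in path:
                continue
            if (matched.get(last) == y) != need_matched:
                continue                 # edge status must alternate
            result = extend(path + [y], not need_matched)
            if result:
                return result
        return None

    for s in adj:                        # start from an unmatched vertex
        if s in matched:
            continue
        path = extend([s], need_matched=False)
        if path:
            new = {frozenset(e) for e in matching}
            for e in zip(path, path[1:]):
                new ^= {frozenset(e)}    # flip edges along the path
            return sorted(tuple(sorted(e)) for e in new)
    return None
```

On the path $0\!-\!1\!-\!2\!-\!3$ with initial matching $\{12\}$, the whole path is augmenting for $k=3$ but no augmentation is possible for $k=1$.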

Dynamics of formation of communities in social networks

We consider in [40] a community formation problem in social networks, where the users are either friends or enemies. The users are partitioned into conflict-free groups (i.e., independent sets in the conflict graph $G^- = (V, E)$ that represents the enmities between users). The dynamics goes on as long as there exists a set of at most $k$ users, $k$ being a fixed parameter, who can simultaneously change their current groups in the partition in such a way that they all strictly increase their utilities (number of friends, i.e., the cardinality of their respective group minus one). Previously, the best-known upper bounds on the maximum time of convergence were $O(|V| \cdot \alpha(G^-))$ for $k \le 2$ and $O(|V|^3)$ for $k=3$, with $\alpha(G^-)$ being the independence number of $G^-$. Our first contribution in this paper consists in reinterpreting the initial problem as the study of a dominance ordering over the vectors of integer partitions. With this approach, we obtain for $k \le 2$ the tight upper bound $O(|V| \cdot \min\{\alpha(G^-), \sqrt{|V|}\})$ and, when $G^-$ is the empty graph, the exact value of order $\frac{(2|V|)^{3/2}}{3}$. The time of convergence, for any fixed $k \ge 4$, was conjectured to be polynomial. In [40], we disprove this conjecture: we prove that for any $k \ge 4$, the maximum time of convergence is in $\Omega\left(|V|^{\Theta(\log |V|)}\right)$.
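For $k=1$ (single-user deviations) the dynamics is easy to simulate: a user moves to another enemy-free group whenever that strictly increases the number of friends in their group. The sketch below (our own code and naming, not the analysis of [40]) counts the moves before convergence, starting from the all-singletons partition:

```python
def simulate_k1(n, enemies):
    """Run the k = 1 dynamics until no single user can strictly improve
    their utility (group size minus one) by moving to an enemy-free
    group.  Return the number of moves performed."""
    enemy = {v: set() for v in range(n)}
    for u, v in enemies:
        enemy[u].add(v)
        enemy[v].add(u)
    groups = [{v} for v in range(n)]       # start: everyone alone
    steps = 0
    improved = True
    while improved:
        improved = False
        for v in range(n):
            cur = next(g for g in groups if v in g)
            for g in groups:
                # Joining g gives utility |g|, current utility is |cur| - 1,
                # so the move is strictly improving iff |g| >= |cur|.
                if g is not cur and len(g) >= len(cur) and not (g & enemy[v]):
                    cur.remove(v)
                    g.add(v)
                    groups = [x for x in groups if x]   # drop emptied groups
                    steps += 1
                    improved = True
                    break
    return steps
```

With no enemies, $n$ users merge into one group in $n-1$ moves; convergence is guaranteed because each move strictly increases the total utility.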

Application to bioinformatics

For a (possibly infinite) fixed family of graphs $\mathcal{F}$, we say that a graph $G$ overlays $\mathcal{F}$ on a hypergraph $H$ if $V(H)$ is equal to $V(G)$ and the subgraph of $G$ induced by every hyperedge of $H$ contains some member of $\mathcal{F}$ as a spanning subgraph. While it is easy to see that the complete graph on $|V(H)|$ vertices overlays $\mathcal{F}$ on $H$ whenever the problem admits a solution, the Minimum $\mathcal{F}$-Overlay problem asks for such a graph with at most $k$ edges, for some given $k \in \mathbb{N}$. This problem generalizes several natural problems which may arise in practice. For instance, if the family $\mathcal{F}$ contains all connected graphs, then Minimum $\mathcal{F}$-Overlay corresponds to the Minimum Connectivity Inference problem (also known as the Subset Interconnection Design problem), introduced for the low-resolution reconstruction of macro-molecular assemblies in structural biology and for the design of networks.

In [23], we prove a strong dichotomy result regarding the polynomial vs. NP-complete status of the problem with respect to the considered family $\mathcal{F}$. Roughly speaking, we show that the easy cases one can think of (e.g., when edgeless graphs of the right sizes are in $\mathcal{F}$, or when $\mathcal{F}$ contains only cliques) are the only families giving rise to a polynomial problem: all others are NP-complete. We then investigate the parameterized complexity of the problem and give similar sufficient conditions on $\mathcal{F}$ that give rise to W[1]-hard, W[2]-hard or FPT problems when the parameter is the size of the solution. This yields an FPT/W[1]-hard dichotomy for a relaxed problem, in which every hyperedge of $H$ must contain some member of $\mathcal{F}$ as a (not necessarily spanning) subgraph.
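For the Minimum Connectivity Inference special case (every hyperedge must induce a connected subgraph), the problem statement can be made concrete by brute force over candidate edge sets. The sketch below uses our own naming and is exponential, suitable only for tiny instances:

```python
from itertools import combinations

def min_connectivity_inference(vertices, hyperedges, k):
    """Return a set of at most k edges whose restriction to every
    hyperedge is connected, or None if no such set exists."""
    candidate_edges = list(combinations(sorted(vertices), 2))

    def connected_on(edge_set, S):
        # Is the subgraph induced by S (using only edge_set) connected?
        S = set(S)
        adj = {v: set() for v in S}
        for u, v in edge_set:
            if u in S and v in S:
                adj[u].add(v)
                adj[v].add(u)
        start = next(iter(S))
        seen = {start}
        stack = [start]
        while stack:
            x = stack.pop()
            for y in adj[x] - seen:
                seen.add(y)
                stack.append(y)
        return seen == S

    for m in range(k + 1):                 # smallest solutions first
        for E in combinations(candidate_edges, m):
            if all(connected_on(E, S) for S in hyperedges):
                return set(E)
    return None
```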