

Section: New Results

Graph Algorithms

Participants : Julien Bensmail, Jean-Claude Bermond, Nathann Cohen, David Coudert, Frédéric Giroire, Frédéric Havet, Fionn Mc Inerney, Nicolas Nisse, Stéphane Pérennes.

Coati is interested in the algorithmic aspects of Graph Theory. In general, we try to find the most efficient algorithms to solve various problems of Graph Theory and of telecommunication networks. We use Graph Theory to model network problems, study their complexity, and then investigate the structural properties of graphs that make these problems hard or easy.

Complexity of graph problems

Parameterized complexity of polynomial optimization problems (FPT in P)

Parameterized complexity theory has enabled a refined classification of the difficulty of NP-hard optimization problems on graphs with respect to key structural properties, and so a better understanding of their true difficulty. More recently, hardness results for problems in P were established under reasonable complexity-theoretic assumptions such as the Strong Exponential Time Hypothesis (SETH), 3SUM and All-Pairs Shortest-Paths (APSP). Under these assumptions, many graph theoretic problems do not admit truly subquadratic algorithms, nor even truly subcubic algorithms (Williams and Williams, FOCS 2010 [83] and Abboud et al., SODA 2015 [67]). A central technique used to tackle the difficulty of the above-mentioned problems is the design of fixed-parameter algorithms for polynomial-time solvable problems, with polynomial dependency on the fixed parameter (P-FPT). This technique was rigorously formalized by Giannopoulou et al. (IPEC 2015) [74], [75]. It was then continued by Abboud et al. (SODA 2016) [68], Husfeldt (IPEC 2016) [76] and Fomin et al. (SODA 2017) [73], using treewidth as a parameter. Applying this technique to clique-width, another important graph parameter, remained to be done.

In [45] we study several graph theoretic problems for which hardness results exist, such as cycle problems (triangle detection, triangle counting, girth), distance problems (diameter, eccentricities, Gromov hyperbolicity, betweenness centrality) and maximum matching. We provide hardness results and fully polynomial FPT algorithms, using clique-width and some of its upper bounds as parameters (split-width, modular-width and P_4-sparseness). We believe that our most important result is an O(k^4·n + m)-time algorithm for computing a maximum matching, where k is either the modular-width or the P_4-sparseness. The latter generalizes many algorithms that have been introduced so far for specific subclasses such as cographs, P_4-lite graphs, P_4-extendible graphs and P_4-tidy graphs. Our algorithms are based on preprocessing methods using modular decomposition, split decomposition and primeval decomposition. Thus they can also be generalized to some graph classes with unbounded clique-width.
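
The preprocessing mentioned above relies on modular decomposition. As a minimal illustration of the underlying notion only (not of the algorithms of [45]), the following Python sketch, with function names of our own choosing, checks whether a vertex set is a module of a graph, i.e., whether every outside vertex is adjacent either to all or to none of its vertices.

```python
import networkx as nx

def is_module(G, S):
    """Check whether S is a module of G: every vertex outside S is adjacent
    to either all vertices of S or none of them."""
    S = set(S)
    for v in set(G) - S:
        neighbours_in_S = S & set(G[v])
        if neighbours_in_S and neighbours_in_S != S:
            return False
    return True

# Example: in a complete bipartite graph, each side is a module.
G = nx.complete_bipartite_graph(2, 3)
# print(is_module(G, {0, 1}))   # True: vertices 0 and 1 form one side
```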

Revisiting Decomposition by Clique Separators

We study in [26] the complexity of decomposing a graph by means of clique separators. This common algorithmic tool, first introduced by Tarjan [79], allows one to cut a graph into smaller pieces, and so it can be applied to preprocess the graph in the computation of optimization problems. However, the best-known algorithms for computing a decomposition have respective O(n·m)-time and O(n^{(3+α)/2}) = o(n^{2.69})-time complexity, with α < 2.3729 being the exponent for matrix multiplication. Such running times are prohibitive for large graphs. In [26], we prove that for every graph G, a decomposition can be computed in O(T(G) + min{n^α, ω^2·n}) time, with T(G) and ω being respectively the time needed to compute a minimal triangulation of G and the clique number of G. In particular, it implies that every graph can be decomposed by clique separators in O(n^α·log n) time. Based on prior work from Kratsch and Spinrad [77], we prove in addition that decomposing a graph by clique separators is at least as hard as triangle detection. Therefore, the existence of any o(n^α)-time algorithm for this problem would be a significant breakthrough in the field of algorithmics. Finally, our main result implies that planar graphs, bounded-treewidth graphs and bounded-degree graphs can be decomposed by clique separators in linear or quasi-linear time.
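
As an illustration of the elementary decomposition step discussed above, the following Python sketch (using networkx; the function names are ours) checks whether a given vertex set is a clique separator of a graph and, if so, splits the graph into the corresponding pieces, one per connected component of G - S together with S. It only shows the basic splitting operation, not the algorithm of [26].

```python
import networkx as nx

def is_clique(G, S):
    """Check that every two vertices of S are adjacent in G."""
    S = list(S)
    return all(G.has_edge(u, v) for i, u in enumerate(S) for v in S[i + 1:])

def split_by_clique_separator(G, S):
    """If S is a clique separator of G, return the pieces G[C + S]
    for each connected component C of G - S; otherwise return None."""
    if not is_clique(G, S):
        return None
    H = G.copy()
    H.remove_nodes_from(S)
    components = list(nx.connected_components(H))
    if len(components) < 2:
        return None  # S is a clique but does not separate G
    return [G.subgraph(set(C) | set(S)).copy() for C in components]

# Small usage example: {2, 3} is a clique separator of this graph.
G = nx.Graph([(1, 2), (1, 3), (2, 3), (2, 4), (3, 4), (4, 5)])
# pieces = split_by_clique_separator(G, {2, 3})
```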

Distance-preserving elimination orderings in graphs

For every connected graph G, a subgraph H of G is isometric if the distance between any two vertices in H is the same in H as in G. A distance-preserving elimination ordering of G is a total ordering of its vertex set V(G), denoted (v_1, v_2, ..., v_n), such that every subgraph G_i = G[v_1, v_2, ..., v_i] with 1 ≤ i < n is isometric. This kind of ordering was introduced by Chepoi in his study on weakly modular graphs [71]. In [27], we prove that it is NP-complete to decide whether such an ordering exists for a given graph, even if it has diameter at most 2. Then, we prove on the positive side that the problem of computing a distance-preserving ordering, when there exists one, is fixed-parameter tractable in the treewidth. Lastly, we describe a heuristic for computing a distance-preserving ordering when there exists one, which we compare to an exact exponential-time algorithm and to an ILP formulation of the problem.
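
To make the definition concrete, here is a small Python sketch (using networkx; function names are ours) that checks whether a given total ordering of the vertices is distance-preserving, by verifying that every prefix-induced subgraph G_i is an isometric subgraph of G. It is a plain brute-force verification, not one of the algorithms of [27], and it assumes G is connected.

```python
import networkx as nx

def is_isometric(G, H):
    """Check that H (a subgraph of G) preserves all pairwise distances of G."""
    dist_G = dict(nx.all_pairs_shortest_path_length(G))
    dist_H = dict(nx.all_pairs_shortest_path_length(H))
    for u in H.nodes:
        for v in H.nodes:
            if dist_H.get(u, {}).get(v) != dist_G[u][v]:
                return False  # pair disconnected in H, or distance increased
    return True

def is_distance_preserving_ordering(G, ordering):
    """Check that every prefix G[v_1, ..., v_i], 1 <= i < n, is isometric in G."""
    for i in range(1, len(ordering)):
        if not is_isometric(G, G.subgraph(ordering[:i])):
            return False
    return True

# Example: the ordering [0, 1, 2, 3] of the 4-cycle is distance-preserving:
# every prefix {0}, {0,1}, {0,1,2} induces an isometric subgraph.
C4 = nx.cycle_graph(4)
# print(is_distance_preserving_ordering(C4, [0, 1, 2, 3]))
```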

Complexity of computing strong pathbreadth

The strong pathbreadth of a given graph G is the minimum ρ such that G admits a Robertson and Seymour's path decomposition where every bag is the complete ρ-neighbourhood of some vertex in G. In [29] (work done while G. Ducoffe was a member of Coati and published this year), we prove that deciding whether a given graph has strong pathbreadth at most one is NP-complete. This answers a conjecture of Leitert and Dragan [78] in the negative.
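
Verifying the defining property of the bags is easy; the hardness lies in finding the decomposition. The Python sketch below (networkx; names of our own choosing) tests whether every bag of a candidate decomposition is the complete ρ-neighbourhood (ball of radius ρ) of some vertex; it assumes the given sequence of bags is already a valid path decomposition.

```python
import networkx as nx

def ball(G, v, rho):
    """All vertices at distance at most rho from v in G."""
    return set(nx.single_source_shortest_path_length(G, v, cutoff=rho))

def bags_are_complete_neighbourhoods(G, bags, rho):
    """Check that every bag equals the ball of radius rho around some vertex.
    (Validity of the path decomposition itself is assumed, not checked.)"""
    return all(any(set(bag) == ball(G, v, rho) for v in G) for bag in bags)
```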

Improving matchings in trees, via bounded-length augmentations

Due to a classical result of Berge, it is known that a matching of any graph can be turned into a maximum matching by repeatedly augmenting alternating paths whose ends are not covered. In a recent work, Nisse, Salch and Weber considered the influence, on this process, of augmenting only paths of length at most k. Given a graph G, an initial matching M ⊆ E(G) and an odd integer k, the problem is to find a longest sequence of augmenting paths of length at most k that can be augmented sequentially from M. They proved that, when only paths of length at most k = 3 can be augmented, computing such a longest sequence can be done in polynomial time for any graph, while the same problem for any k ≥ 5 is NP-hard. Although the latter result remains true for bipartite graphs, the complexity of the same problem for trees was not known.

In [13], we study the complexity of this problem for trees. On the positive side, we first show that it can be solved in polynomial time for more classes of trees, namely bounded-degree trees (via a dynamic programming approach), caterpillars and trees where the nodes with degree at least 3 are sufficiently far apart. On the negative side, we show that, when only paths of length exactly k can be augmented, the problem becomes NP-hard already for k = 3, in the class of planar bipartite graphs with maximum degree 3 and arbitrarily large girth. We also show that the latter problem is NP-hard in trees when k is part of the input.
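
The elementary operation behind the problem, finding and flipping an augmenting path of length at most k, can be sketched as follows (Python; a brute-force search with function names of our own choosing). The sketch greedily applies such augmentations until none remains; it does not compute a longest sequence of augmentations, which is precisely the hard part of the problem studied in [13].

```python
import networkx as nx

def find_short_augmenting_path(G, matching, k):
    """Return a simple augmenting path with at most k edges, or None.
    `matching` maps each matched vertex to its partner (in both directions)."""
    def dfs(path, visited):
        u, edges_used = path[-1], len(path) - 1
        if edges_used >= k:
            return None
        need_matched_edge = (edges_used % 2 == 1)  # paths alternate, starting with an unmatched edge
        for v in G[u]:
            if v in visited or (matching.get(u) == v) != need_matched_edge:
                continue
            if not need_matched_edge and v not in matching:
                return path + [v]               # reached a free vertex: augmenting path found
            result = dfs(path + [v], visited | {v})
            if result:
                return result
        return None

    for s in G:
        if s not in matching:                   # start from a free (uncovered) vertex
            found = dfs([s], {s})
            if found:
                return found
    return None

def augment(matching, path):
    """Flip matched/unmatched edges along an augmenting path (in place)."""
    for u, v in zip(path[::2], path[1::2]):     # the formerly unmatched edges become matched
        matching[u], matching[v] = v, u

def greedy_bounded_augmentations(G, matching, k):
    """Repeatedly augment along paths of length <= k; return the number of augmentations."""
    count = 0
    while True:
        path = find_short_augmenting_path(G, matching, k)
        if path is None:
            return count
        augment(matching, path)
        count += 1

# Example: start from the empty matching of a path on 6 vertices, k = 3.
# P = nx.path_graph(6); M = {}; print(greedy_bounded_augmentations(P, M, 3))
```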

Dynamics of formation of communities in social networks

We consider in [40] a community formation problem in social networks, where the users are either friends or enemies. The users are partitioned into conflict-free groups (i.e., independent sets in the conflict graph G⁻ = (V, E) that represents the enmities between users). The dynamics goes on as long as there exists any set of at most k users, k being any fixed parameter, that can change their current groups in the partition simultaneously, in such a way that they all strictly increase their utilities (number of friends in their group, i.e., the cardinality of their respective group minus one). Previously, the best-known upper bounds on the maximum time of convergence were O(|V|·α(G⁻)) for k ≤ 2 and O(|V|^3) for k = 3, with α(G⁻) being the independence number of G⁻. Our first contribution in this paper consists in reinterpreting the initial problem as the study of a dominance ordering over the vectors of integer partitions. With this approach, we obtain for k ≤ 2 the tight upper bound O(|V|·min{α(G⁻), √|V|}) and, when G⁻ is the empty graph, the exact value, of order (2|V|)^{3/2}/3. The time of convergence, for any fixed k ≥ 4, was conjectured to be polynomial. In [40], we disprove this conjecture. Specifically, we prove that for any k ≥ 4, the maximum time of convergence is in Ω(|V|^{Θ(log |V|)}).
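
For illustration only, the simplest case of the dynamics (k = 1, a single user deviating at a time) can be simulated directly. The Python sketch below (networkx; names are ours) starts from the all-singletons partition and counts the number of moves until no user can strictly improve; it is a simulation of the process, not an implementation of the bounds proved in [40].

```python
import networkx as nx

def simulate_k1_dynamics(conflict_graph):
    """Simulate single-user deviations. A user may join another group if that group
    contains none of her enemies and the move strictly increases her utility
    (size of her group minus one). Returns the number of moves until convergence."""
    groups = [{v} for v in conflict_graph]            # initial partition: singletons
    where = {v: i for i, g in enumerate(groups) for v in g}
    steps, improved = 0, True
    while improved:
        improved = False
        for u in conflict_graph:
            current = groups[where[u]]
            for i, g in enumerate(groups):
                if g is current or not g:
                    continue
                # g must be conflict-free for u, and |g| >= |current| gives a strict gain
                if len(g) >= len(current) and not any(conflict_graph.has_edge(u, w) for w in g):
                    current.remove(u)
                    g.add(u)
                    where[u] = i
                    steps += 1
                    improved = True
                    break
    return steps

# Example: with no enemies at all, every user ends up in a single group.
# print(simulate_k1_dynamics(nx.empty_graph(5)))
```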

Application to bioinformatics

For a (possibly infinite) fixed family of graphs ℱ, we say that a graph G ℱ-overlays a hypergraph H if V(H) is equal to V(G) and the subgraph of G induced by every hyperedge of H contains some member of ℱ as a spanning subgraph. While it is easy to see that the complete graph on V(H) ℱ-overlays H whenever the problem admits a solution, the Minimum ℱ-Overlay problem asks for such a graph with at most k edges, for some given k. This problem generalizes some natural problems which may arise in practice. For instance, if the family ℱ contains all connected graphs, then Minimum ℱ-Overlay corresponds to the Minimum Connectivity Inference problem (also known as the Subset Interconnection Design problem), introduced for the low-resolution reconstruction of macro-molecular assembly in structural biology, and for the design of networks.
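
To make the definition concrete, the following Python sketch (brute force, with function names of our own choosing) checks whether a graph G ℱ-overlays a hypergraph H, given ℱ as an explicit finite list of graphs: for every hyperedge, it looks for a bijection from the vertices of some member of ℱ onto the hyperedge that maps every edge of the member onto an edge of G.

```python
import itertools
import networkx as nx

def contains_spanning_copy(G, vertices, F):
    """Does the subgraph of G induced by `vertices` contain F as a spanning subgraph?"""
    vertices = list(vertices)
    if len(vertices) != F.number_of_nodes():
        return False
    nodes_F = list(F.nodes)
    for image in itertools.permutations(vertices):
        phi = dict(zip(nodes_F, image))
        if all(G.has_edge(phi[a], phi[b]) for a, b in F.edges):
            return True
    return False

def overlays(G, hyperedges, family):
    """Does G F-overlay the hypergraph given by `hyperedges`, for the finite list `family`?"""
    return all(any(contains_spanning_copy(G, e, F) for F in family) for e in hyperedges)

# Example: with family = [nx.path_graph(3)], every hyperedge of size 3 must induce
# a subgraph of G containing a spanning path on its 3 vertices.
```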

In [23], we prove a strong dichotomy result regarding the polynomial vs. NP-complete status with respect to the considered family ℱ. Roughly speaking, we show that the easy cases one can think of (e.g., when edgeless graphs of the right sizes are in ℱ, or when ℱ contains only cliques) are the only families giving rise to a polynomial problem: all the others are NP-complete. We then investigate the parameterized complexity of the problem and give similar sufficient conditions on ℱ that give rise to W[1]-hard, W[2]-hard or FPT problems when the parameter is the size of the solution. This yields an FPT/W[1]-hard dichotomy for a relaxed problem, where every hyperedge of H must contain some member of ℱ as a (not necessarily spanning) subgraph.