Section: New Results

Data Structures and Robust Geometric Computation

Straight-line graph drawing on the torus

Participant : Olivier Devillers.

In collaboration with Luca Castelli Aleardi and Éric Fusy (LIX, Palaiseau).

We extend the notion of canonical orderings to cylindric triangulations. This allows us to extend the incremental straight-line drawing algorithm of de Fraysseix et al. to this setting. Our algorithm yields in linear time a crossing-free straight-line drawing of a cylindric triangulation T with n vertices on a regular grid ℤ/wℤ × [0,h], with w ≤ 2n and h ≤ n(2d+1), where d is the (graph) distance between the two boundaries. As a by-product, we can also obtain in linear time a crossing-free straight-line drawing of a toroidal triangulation with n vertices on a periodic regular grid ℤ/wℤ × ℤ/hℤ, with w ≤ 2n and h ≤ 1+n(2c+1), where c is the length of a shortest non-contractible cycle. Since c ≤ √(2n), the grid area is O(n^(5/2)) [24].

Qualitative symbolic perturbation

Participants : Olivier Devillers, Monique Teillaud.

In collaboration with Menelaos Karavelas (University of Crete).

In the literature, the generic way to address degeneracies in computational geometry is the Symbolic Perturbation paradigm: the input is made dependent on some parameter ε so that, for ε positive and close to zero, the input is close to the original input while being in non-degenerate position. A geometric predicate can usually be seen as the sign of some function of the input. In the symbolic perturbation paradigm, if the function evaluates to zero, the input is perturbed by a small positive ε, and the sign of the function evaluated at the perturbed input is used instead.

The usual way of using this approach is what we call the Algebraic Symbolic Perturbation framework. When the function to be evaluated is a polynomial in the input, its perturbed version is seen as a polynomial in ε whose coefficients are polynomials in the input. These coefficients are evaluated by increasing degree in ε until a non-vanishing coefficient is found. The number of these coefficients can be quite large, and expressing them in an easily and efficiently computable manner (e.g., factorized) may require substantial work.
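As a toy illustration of the algebraic framework (our own example, not taken from the paper), the following sketch breaks orientation ties by perturbing one point to (x + ε, y + ε²) and scanning the ε-coefficients of the determinant by increasing degree:

```python
def sign(x):
    return (x > 0) - (x < 0)

def perturbed_sign(coeffs):
    """Sign of c0 + c1*eps + c2*eps**2 + ... for an infinitesimally small
    eps > 0: the sign of the first non-vanishing coefficient, scanned by
    increasing degree in eps."""
    for c in coeffs:
        if c != 0:
            return sign(c)
    return 0  # zero for every eps: the input is intrinsically degenerate

def orient2d(p, q, r):
    """Orientation of (p, q, r).  Ties are broken by replacing r with
    (rx + eps, ry + eps**2), a deliberately simple perturbation chosen for
    illustration; expanding the determinant in eps gives the coefficient
    list [det, p1 - q1, q0 - p0]."""
    det = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return perturbed_sign([det, p[1] - q[1], q[0] - p[0]])
```

On the collinear triple ((0,0), (1,0), (2,0)) the degree-1 coefficient also vanishes and the degree-2 coefficient decides, which is exactly the coefficient scan described above.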

We propose to address the handling of geometric degeneracies in a different way, by means of what we call the Qualitative Symbolic Perturbation framework. We no longer use a single perturbation that must remove all degeneracies, but rather a sequence of perturbations, where the next perturbation is used only if the previous ones have not removed the degeneracies. Each new perturbation is considered symbolically smaller than the previous ones. This approach allows us to use simple elementary perturbations whose effect can be analyzed and evaluated: (1) by geometric reasoning instead of algebraic development of the predicate polynomial in ε, and (2) independently of a specific algebraic formulation of the predicate.
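The cascade can be illustrated on a simple Apollonius-flavoured comparison; the circles, indices, and the particular perturbation below are illustrative choices of ours, not the predicates of the paper:

```python
from math import hypot

def sign(x):
    return (x > 0) - (x < 0)

def compare_distance(q, c1, c2):
    """Compare the distances from a query point q to the boundaries of two
    circles (index, center, radius), as in Apollonius-diagram predicates.
    Returns -1 if c1 is closer, +1 if c2 is closer.  On a tie, a first
    qualitative perturbation is applied: the circle with the larger index
    is symbolically inflated by eps > 0, which makes it strictly closer.
    Further perturbations would be consulted only if this one failed to
    decide (here it always decides, since indices are distinct)."""
    i1, (x1, y1), r1 = c1
    i2, (x2, y2), r2 = c2
    d1 = hypot(q[0] - x1, q[1] - y1) - r1
    d2 = hypot(q[0] - x2, q[1] - y2) - r2
    s = sign(d1 - d2)
    if s != 0:
        return s                  # the exact predicate already decides
    return -1 if i1 > i2 else 1   # geometric effect of the perturbation
```

Note that the tie-breaking rule is justified by geometric reasoning alone (an inflated circle is closer), with no algebraic expansion of the distance expression.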

We apply our framework to predicates used in the computation of Apollonius diagrams in 2D and 3D, as well as the computation of trapezoidal maps of circular arcs [57] .

Covering spaces and Delaunay triangulations of the 2D flat torus

Participants : Mikhail Bogdanov, Monique Teillaud.

In collaboration with Gert Vegter (Johann Bernoulli Institute, Groningen University).

A previous algorithm computed the Delaunay triangulation of the flat torus using a 9-sheeted covering space [64]. We propose a modification of the algorithm that uses only an 8-sheeted covering space, which allows working with 8 periodic copies of the input points instead of 9. The main interest of our contribution is not only this result but, above all, the method itself: this new construction of covering spaces generalizes to Delaunay triangulations of surfaces of higher genus.
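For concreteness, here is how the 9 periodic copies of the original 9-sheeted approach can be generated (the more economical choice of 8 copies made in the paper relies on a subtler covering and is not reproduced in this sketch):

```python
def nine_sheeted_copies(points, w=1.0, h=1.0):
    """Replicate each point of the fundamental domain [0,w) x [0,h) of a
    flat torus over the 3 x 3 block of translates used by the original
    9-sheeted covering-space algorithm; a Euclidean Delaunay triangulation
    of the copies then induces the triangulation of the torus."""
    return [(x + i * w, y + j * h)
            for (x, y) in points
            for i in (-1, 0, 1)
            for j in (-1, 0, 1)]
```

Each point is simply translated by every vector (iw, jh) with i, j in {-1, 0, 1}; the 8-sheeted construction keeps one fewer copy per input point.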

Hyperbolic Delaunay complexes and Voronoi diagrams made practical

Participants : Mikhail Bogdanov, Olivier Devillers, Monique Teillaud.

We study Delaunay complexes and Voronoi diagrams in the Poincaré ball, a conformal model of hyperbolic space, in any dimension. We elaborate on our earlier work on the space of spheres [65], giving a detailed description of algorithms and presenting a static and a dynamic variant. All proofs are based on geometric reasoning; they do not resort to the analytic formula for the hyperbolic distance. We also study algebraic and arithmetic issues, observing that only rational computations are needed. This allows for an exact and efficient implementation in 2D. All degenerate cases are handled. The implementation will be submitted to the CGAL editorial board for future integration into the CGAL library [44].
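A small illustration of the rational-arithmetic observation: in the Poincaré model the hyperbolic Delaunay complex consists of Euclidean Delaunay faces whose circumscribing disk lies in the unit disk, so the computation reduces to predicates such as the incircle test, evaluated here exactly over the rationals with Python's Fraction (an illustrative sketch, not the CGAL implementation):

```python
from fractions import Fraction

def incircle(a, b, c, d):
    """Exact sign of the Euclidean incircle determinant over the rationals:
    +1 iff d lies inside the circle through a, b, c (taken counterclockwise),
    -1 iff outside, 0 iff the four points are cocircular."""
    rows = []
    for p in (a, b, c):
        dx = Fraction(p[0]) - Fraction(d[0])
        dy = Fraction(p[1]) - Fraction(d[1])
        rows.append((dx, dy, dx * dx + dy * dy))  # lifted coordinates
    (a1, a2, a3), (b1, b2, b3), (c1, c2, c3) = rows
    det = (a1 * (b2 * c3 - b3 * c2)
           - a2 * (b1 * c3 - b3 * c1)
           + a3 * (b1 * c2 - b2 * c1))
    return (det > 0) - (det < 0)
```

Because Fraction arithmetic is exact, the degenerate (cocircular) case is detected reliably, which is the property an exact implementation needs.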

The stability of Delaunay triangulations

Participants : Jean-Daniel Boissonnat, Ramsay Dyer.

In collaboration with Arijit Ghosh (Indian Statistical Institute, Kolkata, India).

We introduce a parametrized notion of genericity for Delaunay triangulations which, in particular, implies that the Delaunay simplices of δ-generic point sets are thick. Equipped with this notion, we study the stability of Delaunay triangulations under perturbations of the metric and of the vertex positions. We quantify the magnitude of the perturbations under which the Delaunay triangulation remains unchanged. We also present an algorithm that takes as input a discrete point set in ℝ^m, and performs a small perturbation that guarantees that the Delaunay triangulation of the resulting perturbed point set has quantifiable stability with respect to the metric and the point positions. There is also a guarantee on the quality of the simplices: they cannot be too flat. The algorithm provides an alternative tool to the weighting or refinement methods to remove poorly shaped simplices in Delaunay triangulations of arbitrary dimension, but in addition it provides a guarantee of stability for the resulting triangulation [21], [47].
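To fix ideas, a thickness-style quality measure can be sketched as follows for planar triangles; the paper's definition for general k-simplices in ℝ^m carries a dimension-dependent normalization, so this variant is only illustrative:

```python
from math import dist, sqrt

def triangle_thickness(p, q, r):
    """Quality of a planar 2-simplex: its smallest altitude divided by its
    longest edge.  Sliver-like (flat) triangles have thickness close to 0;
    delta-generic point sets avoid such simplices."""
    # twice the triangle area, via the cross product
    area2 = abs((q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0]))
    longest = max(dist(p, q), dist(q, r), dist(r, p))
    # an altitude equals 2*area / base, so the smallest altitude is the
    # one dropped onto the longest edge
    return (area2 / longest) / longest
```

An equilateral triangle scores about 0.87, while a nearly collinear triple scores close to 0, matching the intuition that thick simplices are the well-shaped ones.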

Constructing intrinsic Delaunay triangulations of submanifolds

Participants : Jean-Daniel Boissonnat, Ramsay Dyer.

In collaboration with Arijit Ghosh (Indian Statistical Institute, Kolkata, India).

This work is the algorithmic counterpart of our previous paper [21] . We describe an algorithm to construct an intrinsic Delaunay triangulation of a smooth closed submanifold of Euclidean space. We also provide a counterexample to the results announced by Leibon and Letscher on Delaunay triangulations on Riemannian manifolds. In general the nerve of the intrinsic Voronoi diagram is not homeomorphic to the manifold. The density of the sample points alone cannot guarantee the existence of a Delaunay triangulation. To circumvent this issue, we use results established in our companion paper on the stability of Delaunay triangulations on δ-generic point sets. We establish sampling criteria which ensure that the intrinsic Delaunay complex coincides with the restricted Delaunay complex and also with the recently introduced tangential Delaunay complex. The algorithm generates a point set that meets the required criteria while the tangential complex is being constructed. In this way the computation of geodesic distances is avoided, the runtime is only linearly dependent on the ambient dimension, and the Delaunay complexes are guaranteed to be triangulations of the manifold [46] .

Equating the witness and restricted Delaunay complexes

Participants : Jean-Daniel Boissonnat, Ramsay Dyer, Steve Oudot.

In collaboration with Arijit Ghosh (Indian Statistical Institute, Kolkata, India).

It is a well-known fact that the restricted Delaunay and witness complexes may differ when the landmark and witness sets are located on submanifolds of ℝ^d of dimension 3 or more. Currently, the only known way of overcoming this issue consists of building some crude superset of the witness complex, and applying a greedy sliver exudation technique on this superset. Unfortunately, the construction time of the superset depends exponentially on the ambient dimension, which makes the witness-complex-based approach to manifold reconstruction impractical. This work provides an analysis of the reasons why the restricted Delaunay and witness complexes fail to include each other. From this, a new set of conditions naturally arises under which the two complexes are equal [37].
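The witness condition itself is easy to state in code; the sketch below checks the (weak) witness property for a simplex given as a tuple of landmark indices. It is a minimal illustration of the definition, not the construction analyzed in the paper:

```python
def sq_dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def is_witnessed(simplex, witnesses, landmarks):
    """Weak witness test: a simplex, given as a tuple of k+1 landmark
    indices, is witnessed if some witness point has all of those landmarks
    among its k+1 nearest landmarks."""
    need, k1 = set(simplex), len(simplex)
    for w in witnesses:
        nearest = sorted(range(len(landmarks)),
                         key=lambda i: sq_dist(w, landmarks[i]))[:k1]
        if need <= set(nearest):
            return True
    return False
```

The witness complex collects all simplices passing this test; the paper's conditions describe when this complex coincides with the restricted Delaunay complex.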

Simpler complexity analysis of random geometric structures

Participants : Olivier Devillers, Marc Glisse.

In collaboration with Xavier Goaoc (EPI vegas ).

Average-case analysis of data structures or algorithms is commonly used in computational geometry when the more classical worst-case analysis is deemed overly pessimistic. Since these analyses are often intricate, the models of random geometric data that can be handled are often simplistic and far from “realistic inputs”. We present a new simple scheme for the analysis of geometric structures. While this scheme only produces results up to a polylogarithmic factor, it is much simpler to apply than the classical techniques, and therefore succeeds in analyzing new input distributions related to smoothed complexity analysis.

We illustrate our method on two classical structures: convex hulls and Delaunay triangulations. Specifically, we give short and elementary proofs of the classical results that n points uniformly distributed in a ball in ℝ^d have a convex hull and a Delaunay triangulation of respective expected complexities Θ̃(n^((d-1)/(d+1))) and Θ̃(n). We then prove that if we start with n points well-spread on a sphere, e.g. an (ϵ,κ)-sample of that sphere, and perturb that sample by moving each point randomly and uniformly within distance at most δ of its initial position, then the expected complexity of the convex hull of the resulting point set is Θ̃(n^((d-1)/(2d)) (1/δ)^((d²-1)/(4d))) [55].
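The first claim can be observed experimentally. The following self-contained sketch (Andrew's monotone chain, with helper names of our own choosing) samples uniform points in a disk and reports hull sizes, which grow slowly with n, roughly like n^(1/3) in the plane:

```python
import random

def convex_hull(pts):
    """Andrew's monotone chain: hull vertices in counterclockwise order."""
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    hulls = []
    for chain in (pts, pts[::-1]):          # lower hull, then upper hull
        out = []
        for p in chain:
            while len(out) >= 2 and cross(out[-2], out[-1], p) <= 0:
                out.pop()
            out.append(p)
        hulls.append(out[:-1])
    return hulls[0] + hulls[1]

def disk_sample(n, rng):
    """n points uniform in the unit disk, by rejection from the square."""
    out = []
    while len(out) < n:
        x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
        if x * x + y * y <= 1:
            out.append((x, y))
    return out

rng = random.Random(0)
for n in (100, 1000, 10000):
    # hull size grows like n**(1/3) in the plane, up to constants
    print(n, len(convex_hull(disk_sample(n, rng))))
```

Multiplying n by 10 multiplies the observed hull size by roughly 10^(1/3) ≈ 2.15 on average, in line with the stated d = 2 bound.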

Analysis of cone vertex walk in Poisson Delaunay triangulation

Participants : Olivier Devillers, Ross Hemsley.

In collaboration with Nicolas Broutin (EPI rap ).

Walking strategies are a standard tool for point location in a triangulation of size n. Although often claimed to be Θ(√n) under random distribution hypotheses, this conjecture has only been formally proved by Devroye, Lemaire, and Moreau [Comput. Geom. Theory Appl., vol. 29, 2004], in the case of the so-called straight walk, which has the very specific property that deciding whether a given (Delaunay) triangle belongs to the walk can be done without looking at the other sites. We analyze a different walking strategy that follows vertex neighbour relations to move towards the query. We call this walk cone vertex walk. We prove that cone vertex walk visits Θ(√n) vertices and can be constructed in Θ(√n) time. We provide explicit bounds on the hidden constants [50].
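The "follow vertex neighbours" idea can be sketched as a greedy walk on the vertices of a triangulation; this simplification omits the cone-based step selection that the paper actually analyzes:

```python
from math import dist

def greedy_vertex_walk(adj, coords, start, query):
    """Walk on the vertices of a triangulation, given by adjacency lists,
    by always moving to the neighbour closest to the query point; stop at
    a vertex none of whose neighbours improves the distance.  On Delaunay
    triangulations such greedy routing is known to reach the vertex
    nearest to the query."""
    v, path = start, [start]
    while True:
        u = min(adj[v], key=lambda w: dist(coords[w], query))
        if dist(coords[u], query) >= dist(coords[v], query):
            return path
        v = u
        path.append(v)
```

Each step only inspects the current vertex's neighbours, which is what makes such walks attractive for point location without auxiliary structures.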

The monotonicity of f-vectors of random polytopes

Participants : Olivier Devillers, Marc Glisse.

In collaboration with Xavier Goaoc and Guillaume Moroz (EPI vegas ) and Matthias Reitzner (Universität Osnabrück, Germany).

Let K be a compact convex body in ℝ^d, let K_n be the convex hull of n points chosen uniformly and independently in K, and let f_i(K_n) denote the number of i-dimensional faces of K_n.

We show that for planar convex sets, E[f_0(K_n)] is increasing in n. In dimension d ≥ 3, we prove that if lim_{n→∞} E[f_{d-1}(K_n)] / (A n^c) = 1 for some constants A and c > 0, then the function n ↦ E[f_{d-1}(K_n)] is increasing for n large enough. In particular, the number of facets of the convex hull of n random points distributed uniformly and independently in a smooth compact convex body is asymptotically increasing. Our proof relies on a random sampling argument [57].

Efficient Monte Carlo sampler for detecting parametric objects in large scenes

Participants : Florent Lafarge, Yannick Verdie.

Point processes have demonstrated efficiency and competitiveness when addressing object recognition problems in vision. However, simulating these mathematical models is a difficult task, especially on large scenes. Existing samplers offer only average performance in terms of computation time and stability. We propose a new sampling procedure based on a Monte Carlo formalism. Our algorithm exploits Markovian properties of point processes to perform the sampling in parallel. This procedure is embedded into a data-driven mechanism such that the points are non-uniformly distributed in the scene. The performance of the sampler is analyzed through a set of experiments on various object recognition problems from large scenes, and through comparisons to existing algorithms [35], [63].
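A minimal sequential birth-death Metropolis-Hastings kernel conveys the flavour of point-process sampling; the toy energy, the parameter values, and the absence of parallel, data-driven proposals are simplifications of ours, not the paper's algorithm:

```python
import random
from math import dist, exp

def energy(pts, r=0.1):
    """Toy pairwise interaction: one unit of penalty per pair of points
    closer than r (a hard-core-style prior, purely illustrative)."""
    return sum(1.0
               for i in range(len(pts))
               for j in range(i + 1, len(pts))
               if dist(pts[i], pts[j]) < r)

def birth_death_step(pts, rng, beta=4.0, lam=50.0):
    """One Metropolis-Hastings birth/death move for a point process in the
    unit square, with a Poisson(lam) reference measure (Geyer-Moller style
    acceptance ratio).  The empty configuration always proposes a birth,
    a crude boundary treatment."""
    if rng.random() < 0.5 or not pts:            # propose a birth
        new = pts + [(rng.random(), rng.random())]
        ratio = lam / len(new)
    else:                                        # propose a death
        new = pts[:]
        new.pop(rng.randrange(len(new)))
        ratio = len(pts) / lam
    alpha = min(1.0, ratio * exp(-beta * (energy(new) - energy(pts))))
    return new if rng.random() < alpha else pts

rng = random.Random(1)
pts = []
for _ in range(2000):
    pts = birth_death_step(pts, rng)
print(len(pts))  # fluctuates around the mean induced by lam and the energy
```

In an object-recognition setting, each point would carry the parameters of a candidate object and the energy would encode the data term; parallelizing such moves over distant regions of the scene is the key idea exploited by the paper.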