Our everyday environment increasingly interacts with digital information. A significant amount of this information is of a geometric nature: it concerns the representation of our environment, the analysis and understanding of “real” phenomena, and the control of physical mechanisms or processes. The interaction between the physical and digital worlds is two-way. Sensors produce digital data from measurements or observations of our environment, while digital models are used to “act” on the physical world. Objects that we use at home, at work, or to travel, such as furniture, cars, and planes, are nowadays produced by industrial processes based on digital representations of shapes. CAD-CAM (Computer Aided Design – Computer Aided Manufacturing) software is used to represent the geometry of these objects and to control the manufacturing processes that create them. Construction capabilities themselves are also expanding, with the development of 3D printers and the possibility of creating everyday objects “at home” from digital models.

The impact of geometry is also important in the analysis and understanding of phenomena. The 3D conformation of a molecule explains its biological interaction with other molecules. The profile of a wing determines its aerodynamic behavior, while the shape of a bulbous bow can significantly decrease the wave resistance of a ship. For many problems, understanding such behavior or analyzing a physical phenomenon can nowadays be achieved by numerical simulation. The quality of these simulations is closely tied to the precise representation of the geometry and to the link between the geometric models and the numerical computation tools. This also plays an important role in optimization loops, where numerical simulation results are used to improve the “performance” of a model.

Geometry deals with structured and efficient representations of information and with methods to process it. Its impact in animation, games and VAMR (Virtual, Augmented and Mixed Reality) is important. It also has a growing influence in e-commerce, where a consumer can evaluate, test and buy a product from its digital description. Geometric data produced, for instance, by 3D scanners, together with reconstructed models, are nowadays used to preserve heritage works in cultural or industrial domains.

Geometry is involved in many domains (manufacturing, simulation, communication, virtual worlds, ...), raising many challenging questions related to the representation of shapes, the analysis of their properties and the computation with these models. The stakes are multiple: accuracy in numerical engineering, simulation and optimization; quality in design and manufacturing processes; and the capacity to model and analyze physical problems.

The accurate description of shapes is a long-standing problem in mathematics, with an important impact in many domains, inducing strong interactions between geometry and computation. Developing precise geometric modeling techniques is a critical issue in CAD-CAM. Constructing accurate models that can be exploited in geometric applications from digital data produced by cameras, laser scanners, observations or simulations is also a major issue in geometry processing. A main challenge is to construct models that capture the geometry of complex shapes using few parameters while remaining precise.

Our first objective is to develop methods able to describe objects or phenomena of a geometric nature accurately and efficiently, using algebraic representations.

The approach followed in CAGD to describe complex geometry is based on parametric representations called NURBS (Non Uniform Rational B-Splines). The models are constructed by trimming and gluing together high order patches of algebraic surfaces. These models are built from the so-called B-spline functions, which encode a piecewise algebraic function with a prescribed regularity at the knots. Although these models have many advantages and have become the standard for designing CAD models, they also have important drawbacks. Among them are the difficulty of locally refining a NURBS surface, and the topological rigidity of NURBS patches, which forces the use of many trimmed patches to design complex models, with the consequence that cracks appear at the seams. To overcome these difficulties, an active area of research looks for new blending functions for the representation of CAD models. Examples include the so-called T-splines, LR-spline blending functions, and hierarchical splines, which have been devised recently to perform local refinement efficiently. An important problem is to analyze spline spaces associated to general subdivisions, which is of particular interest in higher order Finite Element Methods. Another challenge in geometric modeling is the efficient representation and/or reconstruction of complex objects, and the description of computational domains in numerical simulation. To construct models that can efficiently represent the geometry of complex shapes, we are interested in developing modeling methods based on alternative constructions such as skeleton-based representations. The change of representation, in particular between parametric and implicit representations, is of particular interest in geometric computations and in its applications in CAGD.
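As a small illustration of the B-spline functions mentioned above (a generic sketch, not code from any CAD system), the Cox–de Boor recursion evaluates the basis functions from a knot vector; the knot vector and degree below are arbitrary examples.

```python
# Minimal sketch: evaluating B-spline basis functions with the
# Cox-de Boor recursion, assuming a clamped knot vector.

def bspline_basis(i, p, u, knots):
    """Value of the i-th B-spline basis function of degree p at u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) \
               * bspline_basis(i, p - 1, u, knots)
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

# Clamped cubic knot vector on [0, 1]: five basis functions that are
# piecewise cubic, C^2 at the interior knot 0.5, and sum to 1.
knots = [0, 0, 0, 0, 0.5, 1, 1, 1, 1]
u = 0.3
total = sum(bspline_basis(i, 3, u, knots) for i in range(5))
```

The partition-of-unity property checked by `total` is what makes B-spline curves and surfaces lie in the convex hull of their control points.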

We also plan to investigate adaptive hierarchical techniques, which can locally improve the approximation of a shape or a function. They will be exploited to transform digital data produced by cameras, laser scanners, observations or simulations into accurate and structured algebraic models.

The precise and efficient representation of shapes also leads to the problem of extracting and exploiting characteristic properties of shapes such as symmetry, which is very frequent in geometry. Reflecting the symmetry of the intended shape in the representation appears as a natural requirement for visual quality, but also as a possible source of sparsity of the representation. Recognizing, encoding and exploiting symmetry requires new paradigms of representation and further algebraic developments. Algebraic foundations for the exploitation of symmetry in the context of nonlinear differential and polynomial equations are addressed. The intent is to bring this expertise with symmetry to the geometric models and computations developed by Aromath.

In many problems, digital data are approximate and cannot simply be used as if they were exact. In the context of geometric modeling, polynomial equations appear naturally as a way to describe constraints between the unknown variables of a problem. An important challenge is to take the input error into account in order to develop robust methods for solving these algebraic constraints. Robustness means that a small perturbation of the input should produce a controlled variation of the output, that is, forward stability, when the input-output map is regular. In non-regular cases, robustness also means that the output is an exact solution, or the most coherent solution, of a problem with input data in a given neighborhood, that is, backward stability.

Our second long term objective is to develop methods to robustly and efficiently solve algebraic problems that occur in geometric modeling.

Robustness is a major issue in geometric modeling and algebraic computation. Classical methods in computer algebra, based on the paradigm of exact computation, cannot be applied directly in this context. They are not designed for stability against input perturbations. New investigations are needed to develop methods which integrate this additional dimension of the problem. Several approaches are investigated to tackle these difficulties.

One approach relies on the linearization of algebraic problems, based on “elimination of variables”, i.e. projection into a space of smaller dimension. Resultant theory provides a strong foundation for these methods, connecting the geometric properties of the solutions with explicit linear algebra on polynomial vector spaces, for families of polynomial systems (e.g., homogeneous, multi-homogeneous, sparse). Important progress has been made in the last two decades to extend this theory to new families of problems with specific geometric properties. Additional advances have been achieved more recently by exploiting the syzygies between the input equations. This approach provides matrix-based representations, which are particularly powerful for approximate geometric computation on parametrized curves and surfaces. These methods are tuned to certain classes of problems, and an important issue is to detect and analyze degeneracies and to adapt the methods to such cases.
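The simplest instance of this resultant-based linearization is the classical univariate case, sketched below (a generic textbook construction, not the multi-homogeneous machinery discussed above): the Sylvester matrix turns the question “do two polynomials share a root?” into a determinant computation.

```python
# Sketch of resultant-based elimination: the Sylvester matrix of two
# univariate polynomials, whose determinant is their resultant
# (zero exactly when they share a root). Coefficients are listed
# from the leading term down.
import numpy as np

def sylvester(p, q):
    m, n = len(p) - 1, len(q) - 1        # degrees of p and q
    S = np.zeros((m + n, m + n))
    for i in range(n):                   # n shifted copies of p
        S[i, i:i + m + 1] = p
    for i in range(m):                   # m shifted copies of q
        S[n + i, i:i + n + 1] = q
    return S

# p = x^2 - 1 and q = x - 1 share the root x = 1: the resultant vanishes.
res = np.linalg.det(sylvester([1, 0, -1], [1, -1]))
```

For coprime inputs the determinant is nonzero, e.g. `sylvester([1, 0, -1], [1, -2])` has determinant 3, the product of q evaluated at the roots of p.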

A more adaptive approach involves linear algebra computation in a hierarchy of polynomial vector spaces. It produces a description of quotient algebra structures, from which the solutions of polynomial systems can be recovered. This family of methods includes Gröbner bases, which provide general tools for solving polynomial equations. Border bases are an alternative approach, offering numerically stable methods for solving polynomial equations with approximate coefficients. An important issue is to understand and control the numerical behavior of these methods as well as their complexity, and to exploit the structure of the input system.
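A tiny example of the Gröbner-basis route (assuming the SymPy library is available; the system chosen here is arbitrary): a lexicographic basis triangularizes the system, so its last element is univariate and the other variables follow by back-substitution.

```python
# Sketch: solving a small polynomial system through a lexicographic
# Groebner basis with SymPy.
from sympy import groebner, symbols

x, y = symbols('x y')
# Circle intersected with a line: x^2 + y^2 = 1, x = y.
G = groebner([x**2 + y**2 - 1, x - y], x, y, order='lex')
univariate = G.exprs[-1]     # polynomial in y alone, e.g. 2*y**2 - 1
```

The two real intersection points then come from the roots y = ±√2/2 of the univariate element, with x = y from the first basis element.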

In order to compute “only” the (real) solutions of a polynomial system in a given domain, duality techniques can also be employed. They consist in analyzing and adding constraints on the space of linear forms that vanish on the polynomial equations. Combined with semi-definite programming techniques, they provide efficient methods to compute the real solutions of algebraic equations or to solve polynomial optimization problems. The main issues are the completeness of the approach, its scalability with the degree and dimension, and the certification of bounds.

Singular solutions of polynomial systems can be analyzed by computing differentials which vanish at these points. This leads to efficient deflation techniques, which transform a singular solution of a given problem into a regular solution of a transformed problem. These local methods need to be combined with more global root localization methods.
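A toy one-dimensional sketch of deflation (illustrative only, far simpler than the multivariate techniques meant above): x = 0 is a double root of f(x) = x², so Newton's method converges only linearly; appending the vanishing differential f′ yields a system in which the same root is regular.

```python
# Sketch of deflation in one variable.
def newton(g, dg, x0, steps):
    x = x0
    for _ in range(steps):
        x -= g(x) / dg(x)
    return x

f, df = lambda x: x * x, lambda x: 2 * x
# Singular root: Newton halves the error each step (linear convergence),
# so after 20 steps from 1.0 the iterate is still only about 1e-6.
slow = newton(f, df, 1.0, 20)

# Deflated problem: f'(x) = 2x also vanishes at the root, and x = 0 is a
# REGULAR root of 2x, recovered exactly in a single Newton step.
fast = newton(lambda x: 2 * x, lambda x: 2.0, 1.0, 1)
```

The contrast between `slow` and `fast` is the whole point of deflation: restoring regularity restores fast, well-conditioned convergence.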

Subdivision methods are another family of methods of interest for robust geometric computation. They are based on exclusion tests, which certify that no solution exists in a domain, and inclusion tests, which certify the uniqueness of a solution in a domain. They have shown their strength in addressing many algebraic problems, such as isolating the real roots of polynomial equations or computing the topology of algebraic curves and surfaces. The main issue in these approaches is to deal with singularities and degenerate solutions.
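The exclusion/inclusion mechanism can be sketched in a few lines for a univariate polynomial with simple roots (a deliberately minimal version of such solvers, using interval Horner evaluation; it would loop forever on a multiple root, which is exactly the degeneracy issue noted above).

```python
# Minimal subdivision solver sketch: isolate real roots of a polynomial
# given by coefficients (leading term first), assuming simple roots.

def peval(coeffs, x):
    """Horner evaluation at a point."""
    v = 0.0
    for c in coeffs:
        v = v * x + c
    return v

def ieval(coeffs, lo, hi):
    """Interval Horner: an enclosure of {p(x) : x in [lo, hi]}."""
    L, H = 0.0, 0.0
    for c in coeffs:
        cand = (L * lo, L * hi, H * lo, H * hi)
        L, H = min(cand) + c, max(cand) + c
    return L, H

def isolate(coeffs, lo, hi):
    d = len(coeffs) - 1
    dcoeffs = [c * (d - i) for i, c in enumerate(coeffs[:-1])]  # p'
    pl, ph = ieval(coeffs, lo, hi)
    if pl > 0 or ph < 0:                      # exclusion test: p never vanishes
        return []
    dl, dh = ieval(dcoeffs, lo, hi)
    if dl > 0 or dh < 0:                      # p monotone on [lo, hi]
        if peval(coeffs, lo) * peval(coeffs, hi) < 0:
            return [(lo, hi)]                 # inclusion test: unique root
        return []
    mid = (lo + hi) / 2                       # otherwise subdivide
    return isolate(coeffs, lo, mid) + isolate(coeffs, mid, hi)

# x^2 - 3 on [-3, 3]: two isolating intervals, around -sqrt(3) and sqrt(3).
boxes = isolate([1.0, 0.0, -3.0], -3.0, 3.0)
```

Every returned interval is certified to contain exactly one root, which is the kind of guarantee that distinguishes these methods from plain numerical sampling.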

The main domain of applications that we consider for the methods we develop is Computer Aided Design and Manufacturing.

Computer-Aided Design (CAD) involves creating digital models defined by mathematical constructions, from geometric, functional or aesthetic considerations. Computer-Aided Manufacturing (CAM) uses the geometric design data to control the tools and processes that lead to the production of real objects from their numerical descriptions.

CAD-CAM systems provide tools for visualizing, understanding, manipulating, and editing virtual shapes. They are extensively used in many applications, including automotive, shipbuilding, aerospace industries, industrial and architectural design, prosthetics, and many more. They are also widely used to produce computer animation for special effects in movies, advertising and technical manuals, or for digital content creation. Their economic importance is enormous. Their importance in education is also growing, as they are increasingly used in schools and for educational purposes.

CAD-CAM has been a major driving force for research developments in geometric modeling, which have led to very large software systems, produced and sold by major companies, capable of assisting engineers in all steps from design to manufacturing.

Nevertheless, many challenges still need to be addressed. Many problems remain open, related to the use of efficient shape representations and of geometric models specific to certain application domains, such as architecture, naval engineering, mechanical construction and manufacturing. Important questions on the robustness and the certification of geometric computation are not yet answered. The complexity of the models used nowadays also calls for the development of new approaches. The manufacturing environment is also increasingly complex, with new types of machine tools including turning, 5-axis machining, wire EDM (Electrical Discharge Machining) and 3D printers. It cannot be properly used without computer assistance, which raises methodological and algorithmic questions. There is an increasing need to combine design and simulation, for analyzing the physical behavior of a model and for optimal design.

The field has changed deeply over the last decades, with the emergence of new geometric modeling tools built on dedicated packages, mixing different scientific areas to address specific applications. This provides new opportunities to apply new geometric modeling methods arising from research activities.

A major bottleneck in the CAD-CAM developments is the lack of interoperability of modeling systems and simulation systems. This is strongly influenced by their development history, as they have been following different paths.

The geometric tools have evolved from supporting a limited number of tasks at separate stages in product development and manufacturing, to being essential in all phases from initial design through manufacturing.

Current Finite Element Analysis (FEA) technology was already well established 40 years ago, when CAD systems just started to appear, and its success stems from using approximations of both the geometry and the analysis model with low order finite elements (most often of degree at most two).

No interoperability has been required between CAD and numerical simulation based on Finite Element Analysis, which has led to incompatible mathematical representations in CAD and FEA. This incompatibility makes the interoperability of CAD/CAM and FEA very challenging. Today, in the general case, this challenge is addressed by expensive and time-consuming human intervention and software development.

Improving this interaction by using adequate geometric and functional descriptions should boost the interaction between numerical analysis and geometric modeling, with important implications in shape optimization. In particular, it could provide a better feedback of numerical simulations on the geometric model in a design optimization loop, which incorporates iterative analysis steps.

The situation is evolving. In the past decade, a new paradigm has emerged to replace traditional Finite Elements by B-spline basis elements of any polynomial degree, thus in principle enabling the exact representation of all shapes that can be modeled in CAD. It has been demonstrated that this so-called isogeometric analysis approach can be far more accurate than traditional FEA.

It opens new perspectives for the interoperability between geometric modeling and numerical simulation. The development of numerical methods of high order using a precise description of the shapes raises questions on piecewise polynomial elements, on the description of computational domains and of their interfaces, on the construction of good function spaces to approximate physical solutions. All these problems involve geometric considerations and are closely related to the theory of splines and to the geometric methods we are investigating. We plan to apply our work to the development of new interactions between geometric modeling and numerical solvers.

Ioannis Emiris was elected President and General Director of the Research and Innovation Center "Athena", a Greek public, nation-wide research organization based in Athens.

The development of the G+Smo library (Geometry plus Simulation modules) has advanced significantly with the implementation of numerical simulation codes for linear and non-linear shell analysis using a variety of spline basis functions. In 2021, release v21.12 of the library was created.

In 25, we present a new algorithm for computing the real radical of an ideal I and, more generally, the S-radical of I, which is based on convex moment optimization. A truncated positive generic linear functional vanishing on the generators of I is computed by solving a Moment Optimization Problem (MOP). We show that, for a large enough degree of truncation, the annihilator of this functional generates the real radical of I. We give an effective, general stopping criterion on the degree to detect when the prime ideals lying over the annihilator are real, and we compute the real radical as the intersection of the real prime ideals lying over it. The method involves several ingredients that exploit the properties of generic positive moment sequences. A new efficient algorithm is proposed to compute a graded basis of the annihilator of a truncated positive linear functional. We propose a new algorithm to check that an irreducible decomposition of an algebraic variety is real, using a generic real projection to reduce to the hypersurface case. There we apply the Sign Changing Criterion, effectively performed with an exact MOP. Finally, we illustrate our approach on some examples.

In 17 we exploit structure in polynomial system solving by considering polynomials that are linear in subsets of the variables. We focus on algorithms and their Boolean complexity for computing isolating hyperboxes for all the isolated complex roots of well-constrained, unmixed systems of multilinear polynomials based on resultant methods. We enumerate all expressions of the multihomogeneous (or multigraded) resultant of such systems as a determinant of Sylvester-like matrices, aka generalized Sylvester matrices. We construct these matrices by means of Weyman homological complexes, which generalize the Cayley-Koszul complex. The computation of the determinant of the resultant matrix is the bottleneck for the overall complexity. We exploit the quasi-Toeplitz structure to reduce the problem to efficient matrix-vector multiplication, which corresponds to multivariate polynomial multiplication, by extending the seminal work on Macaulay matrices of Canny, Kaltofen, and Yagati to the multi-homogeneous case. We compute a rational univariate representation of the roots, based on the primitive element method. We present an algorithmic variant to compute the isolated roots of overdetermined and positive-dimensional systems. Thus our algorithms and complexity analysis apply in general with no assumptions on the input.
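The quasi-Toeplitz structure exploited above can be illustrated in the univariate case (a simplified analogue, not the multi-homogeneous construction of the paper): multiplying by a convolution (Toeplitz) matrix is exactly polynomial multiplication, so matrix-vector products with such matrices reduce to fast polynomial arithmetic.

```python
# Sketch: the convolution matrix of p acts on q's coefficient vector
# exactly as polynomial multiplication p*q.
import numpy as np

p = np.array([1.0, 2.0, 3.0])        # p(x) = 1 + 2x + 3x^2 (low degree first)
q = np.array([4.0, 5.0])             # q(x) = 4 + 5x

# Banded Toeplitz matrix whose columns are shifted copies of p.
M = np.zeros((len(p) + len(q) - 1, len(q)))
for j in range(len(q)):
    M[j:j + len(p), j] = p

via_matrix = M @ q                   # matrix-vector product
via_convolution = np.convolve(p, q)  # polynomial product p*q
```

Both computations give the coefficients of (1 + 2x + 3x²)(4 + 5x) = 4 + 13x + 22x² + 15x³, which is why structured matrix-vector multiplication is the right primitive for the complexity bounds discussed above.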

Interpolation is a prime tool in algebraic computation while symmetry is a qualitative feature that can be more relevant to a mathematical model than the numerical accuracy of the parameters. The article 21 shows how to exactly preserve symmetry in multivariate interpolation while exploiting it to alleviate the computational cost. We revisit minimal degree and least interpolation with symmetry adapted bases, rather than monomial bases. For a space of linear forms invariant under a group action, we construct bases of invariant interpolation spaces in blocks, capturing the inherent redundancy in the computations. With the so constructed symmetry adapted interpolation bases, the uniquely defined interpolant automatically preserves any equivariance the interpolation problem might have. Even with no equivariance, the computational cost to obtain the interpolant is alleviated thanks to the smaller size of the matrices to be inverted.

Sparse interpolation refers to the exact recovery of a function as a short linear combination of basis functions from a limited number of evaluations. For multivariate functions, the case of the monomial basis is well studied, as is now the basis of exponential functions. Beyond the multivariate Chebyshev polynomials obtained as tensor products of univariate Chebyshev polynomials, the theory of root systems allows one to define a variety of generalized multivariate Chebyshev polynomials that have connections to topics such as Fourier analysis and representations of Lie algebras. The article 18 presents a deterministic algorithm to recover a function that is a linear combination of at most r such polynomials from the knowledge of evaluations of the function at suitably chosen points.

This is joint work with Michael Singer, North Carolina State University, USA.
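The classical univariate mechanism behind such sparse recovery is Prony's method (shown here for exponential sums, the well-studied case the paragraph mentions; the Chebyshev setting of the article is more involved, and the values below are arbitrary test data): a Hankel system recovers the support, then the coefficients follow by linear algebra.

```python
# Sketch of Prony's method: recover a 2-sparse exponential sum
# f(k) = c1*r1^k + c2*r2^k from its first 4 values.
import numpy as np

r, c = np.array([2.0, 5.0]), np.array([3.0, -1.0])   # hidden support/coeffs
samples = np.array([c @ r**k for k in range(4)])

# The sequence satisfies a length-2 recurrence whose characteristic
# (Prony) polynomial z^2 + a1*z + a0 has roots r1, r2.
H = np.array([[samples[0], samples[1]],
              [samples[1], samples[2]]])
a = np.linalg.solve(H, -samples[2:4])                # a = [a0, a1]
recovered = np.sort(np.roots([1.0, a[1], a[0]]))     # support {2, 5}
```

Four evaluations suffice for sparsity 2; in general 2r evaluations determine an r-sparse sum, which is the "limited number of evaluations" flavor of the results above.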

We employ the m-Bézout bound for Distance Geometry problems, namely for estimating the number of embeddings of a simple weighted graph 26. We use combinatorial methods based on our previous work, which gave the first nontrivial bounds for this problem, thus offering the first improvement upon Bézout's bound after a few decades of research in this area. In this paper, we moreover examine the opposite direction, using combinatorial bounds to estimate the number of roots of a multihomogeneous well-constrained system when the latter has some extra structure.

This joint work is based on I. Emiris' invited plenary talk at CASC 2021.
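For reference, the standard m-Bézout bound itself (the textbook definition, not the refined combinatorial bounds of the paper) is easy to compute symbolically, assuming SymPy: for equations of multidegree (d_i1, ..., d_im) in m groups of n_1, ..., n_m variables, it is the coefficient of the monomial y_1^{n_1}...y_m^{n_m} in the product of the linear forms d_i1*y_1 + ... + d_im*y_m.

```python
# Sketch: the m-Bezout bound by coefficient extraction with SymPy.
from sympy import Poly, expand, prod, symbols

def m_bezout(degrees, group_sizes):
    ys = symbols(f'y0:{len(group_sizes)}')
    p = expand(prod(sum(d * y for d, y in zip(row, ys)) for row in degrees))
    mono = prod(y**n for y, n in zip(ys, group_sizes))
    return Poly(p, *ys).coeff_monomial(mono)

# Two bilinear equations in one x-variable and one y-variable:
# m-Bezout bound 2, versus the classical Bezout bound 2*2 = 4.
bound = m_bezout([[1, 1], [1, 1]], [1, 1])
```

This already shows the gap with the classical Bézout bound that the paper's combinatorial techniques sharpen further for distance geometry systems.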

Effective computation of resultants is a central problem in elimination theory and polynomial system solving. Commonly, we compute the resultant as a quotient of determinants of matrices and we say that there exists a determinantal formula when we can express it as a determinant of a matrix whose elements are the coefficients of the input polynomials. In 12, we study the resultant in the context of mixed multilinear polynomial systems, that is multilinear systems with polynomials having different supports, on which determinantal formulas were not known. We construct determinantal formulas for two kind of multilinear systems related to the Multiparameter Eigenvalue Problem (MEP): first, when the polynomials agree in all but one block of variables; second, when the polynomials are bilinear with different supports, related to a bipartite graph. We use the Weyman complex to construct Koszul-type determinantal formulas that generalize Sylvester-type formulas. We can use the matrices associated to these formulas to solve square systems without computing the resultant. The combination of the resultant matrices with the eigenvalue and eigenvector criterion for polynomial systems leads to a new approach for solving MEP.

This work on tensor product surfaces is joint with Falai Chen (University of Science and Technology of China at Hefei).

This work on reduced hypersurfaces is joint with Alexandru Dimca (UCA), Hal Schenck (Department of Mathematics and Statistics at Auburn University) and Gabriel Sticlaru (Faculty of Mathematics and Informatics at Constanta).

Many global implicit surface reconstruction algorithms formulate the problem as a volumetric energy minimization, trading data fitting for geometric regularization. As a result, the output surfaces may be located arbitrarily far away from the input samples. This is amplified when considering i) strong regularization terms, ii) sparsely distributed samples or iii) missing data. This breaks the strong assumption, commonly used by popular octree-based and triangulation-based approaches, that the output surface should be located near the input samples. As these approaches refine their cells near the input samples during a pre-process, the implicit solver deals with a domain discretization not fully adapted to the final isosurface. In 24, we relax this assumption and propose a progressive coarse-to-fine approach that jointly refines the implicit function and its representation domain, by iterating solver, optimization and refinement steps applied to a 3D Delaunay triangulation. This approach has several advantages: the discretized domain is adapted near the isosurface and optimized to improve both the solver conditioning and the quality of the output surface mesh, contoured via marching tetrahedra.

This is joint work with Tong Zhao (Titane), Pierre Alliez (Titane), Tamy Boubekeur (LTCI) and Jean-Marc Thiery (LTCI).

CAD models represented by NURBS surface patches are often hampered by defects due to inaccurate representations of trimming curves. Such defects make these models unsuitable for the direct generation of valid volume meshes, and often require trial-and-error processes to fix them. In 23, we propose a fully automated Delaunay-based meshing approach which can mesh and repair simultaneously, while being independent of the input NURBS patch layout. Our approach proceeds by Delaunay filtering and refinement, in which trimmed areas are repaired through implicit surfaces. Beyond repair, we demonstrate its capability to smooth out sharp features, defeature small details, and mesh multiple domains in contact.

This is joint work with Xiao Xiao (Titane), Pierre Alliez (Titane) and Laurent Rineau (GeometryFactory).

In 38, we consider the Voronoï diagram of a finite family of parallel half-lines with the same orientation, constrained to a compact domain, with respect to the Euclidean distance. We present an efficient approximation algorithm for computing such a diagram, using a subdivision process, which produces a mesh representing the topology of the Voronoï diagram in this domain. The computed topology may not be correct for degenerate configurations or for configurations close to degenerate. In this case, the output is a valid partition, which is close to the exact partition into Voronoï cells if the input data were given with no error. We also present the results of an implementation in the Julia language, with visualization of the algorithm using the Axl software (axl.inria.fr), and show some examples and analysis.

This is a joint work with Ibrahim Adamou, Departement de Mathématiques, Faculté des Sciences et Techniques, Université Dan Dicko Dankoulodo de Maradi.

The Symmetric Tensor Approximation problem (STA) consists of approximating a symmetric tensor or a homogeneous polynomial by a linear combination of symmetric rank-1 tensors or powers of linear forms of low symmetric rank. In 19, we present two new Riemannian Newton-type methods for the low rank approximation of symmetric tensors with complex coefficients. The first method uses the parametrization of the set of tensors of bounded rank by weights and unit vectors. Exploiting the properties of the apolar product on homogeneous polynomials combined with efficient tools from complex optimization, we provide an explicit and tractable formulation of the Riemannian gradient and Hessian, leading to Newton iterations with local quadratic convergence. We prove that, under some regularity conditions on non-defective tensors in the neighborhood of the initial point, the Newton iteration (completed with a trust-region scheme) converges to a local minimum. The second method is a Riemannian Gauss–Newton method on the Cartesian product of Veronese manifolds. An explicit orthonormal basis of the tangent space of this Riemannian manifold is described. We deduce the Riemannian gradient and the Gauss–Newton approximation of the Riemannian Hessian. We present a new retraction operator on the Veronese manifold. We analyze the numerical behavior of these methods, with an initial point provided by Simultaneous Matrix Diagonalisation (SMD). Numerical experiments show the good behavior of the two methods in different cases and in comparison with existing state-of-the-art methods.

This is a joint work with Houssam Khalil, Faculty of Sciences, Lebanese University.
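The problem being approximated can be illustrated with a much simpler method than the Riemannian Newton iterations above: a symmetric higher-order power iteration (not the paper's algorithm; real coefficients and a small tensor assumed) computing a symmetric rank-1 approximation λ·v⊗v⊗v of an order-3 symmetric tensor.

```python
# Sketch: symmetric rank-1 approximation of a symmetric 3-tensor by
# symmetric higher-order power iteration.
import numpy as np

def rank1_sym(T, iters=100):
    n = T.shape[0]
    v = np.ones(n) / np.sqrt(n)
    for _ in range(iters):
        w = np.einsum('ijk,j,k->i', T, v, v)   # contract T with v twice
        v = w / np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, v, v, v)
    return lam, v

# Build an exactly rank-1 symmetric tensor 2 * u (x) u (x) u and recover it.
u = np.array([3.0, 4.0]) / 5.0
T = 2.0 * np.einsum('i,j,k->ijk', u, u, u)
lam, v = rank1_sym(T)
```

On rank-1 input the iteration recovers the weight and unit vector exactly; on noisy tensors such iterations only find local minima, which is where second-order Riemannian methods such as those of the paper pay off.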

In 29 we develop geometric methods for modeling asset allocations. Portfolios correspond to points in a simplex whose dimension equals the number of assets or stocks, assuming the total invested quantity is fixed. This allows us to model crisis periods in stock markets by computing the tradeoff between expected returns and the volatility (variance) of returns by means of copulas. The computation of copulas uses the aforementioned geometric tools. In 16 we go one step further in portfolio management: estimating the probability distribution for achieving certain returns leads us to a new portfolio performance score, which reflects the probability of sufficiently high profit.
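The simplex picture can be made concrete in a few lines (a hedged sketch of the setting only; the copula computations of the papers are omitted, and the return figures are made up): portfolios over n assets are points of the standard simplex, which can be sampled uniformly to explore return/volatility tradeoffs.

```python
# Sketch: portfolios as points of the standard simplex, sampled uniformly.
import numpy as np

rng = np.random.default_rng(0)
n_assets, n_samples = 3, 10000

# Uniform distribution on the simplex = Dirichlet(1, ..., 1).
w = rng.dirichlet(np.ones(n_assets), size=n_samples)

mu = np.array([0.02, 0.05, 0.08])   # hypothetical expected asset returns
expected = w @ mu                   # expected return of each sampled portfolio
```

Each row of `w` is nonnegative and sums to 1, i.e. a full allocation of the fixed invested quantity across the three assets.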

In 15 we develop neural networks adapted to the problem of wind speed forecasting. The input consists of classical meteorological predictions, and the training is undertaken over more than 12 months at specific locations in Greece, in collaboration with a wind power producer. We introduce a novel architecture that combines a CNN, to capture the location's landscape, with a Recurrent NN (RNN), since wind speed is a time-dependent phenomenon. Our results improve the published state-of-the-art for 24-hour predictions. Note that, like most models today, our model relies on previous-day information.

In 28 we employ 2D kinetic Voronoï diagrams for autonomous and semi-autonomous ship routing in ports, straits and other narrow domains, relying on CGAL packages for such operations. Second, we propose capacitated vehicle routing algorithms with time windows for optimized ship routing: these methods have been developed for land routing, where vehicles have several capacity and loading constraints, and we expect that they may be similarly useful in shipping over a graph representing the various ports to be visited, including one or more depots or loading ports.

In 27 our purpose is to provide physicians with means for automated and fast recognition of airways diseases. In this work, we mainly focus on measures that can be easily recorded using a spirometer. The signals used in this framework are simulated using the linear bi-compartment model of the lungs. This allows us to simulate ventilation under the hypothesis of ventilation at rest (tidal breathing). By changing the resistive and elastic parameters, data samples are produced simulating healthy, fibrosis and asthma breathing. On this synthetic data, different machine learning models are tested and their performance is assessed. All but the Naive Bayes classifier show an accuracy of at least 99%. This is a proof of concept that machine learning can accurately differentiate diseases based on manufactured spirometry data, and it paves the way for further developments on the topic, notably testing the model on real data.

Modelling nonlinear phenomena in thin rubber shells calls for stretch-based material models, such as the Ogden model, which conveniently utilizes the eigenvalues of the deformation tensor. The derivation and implementation of such models have already been carried out in Finite Element Methods, but are still lacking in shell formulations based on Isogeometric Analysis, where the higher-order continuity of the spline basis is employed for improved accuracy. The article 22 fills this gap by presenting formulations of stretch-based material models for isogeometric Kirchhoff-Love shells. We derive general formulations based on an explicit treatment in terms of the derivatives of the strain energy density functions with respect to the principal stretches, for (in)compressible material models where the determination of eigenvalues as well as spectral basis transformations is required. Using several numerical benchmarks, we verify our formulations on invariant-based Neo-Hookean and Mooney-Rivlin models and on a stretch-based Ogden model. In addition, the model is applied to simulate the collapsing behaviour of a truncated cone and to simulate tension wrinkling of a thin sheet.

Shape optimization based on Isogeometric Analysis (IGA) has gained popularity in recent years. Performing shape optimization directly over the parameters defining the CAD geometry, such as the control points of a spline parametrization, opens up the prospect of seamlessly integrating a shape optimization step into the CAD workflow. One of the challenges when using IGA for shape optimization is that of maintaining a valid geometry parametrization of the interior of the domain during the optimization process, as the shape of the boundary is altered by the optimization algorithm. Existing methods impose constraints on the Jacobian of the parametrization to guarantee that it remains valid. The number of such validity constraints quickly becomes intractably large, especially when 3D shape optimization problems are considered. An alternative, and arguably simpler, approach is to formulate the isogeometric shape optimization problem in terms of both the boundary and the interior control points. In order to ensure a geometric parametrization of sufficient quality, a regularization term, such as the Winslow functional, is added to the objective function of the shape optimization problem. In 20 we illustrate the performance of these methods on the optimal design problem of electromagnetic reflectors and compare their performance. Both methods are implemented for multipatch geometries, using the IGA library G+Smo and the optimization library Ipopt. We find that the second approach performs comparably to a state-of-the-art method with respect to both the quality of the found solutions and the computational time, while in our experience its performance is more robust for coarse discretizations.

We have a research contract with the industrial partner GeometryFactory, in collaboration with the project-team Titane (Pierre Alliez). The post-doc of Xiao Xiao is funded by this research contract together with a PEPS from the labex AMIES. We continue to develop a robust algorithm with the capability to repair CAD models while meshing them, without requiring a valid surface mesh as input, and to generate volume meshes that are valid by design and independent of the input NURBS patch layout, without resorting to post-processing steps such as mesh quilting or remeshing.

Ioannis Emiris coordinates a research contract with the industrial partner ANSYS (Greece), in collaboration with Athena Research Center. MSc students P. Repouskos and T. Pappas, PhD candidate A. Chalkis and postdoc fellow I. Psarros are partially funded.

Electronic design automation (EDA) and the simulation of integrated circuits require robust geometric operations on thousands of electronic elements (capacitors, resistors, coils, etc.) represented by polyhedral objects in 2.5 dimensions, not necessarily convex. A special case concerns axis-aligned objects, but the real challenge is the general case. The project, extended into 2021, focuses on three axes: (1) efficient data structures and prototype implementations for storing the aforementioned polyhedral objects so that nearest neighbor queries are fast in the L-max (Chebyshev) metric, which is the primary focus of the contract; (2) random sampling of the free space among objects; (3) data-driven algorithmic design for problems concerning data structures and their construction and initialization.
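For intuition on the first axis, a k-d tree answers nearest neighbor queries in the L-max (Chebyshev) metric by setting the Minkowski order to infinity. The sketch below uses random points as a hypothetical stand-in for the contract's data, one representative point per polyhedral element; handling full polyhedral objects, the actual subject of the contract, requires more than a point-based structure.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
# Hypothetical stand-in: one representative point (e.g. a bounding-box
# centre) per polyhedral element in 2.5D, embedded in 3D.
points = rng.random((10_000, 3))
tree = cKDTree(points)

query = np.array([0.5, 0.5, 0.5])
# p=np.inf selects the Chebyshev (L-max) metric for the query.
dist, idx = tree.query(query, k=1, p=np.inf)
```

The same `query` call answers k-nearest-neighbor requests by raising `k`, which is the building block for the free-space sampling axis as well.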

This is a CIFRE collaboration between Schlumberger Montpellier (A. Azzedine) and Inria Sophia Antipolis (B. Mourrain). The PhD candidate is A. Belhachmi. The objective of the work is the development of a new spline-based, high-quality geomodeler for reconstructing the stratigraphy of geological layers from the adaptive and efficient processing of large terrain information.

Non-linear optimization problems arise in many real-life applications and in scientific areas such as operations research, control engineering, physics, information processing, economics, and biology. However, efficient computational procedures that can provide a guaranteed global optimum are still lacking. The project develops new polynomial optimization methods, combining moment relaxation procedures with computational algebraic tools, to address this type of problem. Recent advances in mathematical programming have shown that polynomial optimization problems can be approximated by sequences of Semi-Definite Programming (SDP) problems. This approach provides a powerful way to compute global solutions of non-linear optimization problems and to guarantee the quality of the computational results. On the other hand, advanced algebraic algorithms to compute all the solutions of polynomial systems, with efficient implementations for exact and approximate solutions, have been developed over the past twenty years. The network combines the expertise of active European teams working in these two domains to address important challenges in polynomial optimization and to show the impact of this research on practical applications.
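To make the sum-of-squares/SDP idea concrete, here is a toy illustration of our own construction (not POEMA code): the global minimum of p(x) = x^4 - 2x^2 + 2 is the largest lambda such that p - lambda admits a positive semidefinite Gram matrix in the monomial basis (1, x, x^2). For this small example the Gram family has a single free parameter c, so the semidefinite feasibility problem reduces to a scan over c plus a bisection on lambda; real solvers handle general SDPs directly.

```python
import numpy as np

def sos_lower_bound(c_grid, lam_lo=-10.0, lam_hi=10.0, tol=1e-9):
    """Best SOS lower bound for p(x) = x^4 - 2x^2 + 2.
    Writing p(x) - lam = m(x)^T Q m(x) with m(x) = (1, x, x^2) and
    matching coefficients leaves one free Gram parameter c.  We
    maximize lam subject to Q being positive semidefinite, scanning
    c and bisecting on lam."""
    def gram(c, lam):
        return np.array([[2.0 - lam, 0.0,          c  ],
                         [0.0,       -2.0 - 2 * c, 0.0],
                         [c,         0.0,          1.0]])

    def psd(Q):
        return np.linalg.eigvalsh(Q)[0] >= -1e-12

    best = -np.inf
    for c in c_grid:
        lo, hi = lam_lo, lam_hi
        if not psd(gram(c, lo)):
            continue                      # no valid Gram matrix for this c
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if psd(gram(c, mid)):
                lo = mid
            else:
                hi = mid
        best = max(best, lo)
    return best

bound = sos_lower_bound(np.linspace(-2.0, 0.0, 201))
```

The bound converges to 1.0, the true global minimum of p (attained at x = +/-1), certified by the PSD Gram matrix at c = -1.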

POEMA aims to train scientists at the interplay of algebra, geometry and computer science for polynomial optimization problems, and to foster scientific and technological advances by stimulating interdisciplinary and intersectorial knowledge exchange between algebraists, geometers, computer scientists and industrial actors facing real-life optimization problems.

GRAPES

GRAPES aims at considerably advancing the state of the art in Mathematics, CAD, and Machine Learning in order to promote game-changing approaches for generating, optimising, and learning 3D shapes, along with a multisectoral training for young researchers. The scientific goals of GRAPES rely on a multidisciplinary consortium composed of leaders in their respective fields. Top-notch research is also instrumental in forming the new generation of European scientists and engineers. Their disciplines span the spectrum from Computational Mathematics, Numerical Analysis, and Algorithm Design, up to Geometric Modelling, Shape Optimisation, and Deep Learning. This allows the 15 PhD candidates to follow either a theoretical or an applied track and to gain knowledge from both research and innovation through a nexus of intersectoral secondments and Network-wide workshops.

Horizontally, our results lead to open-source prototype implementations, software integrated into commercial libraries, as well as open benchmark datasets. These are indispensable for dissemination and training, but also promote innovation and technology transfer. Innovation relies on the active participation of SMEs, either as beneficiaries hosting ESRs or as associate partners hosting secondments. Concrete applications include simulation and fabrication, hydrodynamics and marine design, manufacturing and 3D printing, retrieval and mining, reconstruction and urban planning.

GdR EFI and GDM: Evelyne Hubert is part of the Scientific Committee of the GdR Equations Fonctionnelles et Interactions (gdrefi.math.cnrs.fr) and participates in the GdR Géometrie Differentielle et Mécanique (gdr-gdm.univ-lr.fr).

Laurent Busé, Angelos Mantzaflaris and Bernard Mourrain co-organized the first Software and Industrial GRAPES Workshop, December 6-10, 2021, at the premises of Inria Sophia. It included lectures and demo sessions on computational geometry and geometry processing (CGAL), isogeometric analysis (G+Smo) and machine learning (PyTorch). Industrial actors also gave talks and discussed some of their open problems.

Evelyne Hubert was a co-organizer of the FoCM webinar that ran monthly in 2021. After the cancellation of the FoCM conference in 2020, the webinar was conceived to feature the then planned plenary speakers. The talks were made available online.

Bernard Mourrain co-organized with M. Laurent (CWI) and V. Magron (LAAS) minisymposia on Positive Polynomials, Moments, and Applications at SIAM Conference on Applied Algebraic Geometry (AG21) (Aug 16 - 20, 2021, www.siam.org/conferences/cm/conference/ag21).

Laurent Busé co-organized the French Computer Algebra Days that took place at the CIRM, Luminy, March 1-5. He also co-organized, with D. Faenzi and A. Horing, a three-day meeting at the mathematics laboratory of UCA titled "Rencontre autour de syzygies Jacobiennes", June 29-July 1st. He also co-organized the AI & Companies Week (Mathematical Study Groups with Industry), November 22-26, 2021, at Campus SophiaTech, Sophia Antipolis, France.

Angelos Mantzaflaris was on the program committees of the SPM and GMP conferences.

Bernard Mourrain was on the program committees of the SPM and GMP conferences.

Bernard Mourrain reviewed for the ISSAC, MEGA and SPM conferences.

Laurent Busé reviewed for the SPM and SIGGRAPH conferences.

Angelos Mantzaflaris reviewed for the SIAM GD, ISSAC, SPM and GMP conferences.

Evelyne Hubert is on the editorial boards of the journal Foundations of Computational Mathematics
and of the Journal of Symbolic Computation. She is an appointed reviewer for Mathematical Reviews (MathSciNet).

Bernard Mourrain is associate editor of the Journal of Symbolic Computation and of the SIAM Journal on Applied Algebra and Geometry.

Laurent Busé is a member of the editorial team of Maple Transactions.

Evelyne Hubert was solicited for reviews by
the journal Foundations of Computational Mathematics,
the Journal of Symbolic Computation,
Symmetry, and
Frontiers in Earth Science, section Environmental Informatics and Remote Sensing.

Bernard Mourrain reviewed articles for
the Journal of Algebra,
Computer Aided Geometric Design,
the Journal of Computational and Applied Mathematics,
Constructive Approximation,
Foundations of Computational Mathematics,
the Journal of Scientific Computing,
the Journal of Symbolic Computation,
the SIAM Journal on Matrix Analysis and Applications, and
ACM Transactions on Mathematical Software.

Laurent Busé reviewed articles for
Mathematics of Computation,
Computer Aided Geometric Design,
the Journal of Combinatorial Algebra,
Communications in Mathematics and Statistics,
the Journal of Software for Algebra and Geometry,
Annales Henri Lebesgue, and
the Journal of Computational and Applied Mathematics.

Angelos Mantzaflaris reviewed articles for
Computer Methods in Applied Mechanics and Engineering,
Computers & Mathematics with Applications,
IEEE Transactions on Cloud Computing,
Computer-Aided Design,
Computer Aided Geometric Design,
Digital Signal Processing,
the Journal of Symbolic Computation,
the Journal of Computational and Applied Mathematics, and
the Journal of Computational Design and Engineering.

Evelyne Hubert was a keynote speaker at the Maple Conference (fr.maplesoft.com/mapleconference/2021).
She was also an invited speaker at the workshops Algebraic Combinatorics of the Symmetric Groups and Coxeter Groups (Cetraro, Italy),
Modern Analysis Related to Root Systems with Applications (CIRM, Marseille), and Moving Frames and their Modern Applications (BIRS, Banff, Canada).

Tobias Metzlaff was an invited speaker at the workshop Modern Analysis Related to Root Systems with Applications (CIRM, Marseille)
and at the POEMA industrial workshop at Inria SAM.

Bernard Mourrain was invited to give a talk at the online Sanya Workshop on Algebraic Geometry and Machine Learning (26-29 Jan. 2021, sites.google.com/view/agmlsanya/home), at Go 60, Pure & Applied Algebraic Geometry, an online conference celebrating Giorgio Ottaviani's 60th birthday (21-25 Jun. 2021, staff.polito.it/ada.boralevi/GO60), and at the online minisymposium on Optimization with Polynomials, Matrices and Tensors at the SIAM conference on Optimization (20-23 Jul. 2021, www.siam.org/conferences/cm/conference/op21).

Laurent Busé was invited to give a talk at the Mathematical Congress of the Americas 2021 (MCA 2021), University of Buenos Aires (online event), July 19-24, 2021, at the conference Algebraic Geometry in Ischia for Alexandru Dimca's retirement, Ischia, Italy, October 11-14.

Pablo Gonzalez-Mazon was invited to give a talk at the EACA "Tapas" seminar, titled "Criterios efectivos y clasificación de transformaciones bilineales y trilineales birracionales" (Effective criteria and classification of birational bilinear and trilinear transformations), May 6, and at the Seminari de Geometria Algebraica de Barcelona, titled "Trilinear birational maps", November 12.

Lorenzo Baldi was an invited speaker at the session Positive Polynomials, Moments, and Applications in the SIAM Conference on Applied Algebra and Geometry, online, and the POEMA industrial workshop at Inria SAM.

Evelyne Hubert has been appointed to the Board of Directors of the society Foundations of Computational Mathematics for the period 2021-2023.

Bernard Mourrain was vice chair of the SIAM Algebraic Geometry group.

Laurent Busé has been elected as co-chair, with Clément Pernet, of the GDR Calcul Formel.

Evelyne Hubert was appointed by SIAM Publications to the committee for the assessment of the SIAM Journal on Applied Algebra and Geometry.

Evelyne Hubert was part of the jury for hiring junior researchers at Inria Saclay – Île-de-France (CRCN and ISFP).

Bernard Mourrain was member of the BCEP (Bureau du Comité des Équipes Projet) of the center Inria - Sophia Antipolis.

Laurent Busé is a member of the board of AMIES.

Evelyne Hubert was a member of the jury for the PhD defense of Zhangchi Chen at Université Paris-Saclay on Differential invariants
of parabolic surfaces and of CR hypersurfaces; Directed harmonic currents near non-hyperbolic linearized singularities; Hartogs' type extension
of holomorphic line bundles; (Non-)invertible circulant matrices.

Bernard Mourrain was a reviewer for the Habilitation à Diriger des Recherches of V. Magron at Université Toulouse 3 Paul Sabatier, entitled "The quest of modeling, certification and efficiency in polynomial optimization".

Laurent Busé was a reviewer (and member of the PhD committee) for the PhD thesis of

and a member of the PhD committee of