Our daily life environment increasingly interacts with digital information. A significant amount of this information is geometric in nature: it concerns the representation of our environment, the analysis and understanding of "real" phenomena, and the control of physical mechanisms or processes. The interaction between the physical and digital worlds is two-way. Sensors produce digital data from measurements or observations of our environment, while digital models are used to "act" on the physical world. Objects that we use at home, at work, or to travel, such as furniture, cars and planes, are nowadays produced by industrial processes based on digital representations of shapes. CAD-CAM (Computer Aided Design – Computer Aided Manufacturing) software is used to represent the geometry of these objects and to control the manufacturing processes that create them. Construction capabilities themselves are also expanding with the development of 3D printers, which make it possible to create daily-life objects "at home" from digital models.

The impact of geometry is also important in the analysis and understanding of phenomena. The 3D conformation of a molecule explains its biological interaction with other molecules. The profile of a wing determines its aerodynamic behavior, while the shape of a bulbous bow can significantly decrease the wave resistance of a ship. Understanding such behavior or analyzing a physical phenomenon can nowadays be achieved for many problems by numerical simulation. The quality of these simulations depends closely on the precise representation of the geometry and on the link between the geometric models and the numerical computation tools. This also plays an important role in optimization loops, where numerical simulation results are used to improve the "performance" of a model.

Geometry deals with structured and efficient representations of information and with methods to process it. Its impact in animation, games and VAMR (Virtual, Augmented and Mixed Reality) is important. It also has a growing influence in e-commerce, where a consumer can evaluate, test and buy a product from its digital description. Geometric data produced, for instance, by 3D scanners, and the models reconstructed from them, are nowadays used to preserve works of cultural or industrial heritage.

Geometry is involved in many domains (manufacturing, simulation, communication, virtual worlds, ...), raising many challenging questions related to the representation of shapes, the analysis of their properties and computation with these models. The stakes are multiple: accuracy in numerical engineering, simulation and optimization; quality in design and manufacturing processes; and the capacity to model and analyze physical problems.

The accurate description of shapes is a long-standing problem in mathematics, with an important impact on many domains, inducing strong interactions between geometry and computation. Developing precise geometric modeling techniques is a critical issue in CAD-CAM. Constructing accurate models that can be exploited in geometric applications, from digital data produced by cameras, laser scanners, observations or simulations, is also a major issue in geometry processing. A main challenge is to construct models that capture the geometry of complex shapes using few parameters while remaining precise.

Our first objective is to develop methods able to describe, accurately and efficiently, objects or phenomena of geometric nature, using algebraic representations.

The approach followed in Computer Aided Geometric Design (CAGD) to describe complex geometry is based on parametric representations called NURBS (Non Uniform Rational B-Splines). The models are constructed by trimming and gluing together high order patches of algebraic surfaces. These models are built from the so-called B-Spline functions, which encode a piecewise algebraic function with a prescribed regularity at the knots. Although these models have many advantages and have become the standard for designing CAD models, they also have important drawbacks: among them, the difficulty of locally refining a NURBS surface, and the topological rigidity of NURBS patches, which forces the use of many trimmed patches to design complex models, with the consequence that cracks appear at the seams. To overcome these difficulties, an active area of research is the search for new blending functions for the representation of CAD models. Examples are the so-called T-Splines, LR-Spline blending functions, and hierarchical splines, which have recently been devised to perform local refinement efficiently. An important problem is to analyze spline spaces associated to general subdivisions, which is of particular interest for higher order Finite Element Methods. Another challenge in geometric modeling is the efficient representation and/or reconstruction of complex objects, and the description of computational domains in numerical simulation. To construct models that represent the geometry of complex shapes efficiently, we are interested in developing modeling methods based on alternative constructions such as skeleton-based representations. The change of representation, in particular between parametric and implicit representations, is of particular interest in geometric computations and in their applications in CAGD.
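The B-Spline functions mentioned above are defined by a knot vector and the Cox-de Boor recursion; repeated knots lower the regularity at that parameter. A minimal sketch (the knot vector and degree below are illustrative choices, not tied to any specific CAD model):

```python
# Cox-de Boor recursion for B-spline basis functions: the piecewise-polynomial
# blending functions underlying NURBS. Illustrative toy example.

def bspline_basis(i, p, t, knots):
    """Evaluate the i-th B-spline basis function of degree p at parameter t."""
    if p == 0:
        # Degree-0 basis: indicator of the half-open knot span [u_i, u_{i+1}).
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    # Recursive blend of two degree-(p-1) functions; a zero-length knot span
    # (repeated knot) contributes nothing.
    left = 0.0
    if knots[i + p] != knots[i]:
        left = (t - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, t, knots)
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - t) / (knots[i + p + 1] - knots[i + 1]) * bspline_basis(i + 1, p - 1, t, knots)
    return left + right

# Clamped cubic knot vector on [0, 1]: five basis functions that form a
# partition of unity at any interior parameter.
knots = [0, 0, 0, 0, 0.5, 1, 1, 1, 1]
p, n = 3, 5
t = 0.3
values = [bspline_basis(i, p, t, knots) for i in range(n)]
print(sum(values))  # partition of unity: 1.0 (up to rounding)
```

The same recursion, with rational weights attached to control points, yields the NURBS basis.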

We also plan to investigate adaptive hierarchical techniques, which can locally improve the approximation of a shape or a function. They shall be exploited to transform digital data produced by cameras, laser scanners, observations or simulations into accurate and structured algebraic models.

The precise and efficient representation of shapes also leads to the problem of extracting and exploiting characteristic properties of shapes such as symmetry, which is very frequent in geometry. Reflecting the symmetry of the intended shape in its representation appears as a natural requirement for visual quality, but also as a possible source of sparsity of the representation. Recognizing, encoding and exploiting symmetry require new representation paradigms and further algebraic developments. We address algebraic foundations for the exploitation of symmetry in the context of non-linear differential and polynomial equations, with the intent of bringing this expertise with symmetry to the geometric models and computations developed by AROMATH.

In many problems, digital data are approximate and cannot simply be used as if they were exact. In the context of geometric modeling, polynomial equations appear naturally as a way to describe constraints between the unknown variables of a problem. An important challenge is to take the input error into account in order to develop robust methods for solving these algebraic constraints. Robustness means that a small perturbation of the input should produce a controlled variation of the output, that is, forward stability, when the input-output map is regular. In non-regular cases, robustness also means that the output is an exact solution, or the most coherent solution, of a problem with input data in a given neighborhood, that is, backward stability.

Our second long term objective is to develop methods to robustly and efficiently solve algebraic problems that occur in geometric modeling.

Robustness is a major issue in geometric modeling and algebraic computation. Classical methods in computer algebra, based on the paradigm of exact computation, cannot be applied directly in this context. They are not designed for stability against input perturbations. New investigations are needed to develop methods which integrate this additional dimension of the problem. Several approaches are investigated to tackle these difficulties.

One relies on linearization of algebraic problems based on "elimination of variables" or projection into a space of smaller dimension. Resultant theory provides a strong foundation for these methods, connecting the geometric properties of the solutions with explicit linear algebra on polynomial vector spaces, for families of polynomial systems (e.g., homogeneous, multi-homogeneous, sparse). Important progress has been made in the last two decades to extend this theory to new families of problems with specific geometric properties. Additional advances have been achieved more recently to exploit the syzygies between the input equations. This approach provides matrix-based representations, which are particularly powerful for approximate geometric computation on parametrized curves and surfaces. These methods are tuned to certain classes of problems, and an important issue is to detect and analyze degeneracies and to adapt the methods to such cases.
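The simplest instance of this elimination-by-linear-algebra principle is the Sylvester resultant of two univariate polynomials: a determinant of a matrix of shifted coefficient rows that vanishes exactly when the polynomials share a root. A minimal sketch (the example polynomials are our own):

```python
import numpy as np

# Sylvester resultant: "elimination of variables" reduced to linear algebra.
# The resultant of f (degree m) and g (degree n) is the determinant of an
# (m+n) x (m+n) matrix built from shifted copies of their coefficients.

def sylvester_matrix(f, g):
    """Sylvester matrix of f, g given as coefficient lists, highest degree first."""
    m, n = len(f) - 1, len(g) - 1
    S = np.zeros((m + n, m + n))
    for i in range(n):            # n shifted copies of f
        S[i, i:i + m + 1] = f
    for i in range(m):            # m shifted copies of g
        S[n + i, i:i + n + 1] = g
    return S

def resultant(f, g):
    return np.linalg.det(sylvester_matrix(f, g))

f = [1, 0, -1]   # x^2 - 1
g = [1, -1]      # x - 1: shares the root x = 1 with f
h = [1, -3]      # x - 3: no common root with f
print(resultant(f, g))  # ~0: common root detected
print(resultant(f, h))  # nonzero: no common root
```

Multivariate resultants generalize this construction to systems of several polynomials, at the price of much larger structured matrices.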

A more adaptive approach involves linear algebra computation in a hierarchy of polynomial vector spaces. It produces a description of quotient algebra structures, from which the solutions of polynomial systems can be recovered. This family of methods includes Gröbner Basis, which provides general tools for solving polynomial equations. Border Basis is an alternative approach, offering numerically stable methods for solving polynomial equations with approximate coefficients. An important issue is to understand and control the numerical behavior of these methods as well as their complexity and to exploit the structure of the input system.
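As an illustration of how such a basis turns a polynomial system into a triangular one, here is a toy computation with sympy's `groebner` (the circle-and-line system is our own example, not from the text):

```python
# A lexicographic Groebner basis eliminates variables: for this toy system the
# basis contains a univariate polynomial in y, and back-substitution recovers
# all solutions. Requires sympy.
import sympy as sp

x, y = sp.symbols('x y')
F = [x**2 + y**2 - 2, x - y]          # circle of radius sqrt(2), line x = y
G = sp.groebner(F, x, y, order='lex') # triangular form: [x - y, y**2 - 1]
sols = sp.solve(F, [x, y])
print(list(G))
print(sols)  # [(-1, -1), (1, 1)]
```

Border bases produce a similar quotient-algebra description but remain numerically stable when the coefficients are known only approximately.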

In order to compute "only" the (real) solutions of a polynomial system in a given domain, duality techniques can also be employed. They consist in analyzing, and adding constraints on, the space of linear forms that vanish on the polynomial equations. Combined with semidefinite programming techniques, they provide efficient methods to compute the real solutions of algebraic equations or to solve polynomial optimization problems. The main issues are the completeness of the approach, its scalability with the degree and dimension, and the certification of bounds.

Singular solutions of polynomial systems can be analyzed by computing differentials which vanish at these points. This leads to efficient deflation techniques, which transform a singular solution of a given problem into a regular solution of the transformed problem. These local methods need to be combined with more global root localization methods.
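The idea of deflation can already be seen in one variable: Newton's method converges only linearly to a double root, while applying it to the differentiated equation restores fast convergence. A minimal sketch (the example equation is our own):

```python
# Deflation for a singular root, in one variable: Newton on f(x) = x^2
# converges only linearly to the double root x = 0 (the error halves each
# step), while Newton applied to the deflated equation f'(x) = 0 finds the
# same point as a regular root immediately.

def newton(f, df, x, steps):
    for _ in range(steps):
        x = x - f(x) / df(x)
    return x

f   = lambda x: x**2      # double root at x = 0
df  = lambda x: 2 * x     # derivative: the deflated equation
d2f = lambda x: 2.0

slow = newton(f, df, 1.0, 20)   # ~1e-6 after 20 steps: linear convergence
fast = newton(df, d2f, 1.0, 2)  # deflated problem: regular root, found at once
print(slow, fast)
```

In several variables, deflation augments the system with well-chosen partial derivatives instead of replacing it, but the mechanism is the same.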

Subdivision methods are another family of methods of interest for robust geometric computation. They are based on exclusion tests, which certify that no solution exists in a domain, and inclusion tests, which certify the uniqueness of a solution in a domain. They have shown their strength in addressing many algebraic problems, such as isolating the real roots of polynomial equations or computing the topology of algebraic curves and surfaces. The main issue in these approaches is to deal with singularities and degenerate solutions.
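A toy version of this scheme in one variable can use a naive interval evaluation as the exclusion test, and a sign change combined with a sign-constant derivative as the inclusion test. A minimal sketch (the polynomial and the crude interval bounds are our own illustrative choices, far simpler than the certified tests used in practice):

```python
# Subdivision root isolation: bisect an interval, discard it when an interval
# bound excludes 0 from the polynomial's range (exclusion test), and certify
# it when p changes sign while p' has constant sign (inclusion test: exactly
# one root). Coefficients are given highest degree first.

def horner(c, x):
    r = 0.0
    for a in c:
        r = r * x + a
    return r

def ihorner(c, lo, hi):
    """Naive interval Horner evaluation: encloses p([lo, hi])."""
    rlo, rhi = 0.0, 0.0
    for a in c:
        cands = [rlo * lo, rlo * hi, rhi * lo, rhi * hi]
        rlo, rhi = min(cands) + a, max(cands) + a
    return rlo, rhi

def derivative(c):
    n = len(c) - 1
    return [a * (n - i) for i, a in enumerate(c[:-1])]

def isolate(c, lo, hi, out, tol=1e-12):
    plo, phi = ihorner(c, lo, hi)
    if plo > 0 or phi < 0:
        return                      # exclusion: no root in [lo, hi]
    dlo, dhi = ihorner(derivative(c), lo, hi)
    if (dlo > 0 or dhi < 0) and horner(c, lo) * horner(c, hi) < 0:
        out.append((lo, hi))        # inclusion: exactly one root in [lo, hi]
        return
    if hi - lo < tol:
        return                      # give up on tiny undecided intervals
    mid = 0.5 * (lo + hi)
    isolate(c, lo, mid, out)
    isolate(c, mid, hi, out)

# x^3 - 3x + 1 has three real roots in [-2.5, 2.5]; each ends up in its own box.
c = [1.0, 0.0, -3.0, 1.0]
boxes = []
isolate(c, -2.5, 2.5, boxes)
print(boxes)
```

Singular (multiple) roots are exactly where this sketch breaks down: no sign change occurs, the inclusion test never fires, and the recursion bottoms out at `tol`, which is why handling degeneracies is the main issue named above.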

The main domain of applications that we consider for the methods we develop is Computer Aided Design and Manufacturing.

Computer-Aided Design (CAD) involves creating digital models defined by mathematical constructions, from geometric, functional or aesthetic considerations. Computer-aided manufacturing (CAM) uses the geometrical design data to control the tools and processes, which lead to the production of real objects from their numerical descriptions.

CAD-CAM systems provide tools for visualizing, understanding, manipulating, and editing virtual shapes. They are extensively used in many applications, including automotive, shipbuilding and aerospace industries, industrial and architectural design, prosthetics, and many more. They are also widely used to produce computer animation for special effects in movies, advertising and technical manuals, or for digital content creation. Their economic importance is enormous. Their importance in education is also growing, as they are more and more used in schools and for educational purposes.

CAD-CAM has been a major driving force for research in geometric modeling, which has led to very large software systems, produced and sold by major companies, capable of assisting engineers in all steps from design to manufacturing.

Nevertheless, many challenges still need to be addressed. Many problems remain open, related to the use of efficient shape representations and of geometric models specific to some application domains, such as architecture, naval engineering, mechanical construction and manufacturing. Important questions on the robustness and the certification of geometric computation are not yet answered. The complexity of the models used nowadays also calls for the development of new approaches. The manufacturing environment is increasingly complex as well, with new types of machine tools including turning, 5-axis machining, wire EDM (Electrical Discharge Machining) and 3D printing. It cannot be properly used without computer assistance, which raises methodological and algorithmic questions. There is an increasing need to combine design and simulation, for analyzing the physical behavior of a model and for optimal design.

The field has changed deeply over the last decades, with the emergence of new geometric modeling tools built on dedicated packages, mixing different scientific areas to address specific applications. This provides new opportunities to apply new geometric modeling methods resulting from research activities.

A major bottleneck in the CAD-CAM developments is the lack of interoperability of modeling systems and simulation systems. This is strongly influenced by their development history, as they have been following different paths.

The geometric tools have evolved from supporting a limited number of tasks at separate stages in product development and manufacturing, to being essential in all phases from initial design through manufacturing.

Current Finite Element Analysis (FEA) technology was already well established 40 years ago, when CAD systems just started to appear, and its success stems from using approximations of both the geometry and the analysis model with low order finite elements (most often of low degree).

Historically, there has been no compatibility requirement between CAD and numerical simulation based on Finite Element Analysis, leading to incompatible mathematical representations in CAD and FEA. This incompatibility makes the interoperability of CAD/CAM and FEA very challenging. Today, in the general case, this challenge is addressed by expensive and time-consuming human intervention and software development.

Improving this interaction by using adequate geometric and functional descriptions should boost the interaction between numerical analysis and geometric modeling, with important implications in shape optimization. In particular, it could provide a better feedback of numerical simulations on the geometric model in a design optimization loop, which incorporates iterative analysis steps.

The situation is evolving. In the past decade, a new paradigm has emerged that replaces traditional Finite Elements by B-Spline basis elements of any polynomial degree, thus in principle enabling the exact representation of all shapes that can be modeled in CAD. It has been demonstrated that this so-called isogeometric analysis approach can be far more accurate than traditional FEA.

It opens new perspectives for the interoperability between geometric modeling and numerical simulation. The development of numerical methods of high order using a precise description of the shapes raises questions on piecewise polynomial elements, on the description of computational domains and of their interfaces, on the construction of good function spaces to approximate physical solutions. All these problems involve geometric considerations and are closely related to the theory of splines and to the geometric methods we are investigating. We plan to apply our work to the development of new interactions between geometric modeling and numerical solvers.

Bernard Mourrain was named a Solid Modeling Fellow of the Solid Modeling Association in 2023.

G+Smo (Geometry + Simulation Modules, pronounced "gismo") is an open-source C++ library that brings together mathematical tools for geometric design and numerical simulation. It implements the relatively new paradigm of isogeometric analysis, which advocates a unified framework for the design and analysis pipeline. G+Smo is an object-oriented, cross-platform, templated C++ library that follows the generic programming principle, with a focus on both efficiency and ease of use. The library aims at providing access to high quality, open-source software to the growing isogeometric numerical simulation community and beyond, and at the seamless integration of Computer-Aided Design (CAD) and high order Finite Element Analysis (FEA).

The library and its documentation are available at https://gismo.github.io/

The MomentTools.jl package provides efficient tools to build convex relaxations of moment sequences and their dual Sum-of-Squares relaxations, to optimize vectors of moment sequences satisfying positivity or mass constraints, and to compute global minimizers of polynomial and moment optimization problems from moment sequences, polar ideals, and approximate real radicals. It also provides tools for computing minimum enclosing ellipsoids of basic semi-algebraic sets. It connects to SDP solvers via the JuMP interface.

The package is available at https://github.com/AlgebraicGeometricModeling/MomentTools.jl and its documentation at https://algebraicgeometricmodeling.github.io/MomentTools/

TensorDec is a Julia package for the decomposition of tensors and polynomial-exponential series. It provides tools to compute rank decompositions or Waring decompositions of symmetric tensors or multivariate homogeneous polynomials, and of multilinear tensors.

It also allows computing low rank tensor approximations of given tensors, using Riemannian optimization techniques with a well-chosen initial point. It further provides tools to compute the catalecticant or Hankel operators associated with tensors and their apolar ideals.

The package is accessible at https://github.com/AlgebraicGeometricModeling/TensorDec.jl and its documentation at https://algebraicgeometricmodeling.github.io/TensorDec.jl/.

Polynomial optimization is a fascinating field of study that has revolutionized the way we approach nonlinear problems described by polynomial constraints. The applications of this field range from production planning processes to transportation, energy consumption, and resource control. This introductory book explores the latest research developments in polynomial optimization, presenting the results of cutting-edge interdisciplinary work conducted by the European network POEMA. For the past four years, experts from various fields, including algebraists, geometers, computer scientists, and industrial actors, have collaborated in this network to create new methods that go beyond traditional paradigms of mathematical optimization. By exploiting new advances in algebra and convex geometry, these innovative approaches have resulted in significant scientific and technological advancements. The book 31 aims to make these exciting developments accessible to a wider audience by gathering high-quality chapters on these hot topics. Aimed at both aspiring and established researchers, as well as industry professionals, this book will be an invaluable resource for anyone interested in polynomial optimization and its potential for real-world applications. The coeditors of the book are M. Kocvara and C. Riener from POEMA network.

In chapter 32 of the book 31, together with S. Habibi and M. Kocvara from the POEMA network, we review applications of polynomial optimization techniques to geometric modeling problems. We present examples of topical problems in geometric modeling, illustrate their solution using polynomial optimization tools, report experimental results, and analyze the behavior of the methods, showing their strengths and limitations.

The aim of the book 30, co-authored with Fabrizio Catanese (Bayreuth University) and Elisa Postinghel (Trento University), is manifold. It intends to overview the wide topic of algebraic curves and surfaces (also with a view to higher dimensional varieties) from different aspects: the historical development that led to the theory of algebraic surfaces and the classification theorem of algebraic surfaces by Castelnuovo and Enriques; the use of such a classical geometric approach, as the one introduced by Castelnuovo, to study linear systems of hypersurfaces; and the algebraic methods used to find implicit equations of parametrized algebraic curves and surfaces, ranging from classical elimination theory to more modern tools involving syzygy theory and Castelnuovo-Mumford regularity. Since the subject has a long and venerable history, this book cannot cover all the details of this broad topic, in theory and applications, but it is meant to serve both as a guide for young mathematicians approaching the subject from a classical yet computational perspective, and as a valuable source of recent applications for experienced researchers.

In 20 we present a new construction of basis functions that generate the space of geometrically smooth splines on an unstructured quadrilateral mesh. The basis is represented in terms of biquintic Bézier polynomials on each quadrilateral face. The gluing along the face boundaries is achieved using quadratic gluing data functions, leading to globally G1-smooth spaces. We analyze the latter space and provide a combinatorial formula for its dimension as well as an explicit basis construction. Moreover, we assess the use of this basis in point cloud fitting problems. To apply G1 least squares fitting, a quadrilateral structure as well as parameters in each quadrilateral is required. Even though the general problem of segmenting and parametrizing point clouds is beyond the focus of the present work, we describe a procedure that produces such a structure as well as patch-local parameters. Our experiments demonstrate the accuracy and smoothness of the obtained reconstructed models in several challenging instances.

Inspired by the strengths of quadric error metrics initially designed for mesh decimation, we proposed in 29 a concise mesh reconstruction approach for 3D point clouds. Our approach proceeds by clustering the input points enriched with quadric error metrics, where the generator of each cluster is the optimal 3D point for the sum of its quadric error metrics. This approach favors the placement of generators on sharp features, and tends to equidistribute the error among clusters. We reconstruct the output surface mesh from the adjacency between clusters and a constrained binary solver. We combine our clustering process with an adaptive refinement driven by the error. Compared to prior art, our method avoids dense reconstruction prior to simplification and produces immediately an optimized mesh.
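The placement step in this approach has a simple closed form: the squared distance of a point x to a plane with unit normal n and offset d (so that n·x + d = 0 on the plane) is a quadric in x, and minimizing a sum of such quadrics amounts to a small linear solve. A minimal sketch, with illustrative planes meeting at a sharp corner (not data from the paper):

```python
import numpy as np

# Quadric error metric sketch: the squared distance of x to a plane (n, d) is
# (n.x + d)^2 = x^T A x + 2 b^T x + c with A = n n^T, b = d n, c = d^2.
# Quadrics of several planes add up, and the minimizer of the sum solves the
# linear system A x = -b. Planes meeting at a corner pin the optimal point
# onto the sharp feature, which is why generators snap to edges and corners.

planes = [(np.array([1.0, 0.0, 0.0]), -1.0),   # plane x = 1
          (np.array([0.0, 1.0, 0.0]), -2.0),   # plane y = 2
          (np.array([0.0, 0.0, 1.0]), -3.0)]   # plane z = 3

A = sum(np.outer(n, n) for n, d in planes)
b = sum(d * n for n, d in planes)
x = np.linalg.solve(A, -b)
print(x)  # the corner [1, 2, 3]: zero total error, exactly on the feature
```

In the clustering above, each cluster accumulates the quadrics of its points' supporting planes and uses this minimizer as its generator; a rank-deficient A (e.g. all planes parallel) is the degenerate case that needs regularization.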

In order to perform isogeometric analysis with increased smoothness on complex domains, trimming, variational coupling or unstructured spline methods can be used. The latter two classes of methods require a multi-patch segmentation of the domain and provide continuous bases along patch interfaces. In the context of shell modeling, variational methods are widely used, whereas the application of unstructured spline methods to shell problems is rather scarce. In 22 we therefore provide a qualitative and quantitative comparison of a selection of unstructured spline constructions, in particular the D-Patch and Almost-C1 constructions.

Reconstruction of highly accurate CAD models from point clouds is both paramount and challenging in industries such as aviation. Due to the acquisition process, this kind of data can be scattered and affected by noise, yet the reconstructed geometric models are required to be compact and smooth, while simultaneously capturing key geometric features of the engine parts. In 23, we present an iterative moving parameterization approach, which consists of alternating steps of surface fitting, parameter correction, and adaptive refinement using truncated hierarchical B-splines (THB-splines). We revisit two existing surface fitting methods, a global least squares approximation and a hierarchical quasi-interpolation scheme, both based on THB-splines. At each step of the adaptive loop, we update the parameter locations by solving a non-linear optimization problem to infer footpoints of the point cloud on the current fitted surface. We compare the behavior of different optimization settings for the critical task of distance minimization, by also relating the effectiveness of the correction step to the quality of the initial parameterization. In addition, we apply the proposed approach in the reconstruction of aircraft engine components from scanned point data. It turns out that the use of moving parameterization instead of fixed parameter values, when suitably combined with the adaptive spline loop, can significantly improve the resulting surfaces, thus outperforming state-of-the-art hierarchical spline model reconstruction schemes.

The publication 24 proposes a deep learning approach for parameterizing an unorganized or scattered point cloud in R^3 with graph convolutional neural networks. It builds upon a graph convolutional neural network that predicts the weights (called parameterization weights) of certain convex combinations, leading to a mapping of the 3D points into a planar parameter domain. First, we compute a radius neighbors graph that provides proximity information for each 3D point in the cloud. This radius graph is then converted to its line graph, which encodes edge adjacencies, and is equipped with appropriate weights. The line graph is used as input to a graph convolutional neural network trained to predict optimal parameterizations. The proposed model outperforms closed-form choices of the parameterization weights and produces high quality parameterizations for surface reconstruction schemes.
In a similar spirit, in 28 we propose a dimension independent method based on convolutional neural networks to assign parameter values to gridded point clouds of arbitrary size, without the need for additional data processing steps. We train the proposed networks by considering polynomial least squares approximations and demonstrate, both in the univariate and bivariate settings, that the accuracy of the final model properly scales when uniform and adaptive spline refinement is considered. A selection of numerical experiments on point clouds of different sizes highlights the performance of our parameterization scheme. Noisy data sets which simulate measurement errors are also considered.

The Finite Element (FE) modeling of a tooth flank affects both the accuracy of the results and the computational time and resources required. In FE Analysis (FEA), the involute curve is modeled by linear segments, thus introducing deviations from its actual geometry. In order to achieve accurate results, a very dense mesh must be generated, which increases the number of degrees of freedom and with it the computational cost. This inherent drawback of FEA unfavorably affects the calculation of the pressure distribution on the mating tooth flanks. Isogeometric Analysis (IGA) is a recent alternative to FEA. It uses B-Splines, a technology used by Computer Aided Design (CAD) systems, to model the geometry as well as the solution field; thus, no geometric error is introduced in the transition from CAD to analysis. An inherent characteristic of B-Splines is the continuity between adjacent elements. Furthermore, the smooth normal vector field of the surface is known at every point in the interior of elements. This is particularly advantageous for contact algorithms, because it alleviates the need for special smoothing techniques such as those often used in FEA. In the study 25, a spur gear pair is simulated and the pressure at the contact area is calculated. The results are compared to those obtained by FEA in terms of both accuracy and computational cost.

In 15 we applied basic algorithms for volume computation of general-dimensional polytopes and more general convex bodies, defined by the intersection of a simplex with a family of parallel hyperplanes, and another family of parallel hyperplanes or a family of concentric ellipsoids. Such convex bodies appear in modeling and predicting financial crises. The impact of crises on the economy (labor, income, etc.) makes their detection of prime interest for the public in general and for policy makers in particular. Certain features of dependencies in the markets clearly identify times of turmoil. We describe the relationship between asset characteristics by means of a copula; each characteristic is either a linear or quadratic form of the portfolio components, hence the copula can be estimated by computing volumes of convex bodies. We design and implement practical algorithms in the exact and approximate settings, and experimentally juxtapose them in order to study the trade-off between exactness and accuracy on the one hand and speed on the other. We also experimentally find efficient parameter tunings to achieve a sufficiently good estimation of the probability density of each copula. Our C++ software, based on Eigen and available on GitHub, is shown to be very effective in up to 100 dimensions. Our results offer novel, effective means of computing portfolio dependencies and an indicator of financial crises, which is shown to correctly identify past crises.

In 16 our most recent techniques tackle the problem of efficiently approximating the volume of convex polytopes, when these are given in 3 different representations: H-polytopes, which have been studied extensively, V-polytopes, and zonotopes (Z-polytopes). We design a novel practical Multiphase Monte Carlo algorithm that leverages random walks based on billiard trajectories, as well as a new empirical convergence test and a simulated annealing schedule of adaptive convex bodies. After tuning several parameters of our proposed method, we present a detailed experimental evaluation of our tuned algorithm using a rich dataset containing Birkhoff polytopes and polytopes from structural biology. Our open-source implementation tackles problems that have been intractable so far, offering the first software to scale up in thousands of dimensions for H-polytopes and in the hundreds for V- and Z-polytopes on moderate hardware. Last, we illustrate our software in evaluating Z-polytope approximations.
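For intuition, the simplest volume estimator is rejection sampling from a bounding box; the multiphase random-walk methods above exist precisely because this baseline degrades exponentially with dimension. A minimal sketch on a toy 3-dimensional body (our own example, unrelated to the datasets of the paper):

```python
import random

# Rejection-sampling volume estimate for an H-polytope: sample uniformly in a
# bounding box and count the fraction of hits. The body is the standard
# 3-simplex x, y, z >= 0, x + y + z <= 1, whose exact volume is 1/6; the
# bounding box [0,1]^3 has volume 1. In high dimension the hit rate collapses,
# which is what multiphase Monte Carlo with billiard walks avoids.

random.seed(0)
trials, hits = 200_000, 0
for _ in range(trials):
    x, y, z = random.random(), random.random(), random.random()
    if x + y + z <= 1.0:      # H-representation membership test
        hits += 1
estimate = hits / trials
print(estimate)  # close to 1/6 ~ 0.1667
```

Membership in an H-polytope is a cheap inequality check, as here; for V- and Z-polytopes even the membership oracle requires solving a linear program, which is one reason those representations are harder.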

In 17 we apply random walk technology to systems biology. Metabolic networks and their reconstruction have opened a new era in the analysis of the metabolic and growth functions of various organisms. By modeling the reactions occurring inside an organism, metabolic networks provide the means to understand the underlying mechanisms that govern biological systems. Constraint-based approaches have been widely used for the analysis of such models and have led to intriguing geometry-oriented challenges. In this setting, sampling points uniformly from polytopes derived from metabolic models (flux sampling) provides a representation of the solution space of the model under various conditions. However, the polytopes that result from such models are of high dimension (in the order of thousands) and usually considerably skinny. Therefore, sampling uniformly at random from such polytopes calls for a novel algorithmic and computational framework specially tailored to the properties of metabolic models. We present a complete software framework to handle sampling in metabolic networks. Its backbone is a Multiphase Monte Carlo Sampling (MMCS) algorithm that unifies rounding and sampling in one pass, yielding both upon termination. It exploits an optimized variant of the Billiard Walk that enjoys faster arithmetic complexity per step than the original. We demonstrate the efficiency of our approach by performing extensive experiments on various metabolic networks. Notably, sampling on the most complicated human metabolic network accessible today, Recon3D, corresponding to a polytope of dimension 5335, took less than 30 hours. To the best of our knowledge, this is out of reach for existing software.

Impressive progress in generative models and implicit representations has given rise to methods that can generate 3D shapes of high quality. However, the ability to locally control and edit shapes is another essential property that can unlock several content-creation applications. Local control can be achieved with part-aware models, but existing methods require 3D supervision and cannot produce textures. In 27, we devise PartNeRF, a novel part-aware generative model for editable 3D shape synthesis that does not require any explicit 3D supervision. Our model generates objects as a set of locally defined NeRFs, augmented with an affine transformation. This enables several editing operations, such as applying transformations on parts or mixing parts from different objects. To ensure distinct, manipulable parts, we enforce a hard assignment of rays to parts, which guarantees that the color of each ray is determined by a single NeRF. As a result, altering one part does not affect the appearance of the others. Evaluations on various ShapeNet categories demonstrate the ability of our model to generate editable 3D objects of improved fidelity, compared to previous part-based generative approaches that require 3D supervision or models relying on NeRFs.
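The hard ray-to-part assignment can be sketched in a few lines: each ray's colour comes only from the winning part, so editing one part cannot change rays owned by the others. Shapes and names below are illustrative, not the paper's actual interface.

```python
import numpy as np

def assign_rays_to_parts(part_weights, part_colors):
    """Hard assignment of each ray to a single part-NeRF.

    part_weights: (n_rays, n_parts)     per-part responsibility per ray
    part_colors:  (n_rays, n_parts, 3)  colour each part-NeRF predicts
    """
    owner = part_weights.argmax(axis=1)                  # winning part per ray
    colors = part_colors[np.arange(len(owner)), owner]   # (n_rays, 3)
    return owner, colors

# Usage: two rays, two parts
w = np.array([[0.1, 0.9], [0.8, 0.2]])
c = np.zeros((2, 2, 3))
c[0, 1] = 1.0   # part 1's colour for ray 0
c[1, 0] = 2.0   # part 0's colour for ray 1
owner, colors = assign_rays_to_parts(w, c)
```

Changing the colours of a part that does not own a ray leaves that ray's output untouched, which is what makes parts independently editable.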

In 26, we present Fairness Aware Counterfactuals for Subgroups (FACTS), a framework for auditing subgroup fairness through counterfactual explanations. We start by revisiting (and generalizing) existing notions and introducing new, more refined notions of subgroup fairness. We aim to (a) formulate different aspects of the difficulty individuals in certain subgroups face in achieving recourse, i.e. receiving the desired outcome, either at the micro level, considering members of the subgroup individually, or at the macro level, considering the subgroup as a whole, and (b) introduce notions of subgroup fairness that are robust, if not totally oblivious, to the cost of achieving recourse. We accompany these notions with an efficient, model-agnostic, highly parameterizable, and explainable framework for evaluating subgroup fairness. We demonstrate the advantages, the wide applicability, and the efficiency of our approach through a thorough experimental evaluation on different benchmark datasets.
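One micro-level quantity of the kind discussed above, the fraction of a subgroup whose members can achieve recourse within a given cost budget, admits a toy illustration. The function and the exact notion below are illustrative stand-ins; see the paper for the precise definitions.

```python
def recourse_effectiveness(costs_by_individual, budget):
    """Fraction of a subgroup's members for whom at least one candidate
    counterfactual within the cost budget achieves the desired outcome.
    (An illustrative micro-level notion, not FACTS' exact definition.)"""
    helped = sum(1 for costs in costs_by_individual
                 if costs and min(costs) <= budget)
    return helped / len(costs_by_individual)

# Per-individual counterfactual costs for two hypothetical subgroups
group_a = [[2.0, 5.0], [1.0], [7.0]]
group_b = [[6.0], [8.0, 9.0], [4.0]]
gap = recourse_effectiveness(group_a, 5.0) - recourse_effectiveness(group_b, 5.0)
```

A large gap between two subgroups at the same budget is the kind of disparity such notions are designed to surface.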

The main goal of athletes who undergo anterior cruciate ligament reconstruction (ACLR) surgery is a successful return to sport. At this stage, identifying muscular deficits becomes important. Hence, in 18, we introduce three discriminative features based on surface electromyographic (sEMG) signals acquired in a dynamic protocol to assess the damping ability and interpret activation patterns in the lower-limb muscles of ACLR athletes. The features are the median frequency of the power spectral density (PSD), the relative percentage of the equivalent damping or equivalent stiffness derived from the median frequency, and the energy of the signals in the time-frequency plane of the pseudo-Wigner-Ville distribution (PWVD). To evaluate the features, 11 healthy and 11 ACLR athletes (6 months post-reconstruction surgery) were recruited, and sEMG signals were acquired from the medial and lateral parts of the hamstrings, quadriceps, and gastrocnemius muscles in pre- and post-fatigue single-leg landings. Evaluating the proposed features reveals a significant damping deficiency in the hamstring muscles of ACLR athletes, indicating that more attention should be paid to this muscle group in pre-return-to-sport rehabilitation. We conclude that the quality of electromyography-based pre-return-to-sport assessments of ACLR subjects depends on the sEMG acquisition protocol, as well as on the type and nature of the extracted features. Hence, the combined application of energy-based features (derived from the PWVD) and power-based features (derived from the PSD) could facilitate the assessment process by providing additional biomechanical information on the behavior of the muscles surrounding the knee.
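The first of these features, the median frequency of the PSD, is the frequency that splits the total spectral power in half. The sketch below uses a plain periodogram as a minimal stand-in for the PSD estimator used in the study.

```python
import numpy as np

def median_frequency(signal, fs):
    """Median frequency of the power spectral density: the frequency at
    which the cumulative spectral power reaches half of the total."""
    psd = np.abs(np.fft.rfft(signal)) ** 2          # periodogram PSD estimate
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    cum = np.cumsum(psd)
    return freqs[np.searchsorted(cum, cum[-1] / 2.0)]

# Usage: a pure 50 Hz tone sampled at 1 kHz has median frequency 50 Hz
t = np.arange(1000) / 1000.0
mf = median_frequency(np.sin(2 * np.pi * 50 * t), fs=1000.0)
```

A shift of this quantity toward lower frequencies during sustained contraction is the classical electromyographic marker of muscle fatigue, which is why it is tracked in pre- and post-fatigue landings.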

Ioannis Emiris coordinates a research contract with the industrial partner ANSYS Inc. (Greece), in collaboration with Athena Research Center. MSc students P. Repouskos, M. Dioletis, and T. Pappas, postdoc fellow I. Psarros, and Athena researcher George Ioannakis are partially funded.

Electronic design automation (EDA) and the simulation of integrated circuits require robust geometric operations on thousands of electronic elements (capacitors, resistors, coils, etc.) represented by polyhedral objects in 2.5 dimensions, not necessarily convex. A special case concerns axis-aligned objects, but the real challenge is the general case. The project, extended into 2024, focuses on three axes: (1) efficient data structures and prototype implementations for storing the aforementioned polyhedral objects so that nearest-neighbor queries are fast in the L-max metric, which is the primary focus of the contract; (2) random sampling of the free space among objects; (3) data-driven algorithmic design for problems concerning data structures and their construction and initialization. The implementation of prototypes has led to the development of a software library including implementations of parallel algorithms (CUDA).
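For the axis-aligned special case mentioned above, the L-max (Chebyshev) distance from a query point to a box has a closed form, which gives a brute-force nearest-neighbor baseline. This is an illustrative sketch, not the contract's optimized data structure.

```python
import numpy as np

def linf_dist_point_box(q, lo, hi):
    """Chebyshev (L-max) distance from point q to the axis-aligned box
    [lo, hi]; zero if q lies inside the box."""
    return np.maximum(np.maximum(lo - q, q - hi), 0.0).max()

def nearest_box(q, boxes):
    """Brute-force nearest-neighbor query over a list of (lo, hi) boxes."""
    dists = [linf_dist_point_box(q, lo, hi) for lo, hi in boxes]
    i = int(np.argmin(dists))
    return i, dists[i]

# Usage: query the origin against two boxes
q = np.array([0.0, 0.0])
boxes = [(np.array([1.0, 1.0]), np.array([2.0, 2.0])),
         (np.array([3.0, 0.0]), np.array([4.0, 1.0]))]
idx, dist = nearest_box(q, boxes)
```

The per-dimension clamp-to-zero is what makes the distance vanish for interior points; the data structures developed in the project replace the linear scan with fast queries.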

It has continued into 2023, along with a tripartite grant from the Greek Ministry of Development (Athena RC, ANSYS, and the University of Patras, Greece).

CIFRE collaboration between Schlumberger Montpellier (A. Azzedine) and Inria Sophia Antipolis (B. Mourrain). The PhD candidate is A. Belhachmi. The objective of the work is the development of a new spline-based, high-quality geomodeler for reconstructing the stratigraphy of geological layers from the adaptive and efficient processing of large terrain data.

OpenReality is a startup company that is developing curved photonic optics to enable high-fidelity Augmented Reality solutions. In the frame of a consulting contract we collaborated on the development of 3D curved photonic surfaces represented by splines.

Collaboration with the University of Florence on efficient high-order learning for deep geometric design networks. The research program focuses on the interaction between the computational side of geometric models and their application-oriented side, devoted to the design and analysis of efficient adaptive spline approximation schemes. The work plan addresses important challenges in the area of geometric modeling and processing, creating a connection with suitable machine learning applications.
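The basic building block of such spline approximation schemes is B-spline evaluation, classically done by de Boor's algorithm. The following is a textbook sketch, unrelated to any specific project code.

```python
import numpy as np

def de_boor(knots, coeffs, degree, x):
    """Evaluate a B-spline curve at x via de Boor's algorithm:
    repeated convex combinations of neighbouring control points."""
    # find knot span k with knots[k] <= x < knots[k+1], clamped to valid range
    k = np.searchsorted(knots, x, side="right") - 1
    k = min(max(k, degree), len(coeffs) - 1)
    d = [np.asarray(coeffs[j + k - degree], dtype=float)
         for j in range(degree + 1)]
    for r in range(1, degree + 1):
        for j in range(degree, r - 1, -1):
            i = j + k - degree
            denom = knots[i + degree - r + 1] - knots[i]
            alpha = (x - knots[i]) / denom if denom else 0.0
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[degree]

# Usage: a clamped quadratic with control points 0, 1, 0 peaks at 0.5
val = de_boor([0, 0, 0, 1, 1, 1], [0.0, 1.0, 0.0], 2, 0.5)
```

Because every step is a convex combination, the evaluation is numerically stable, one reason splines are attractive as building blocks for learning-based geometric design.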

POEMA project on cordis.europa.eu

Non-linear optimization problems arise in many real-life applications and in scientific areas such as operations research, control engineering, physics, information processing, economics, and biology. However, efficient computational procedures that can provide a guaranteed global optimum are lacking. The project develops new polynomial optimization methods, combining moment relaxation procedures with computational algebraic tools, to address this type of problem. Recent advances in mathematical programming have shown that polynomial optimization problems can be approximated by sequences of Semi-Definite Programming (SDP) problems. This approach provides a powerful way to compute global solutions of non-linear optimization problems and to guarantee the quality of the computational results. On the other hand, advanced algebraic algorithms to compute all the solutions of polynomial systems, with efficient implementations for exact and approximate solutions, have been developed over the past twenty years.
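The moment relaxation alluded to above is the Lasserre hierarchy. For the problem of minimizing a polynomial $f(x)=\sum_\alpha f_\alpha x^\alpha$ over $K=\{x : g_1(x)\ge 0,\dots,g_m(x)\ge 0\}$, its order-$r$ relaxation is the SDP (standard textbook formulation, stated here for context):

```latex
\min_{y} \; \sum_{\alpha} f_\alpha \, y_\alpha
\quad \text{s.t.} \quad
M_r(y) \succeq 0, \qquad
M_{r - \lceil \deg g_i / 2 \rceil}(g_i \, y) \succeq 0 \;\; (i = 1, \dots, m), \qquad
y_0 = 1,
```

where $y=(y_\alpha)$ is the candidate moment sequence and $M_r(y)$, $M_r(g_i\,y)$ denote the moment and localizing matrices. As the order $r$ grows, the optimal values of these SDPs converge to the global minimum under mild assumptions on $K$, which is what makes the approach a certificate-producing alternative to local non-linear solvers.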

The network combines the expertise of active European teams working in these two domains to address important challenges in polynomial optimization and to demonstrate the impact of this research on practical applications. The network will train a new generation of 15 young researchers to master high-level mathematics, algorithm design, scientific computing, and software development, and to solve optimization problems for real-world applications. It will advance research on algebraic methods for moment approaches, tackle mixed-integer non-linear optimization problems, and enhance the efficiency and robustness of moment relaxation methods. Specific applications of these approaches relate to smart-city challenges, such as water distribution network management, energy flow in power systems, and urban traffic management, as well as to oceanography, environmental monitoring, and finance.

GRAPES project on cordis.europa.eu

GRAPES aims at considerably advancing the state of the art in Mathematics, Computer-Aided Design, and Machine Learning in order to promote game-changing approaches for generating, optimizing, and learning 3D shapes, along with multisectoral training for young researchers. Recent advances in the above domains have solved numerous tasks concerning multimedia and 2D data. However, the automation of 3D geometry processing and analysis lags severely behind, despite its importance in science, technology, and everyday life, and the well-understood underlying mathematical principles. The CAD industry, although well established for more than 20 years, urgently requires advanced methods and tools for addressing new challenges.

The scientific goal of GRAPES is to bridge this gap based on a multidisciplinary consortium composed of leaders in their respective fields. Top-notch research is also instrumental in forming the new generation of European scientists and engineers. Their disciplines span the spectrum from Computational Mathematics, Numerical Analysis, and Algorithm Design, up to Geometric Modelling, Shape Optimisation, and Deep Learning. This allows the 15 PhD candidates to follow either a theoretical or an applied track and to gain knowledge from both research and innovation through a nexus of intersectoral secondments and Network-wide workshops.

Horizontally, our results lead to open-source prototype implementations, software integrated into commercial libraries, and open benchmark datasets. These are indispensable for dissemination and training, but also promote innovation and technology transfer. Innovation relies on the active participation of SMEs, either as beneficiaries hosting an ESR or as associate partners hosting secondments. Concrete applications include simulation and fabrication, hydrodynamics and marine design, manufacturing and 3D printing, retrieval and mining, reconstruction and visualisation, and urban planning and autonomous driving.

Angelos Mantzaflaris co-organized with Felix Scholz the mini-symposium “Deep learning for geometric design” at the SIAM Conference on Computational Geometric Design (GD23), Genova, Italy, July 3-7, 2023.

Angelos Mantzaflaris co-organized with Victor Calo, Pablo Antolin and Mattia Tani the mini-symposium “Fast formation and solution techniques for large-scale IGA” at the 11th International Conference on Isogeometric Analysis (IGA 2023) in Lyon, France, June 18-21, 2023.

Angelos Mantzaflaris co-organized with Mattia Tani, John Evans and Stefan Takacs the mini-symposium “Efficient methods for Isogeometric Analysis” at the 10th International Congress on Industrial and Applied Mathematics (ICIAM), Tokyo, Japan, August 20-24, 2023.

Bernard Mourrain was

Evelyne Hubert was requested for

Laurent Busé was

Angelos Mantzaflaris is a member of the Bureau of AMIES (Agence pour les Mathématiques en Interaction avec l'Entreprise et la Société) and member of the Comité du Centre of Inria d'Université Côte d'Azur in 2023.

Bernard Mourrain is a member of the Bureau du Comité des Equipes Projets (BCEP).

Laurent Busé is co-chair, with Clément Pernet, of the French computer algebra research group of CNRS (GT Calcul Formel du GDR Informatique-Mathématiques).

Bernard Mourrain was

Evelyne Hubert

Laurent Busé