Section: Overall Objectives
Overview
Computer arithmetic studies how a machine can handle numbers. This is a wide field with many aspects: from the mathematics of numbers, their representation, and the operations on them, through the algorithms for processing numbers, to the technologies used to build the machine. In addition, there are many different types of numbers (chiefly integers, reals, complex numbers, and elements of finite fields), many operations defined by algebra over these number sets, and many possible machine representations of these numbers. Some of these representations are only approximate, which raises safety issues. Finally, number processing takes place in the context of applications, which define constraints or costs that have to be optimized.
The overall objective of AriC is, through computer arithmetic, to improve computing at large, in terms of performance, efficiency, and reliability.
This requires mastering the broad range of expertise listed above. The AriC project addresses this challenge in breadth, spanning computer arithmetic along three structural axes:
from the high-level specification of a computation to the lower-level details of its implementation,
reconciling performance and numerical quality, both when building operators and when using existing operators,
developing the mathematical and algorithmic foundations of computing.
More than research directions in themselves, these three axes structure the links between our individual research directions.
This in-breadth approach to computer arithmetic is the distinguishing feature of the AriC project, and its main strength. Other computer arithmetic teams have a much narrower focus (e.g., hardware arithmetic, floating-point algorithms, arithmetic for cryptography, or formal proof of computer arithmetic). Indeed, most members of the computer arithmetic community belong to teams that do not focus on computer arithmetic.
With respect to computing at large, our originality is our computer arithmetic focus. We believe that a deep understanding of the arithmetic of machine numbers (taken for what they are, not merely as approximations of integers or real numbers) is critical to addressing many challenges of numerics and computing, from reliability (e.g., avoiding overflow or critical loss of precision) to performance.
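As a small illustration of the reliability issues mentioned above (an illustrative sketch in Python, not an example drawn from the AriC project's own work), IEEE 754 double-precision floating-point numbers can both silently absorb small terms and silently overflow to infinity:

```python
# Illustrative sketch of two classic pitfalls of IEEE 754 binary64
# ("double") arithmetic, the default float type in Python.

# 1) Absorption / loss of precision: near 1e16, the spacing between
#    consecutive doubles is 2, so adding 1 has no effect at all.
small_lost = (1e16 + 1.0) - 1e16
print(small_lost)   # prints 0.0, not 1.0

# 2) Overflow: doubles top out near 1.8e308; past that, the result
#    silently becomes infinity instead of raising an error.
too_big = 1e308 * 10.0
print(too_big)      # prints inf
```

Both results are correct according to the floating-point standard, yet surprising if machine numbers are naively treated as real numbers, which is precisely why a dedicated understanding of their arithmetic matters for reliability.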