Overall Objectives

Many phenomena of interest are analyzed and controlled through graphs or n-dimensional images. Often, these graphs have an irregular aspect, whether the studied phenomenon is of natural or artificial origin. In the first class, one may cite natural landscapes, most biological signals and images (EEG, ECG, MR images, ...), and temperature records. In the second class, prominent examples include financial logs and TCP traces.

Such irregular phenomena are usually not adequately described by purely deterministic models, and a probabilistic ingredient is often added. Stochastic processes make it possible to take into account, on a firm theoretical basis, the numerous microscopic fluctuations that shape the phenomenon.

In general, it is wrong to regard irregularity as a mere epiphenomenon that can conveniently be dealt with by introducing randomness. In many situations, and in particular in some of the examples mentioned above, irregularity is a core ingredient that cannot be removed without destroying the phenomenon itself. In some cases, irregularity is even a necessary condition for proper functioning. A striking example is the ECG: an ECG is inherently irregular and, moreover, in a mathematically precise sense, an increase in its regularity is strongly correlated with a degradation of the heart's condition.
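
To make this quantitative idea concrete, here is a minimal, self-contained sketch of one classical way to measure the roughness of a sampled signal: estimating its Hurst exponent by regressing the empirical variogram against the lag. This is an illustration only, not the project's actual methodology, which relies on finer local notions such as pointwise Hölder exponents; the function name hurst_variogram and the synthetic signals are ours.

```python
import numpy as np

def hurst_variogram(x, max_lag=20):
    """Estimate the Hurst exponent H of a sampled path by regressing
    log E|x(t+h) - x(t)|^2 on log h: for fractional Brownian motion
    this second moment scales as h^(2H), so the slope equals 2H.
    Lower H means a rougher, more irregular path."""
    lags = np.arange(1, max_lag + 1)
    v = [np.mean((x[h:] - x[:-h]) ** 2) for h in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(v), 1)
    return slope / 2.0

rng = np.random.default_rng(0)
rough = np.cumsum(rng.standard_normal(10_000))               # random walk: H near 0.5
smooth = np.cumsum(np.cumsum(rng.standard_normal(10_000)))   # integrated walk: H near 1
print(hurst_variogram(rough))   # approximately 0.5
print(hurst_variogram(smooth))  # close to 1
```

With such an index, "an increase in regularity" becomes an observable quantity: for instance, a drift of the estimated exponent toward higher values over successive windows of a signal would indicate smoothing of the kind the ECG example refers to.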

In fact, in various situations, irregularity is a crucial feature that can be used to assess the behaviour of a given system. For instance, irregularity may be the result of two or more sub-systems acting concurrently to achieve some kind of equilibrium. Examples of this abound in nature (e.g. the sympathetic and parasympathetic systems in the regulation of the heart). For artifacts such as financial logs and TCP traffic, irregularity is in a sense an unwanted feature, since it typically makes regulation more complex; it is, however, again a necessary one. For instance, efficiency in financial markets requires a constant flow of information among agents, which manifests itself through permanent fluctuations of prices: irregularity simply reflects the evolution of this information.

The aim of Regularity is to develop a coherent set of methods for modeling such “essentially irregular” phenomena, with a view to managing the uncertainties entailed by their irregularity.

Indeed, essential irregularity makes phenomena more difficult to study in terms of their description, modeling, prediction and control. It introduces uncertainties both in the measurements and in the dynamics. It is, for instance, obviously easier to predict the short-time behaviour of a smooth (e.g. C¹) process than that of a nowhere differentiable one. Likewise, sampling rough functions yields less precise information than sampling regular ones. As a consequence, when dealing with essentially irregular phenomena, uncertainties are fundamental in the sense that one cannot hope to remove them through a more careful analysis or a more adequate model. The study of such phenomena therefore requires specific approaches that manage these inherent uncertainties efficiently.
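
As a toy illustration of the prediction claim (our own construction, not a result of the project), the sketch below compares one-step-ahead linear extrapolation on a smooth path and on a Brownian-type path sampled at the same rate; the helper extrapolation_error is a hypothetical name.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 2000)
dt = t[1] - t[0]
smooth = np.sin(t)                                            # infinitely differentiable path
rough = np.cumsum(rng.standard_normal(t.size)) * np.sqrt(dt)  # Brownian-type path

def extrapolation_error(x):
    """One-step-ahead linear extrapolation: predict x[k+1] from
    x[k] and x[k-1], and return the RMS prediction error
    normalized by the overall scale of the signal."""
    pred = 2 * x[1:-1] - x[:-2]
    return np.sqrt(np.mean((x[2:] - pred) ** 2)) / np.std(x)

print(extrapolation_error(smooth))  # tiny: smooth paths extrapolate well
print(extrapolation_error(rough))   # orders of magnitude larger at the same sampling rate
```

The gap between the two errors does not shrink to zero by refining the analysis: for the rough path it is driven by the variance of the next increment, which is exactly the kind of irreducible uncertainty the paragraph above describes.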