Many phenomena of interest are analyzed and controlled
through graphs or n-dimensional images. Often, these graphs have
an *irregular aspect*, whether the studied phenomenon is of natural
or artificial origin. In the first class, one may cite
natural landscapes, most biological signals and images (EEG, ECG, MR images, ...),
and temperature records. In the second class, prominent examples include financial logs and TCP traces.

Such irregular phenomena are usually not adequately described by purely deterministic models, and a probabilistic ingredient is often added. Stochastic processes make it possible to take into account, with a firm theoretical basis, the numerous microscopic fluctuations that shape the phenomenon.

In general, it is wrong to believe that
irregularity is a mere epiphenomenon, conveniently
dealt with by introducing randomness. In many situations, and
in particular in some of the examples
mentioned above, irregularity is a core
ingredient that cannot be removed without destroying the
phenomenon itself. In some cases, irregularity is even a
necessary condition for proper functioning.
A striking example is that of the ECG: an ECG is inherently irregular and, moreover, in a mathematically precise
sense, an *increase* in its regularity is strongly correlated with a *degradation* of the heart's condition.

In fact, in various situations, irregularity is a crucial feature that can be used
to assess the behaviour of a given system. For instance,
irregularity may be the result of two or more sub-systems that
act in a concurrent way to achieve some kind of equilibrium.
Examples of this abound in nature
(*e.g.* the sympathetic and parasympathetic systems in the regulation of the heart). For artifacts, such as financial logs and TCP traffic, irregularity is in a sense
an unwanted feature, since it typically makes regulation more complex. It is,
however, again a necessary one. For instance, efficiency in financial markets requires a constant flow of information among agents, which manifests itself
through permanent fluctuations of the prices: irregularity just reflects the evolution of this information.

The aim of *Regularity* is to develop a coherent set of methods for modeling such “essentially
irregular” phenomena, in view of managing the uncertainties entailed by their irregularity.

Indeed, essential irregularity makes phenomena more difficult to study in terms of their description,
modeling, prediction and control. It introduces *uncertainties* both in
the measurements and the dynamics. It is, for instance, obviously easier to predict the short
time behaviour of a smooth (*e.g.* differentiable) process than that of a highly irregular one.


The modeling of essentially irregular phenomena is an important challenge, with an emphasis on understanding the sources and functions of this irregularity. Probabilistic tools are well-adapted to this task, provided one can design stochastic models for which the regularity can be measured and controlled precisely. Two points deserve special attention:

First, the study of regularity has to be *local*. Indeed, in most applications, one will want to act on a system based on local temporal or spatial information. For instance, detection of arrhythmias in ECG
or of crashes in financial markets should be performed in “real time”, or, even better, ahead of time. In this sense, regularity is a *local* indicator of the *local* health of a system.

Second, although we have used the term “irregularity” in a generic and somewhat vague sense, it seems obvious that, in real-world phenomena, regularity comes in many colours, and a rigorous analysis should distinguish between them. As an example, at least two kinds of irregularity are present in financial logs: the local “roughness” of the records, and the local density and height of jumps. These correspond to two different concepts of regularity (in technical terms, Hölder exponents and local index of stability), and they each contribute in a different manner to financial risk.

In view of the above, the *Regularity* team focuses on the design of methods that:

define and study precisely various relevant measures of local regularity,

make it possible to build stochastic models versatile enough to mimic the rapid variations of the different kinds of regularity observed in real phenomena,

make it possible to estimate these regularities as precisely and rapidly as possible, so as to alert systems in charge of control.

Our aim is to address the three items above through the design of mathematical tools in the field of probability (and, to a lesser extent, statistics), and to apply these tools to uncertainty management as described in the following section. We note here that we do not intend to address the problem of controlling phenomena based on regularity, which would naturally constitute an item 4 in the list above. Indeed, while we strongly believe that generic tools may be designed to measure and model regularity, and that these tools may be used to analyze real-world applications, in particular in the field of uncertainty management, it is clear that, when it comes to control, application-specific tools are required, which we do not wish to address.

The research topics of the *Regularity* team can be roughly divided into two strongly interacting axes, corresponding to two complementary ways of studying regularity:

development of tools to characterize, measure and estimate various notions of local regularity, with a particular emphasis on the stochastic frame,

definition and fine analysis of stochastic models for which some aspects of local regularity may be prescribed.

These two aspects are detailed in the sections below.

**Fractional Dimensions**

Although the main focus of our team is on characterizing *local*
regularity, on occasion, it is interesting to use a *global*
index of regularity. Fractional dimensions provide such an index.
In particular, the *regularization dimension*, which was defined
in , is well adapted to the study of stochastic processes, as its
definition makes it easy to build robust estimators.
Since its introduction, regularization dimension has been used by various teams
worldwide in many different applications including the characterization of certain stochastic
processes, statistical estimation,
the study of mammographies or galactograms for breast
carcinomas detection,
ECG analysis for the study of ventricular arrhythmia,
encephalitis diagnosis from EEG, human skin analysis,
discrimination between the nature of radioactive contaminations,
analysis of porous media textures,
well-logs data analysis,
agro-alimentary image analysis, road profile analysis, remote sensing,
mechanical systems assessment, analysis of video games, ...(see http://
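As a rough illustration of how the regularization dimension lends itself to simple estimation (the Gaussian kernel, the scale range and the thresholds below are our own arbitrary choices, not those of the original definition or of FracLab), one may smooth a sampled path at decreasing scales, measure the length of each regularized graph, and regress in log-log coordinates:

```python
import numpy as np

def regularization_dimension(y, dt, widths):
    """Estimate the regularization dimension of the graph of a sampled signal.

    The signal is smoothed with Gaussian kernels of decreasing width; the
    lengths of the regularized graphs behave like scale**(1 - d), so d is
    recovered from the slope of log(length) versus log(scale)."""
    lengths = []
    for w in widths:
        x = np.arange(-4 * w, 4 * w + 1)
        kern = np.exp(-0.5 * (x / w) ** 2)
        kern /= kern.sum()
        smooth = np.convolve(y, kern, mode="same")
        lengths.append(np.sum(np.hypot(dt, np.diff(smooth))))
    slope = np.polyfit(np.log(np.array(widths) * dt), np.log(lengths), 1)[0]
    return 1.0 - slope

rng = np.random.default_rng(0)
n = 2 ** 14
brownian = np.cumsum(rng.standard_normal(n)) / np.sqrt(n)   # graph dimension 1.5
smooth_sig = np.sin(2 * np.pi * np.arange(n) / n)            # graph dimension 1.0
scales = [4, 8, 16, 32, 64, 128]
dim_bm = regularization_dimension(brownian, 1.0 / n, scales)
dim_smooth = regularization_dimension(smooth_sig, 1.0 / n, scales)
```

On these two test signals, the estimate is close to the known graph dimensions (about 1.5 for a Brownian path, 1 for a smooth curve), which is the kind of robustness that makes the index usable on noisy data.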

**Hölder exponents**

The simplest and most popular measures of local
regularity are the pointwise
and local Hölder exponents. For a stochastic process $X$ with continuous trajectories, the pointwise Hölder exponent at $t_0$ is defined as

$$\alpha_X(t_0, \omega) = \sup\left\{ \alpha \geq 0 : \limsup_{h \to 0} \frac{|X_{t_0+h}(\omega) - X_{t_0}(\omega)|}{|h|^{\alpha}} = 0 \right\}$$

and the local Hölder exponent as

$$\tilde{\alpha}_X(t_0, \omega) = \sup\left\{ \alpha \geq 0 : \limsup_{\rho \to 0}\ \sup_{t, u \in B(t_0, \rho)} \frac{|X_t(\omega) - X_u(\omega)|}{|t - u|^{\alpha}} < \infty \right\}.$$

Although these quantities are in general random, we will omit, as is customary,
the dependency in $\omega$.

The random functions $t \mapsto \alpha_X(t)$ and $t \mapsto \tilde{\alpha}_X(t)$ are called the pointwise and local Hölder functions of $X$.

The pointwise Hölder exponent is a very versatile
tool, in the sense that the set of pointwise Hölder functions of
continuous functions is quite large (it coincides with the set of
lower limits of sequences of continuous functions ). In this sense,
the pointwise exponent is often a more precise tool
(*i.e.* it varies in a more rapid way)
than the local one, since local Hölder functions are always lower semi-continuous.
This is why, in particular, it is
the exponent that is used as a basic ingredient in multifractal
analysis (see below). For certain classes of stochastic
processes, and most notably Gaussian processes, it has the remarkable
property that, at each point, it assumes an almost sure value .
SRP, mBm, and processes of this kind (see below) rely on the sole use
of the pointwise Hölder exponent for prescribing the regularity.

The pointwise exponent, however, also has drawbacks. In particular, it is
not stable under integro-differentiation, which sometimes makes
its use complicated in applications. Here again, the local exponent provides
a useful complement to the pointwise one.

Both exponents have proved useful in various applications, ranging from image denoising and segmentation to TCP traffic characterization. Applications require precise estimation of these exponents.
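A minimal sketch of such an estimation (window radii, sampling and regression range are arbitrary choices, not a validated estimator): the oscillation of the path in shrinking balls around a point is regressed against the radius in log-log coordinates, the slope approximating the pointwise exponent.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2 ** 14
path = np.cumsum(rng.standard_normal(n)) / np.sqrt(n)   # Brownian motion: exponent 1/2 everywhere

def pointwise_holder(y, i, radii):
    """Estimate the pointwise Hölder exponent at sample i from the
    oscillation (max - min) of y in balls B(i, r) of shrinking radius."""
    osc = [y[i - r:i + r + 1].max() - y[i - r:i + r + 1].min() for r in radii]
    return np.polyfit(np.log(radii), np.log(osc), 1)[0]

radii = [8, 16, 32, 64, 128, 256]
estimates = [pointwise_holder(path, i, radii) for i in range(512, n - 512, 61)]
median_exponent = float(np.median(estimates))
```

For Brownian motion the pointwise exponent is almost surely 1/2 at every point, and the median of the pointwise estimates recovers that value; individual estimates are noisy, which illustrates why precise estimation is a research topic in itself.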

**Stochastic 2-microlocal analysis**

Neither the pointwise nor the local exponents give a complete characterization of the local regularity, and, although their joint use somewhat improves the situation, it is far from yielding the complete picture.

A fuller description of local regularity is provided by the
so-called *2-microlocal analysis*, introduced by J.M. Bony
. In this frame, regularity
at each point is now specified by two indices, which makes the analysis
and estimation tasks more difficult. More precisely,
a function belongs to the *2-microlocal space* $C^{s,s'}_{x_0}$ when its wavelet coefficients display a decay governed by the couple of indices $(s, s')$. The set of all couples $(s, s')$ such that the function belongs to $C^{s,s'}_{x_0}$ is delimited by a curve, the *2-microlocal frontier*, which defines the *2-microlocal
spectrum* at $x_0$. This spectrum provides a wealth of information on the local
regularity.

In , we have laid some foundations for a stochastic version of 2-microlocal analysis. We believe this will provide a fine analysis of the local regularity of random processes in a direction different from the one detailed for instance in . We have defined random versions of the 2-microlocal spaces, and given almost sure conditions for continuous processes to belong to such spaces. More precise results have also been obtained for Gaussian processes. A preliminary investigation of the 2-microlocal behaviour of Wiener integrals has been performed.

**Multifractal analysis of stochastic processes**

A direct use of the local regularity is often fruitful in applications.
This is for instance the case in RR analysis or terrain
modeling. However, in some situations,
it is interesting to supplement or replace it by a more global
approach known as *multifractal analysis* (MA). The idea behind
MA is to group together all points with same regularity (as measured
by the pointwise Hölder exponent) and to measure the “size” of
the sets thus obtained , , . There are mainly two ways to do so, a geometrical
and a statistical one.

In the geometrical approach, one defines the
*Hausdorff multifractal spectrum* of a process or function $X$ as

$$f_h(\alpha) = \dim_H \{ t : \alpha_X(t) = \alpha \},$$

where $\dim_H$ denotes the Hausdorff dimension.

The statistical path to MA is based on the so-called
*large deviation multifractal spectrum*:

$$f_g(\alpha) = \lim_{\varepsilon \to 0} \limsup_{n \to \infty} \frac{\log N_n^{\varepsilon}(\alpha)}{\log n},$$

where:

$$N_n^{\varepsilon}(\alpha) = \#\left\{ k : \alpha - \varepsilon \leq \alpha_n^k \leq \alpha + \varepsilon \right\}$$

and $\alpha_n^k$ is the coarse-grained exponent corresponding to the interval $I_n^k = [k/n, (k+1)/n)$, *i.e.*:

$$\alpha_n^k = \frac{\log |Y_n^k|}{-\log n}.$$

Here, $Y_n^k$ is some quantity measuring the variation of $X$ in the interval $I_n^k$, such as its increment or its oscillation.

The large deviation spectrum is typically easier to compute and to estimate than the Hausdorff one. In addition, it often gives more relevant information in applications.

Under very mild conditions (*e.g.*, if
the support of $f_g$ is bounded), $f_g$ may be estimated through an auxiliary quantity, the *Legendre multifractal spectrum*. To do so,
one basically interprets the coarse-grained exponents in the frame of large deviation theory, and introduces the partition function

$$\tau(q) = \liminf_{n \to \infty} \frac{\log \sum_{k} |Y_n^k|^{q}}{-\log n},$$

with the convention $|Y_n^k|^q = 0$ whenever $Y_n^k = 0$.

The Legendre multifractal spectrum of $X$ is then defined as the Legendre transform of $\tau$:

$$f_l(\alpha) = \inf_{q} \left( q\alpha - \tau(q) \right).$$

To see the relation between $f_g$ and $f_l$, note that $f_l$ is always concave and, under mild assumptions, coincides with the concave hull of $f_g$. One says that the *weak multifractal
formalism* holds when $f_g = f_l$, and that the *strong
multifractal formalism* holds when, in addition, $f_h = f_g = f_l$.

Multifractal spectra subsume a lot of information about the distribution of the regularity, which has proved useful in various situations. A most notable example is the strong correlation reported in several recent works between the narrowing of the multifractal spectrum of ECG and certain pathologies of the heart , . Let us also mention the multifractality of TCP traffic, which has been both observed experimentally and proved on simplified models of TCP , .
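The chain from coarse-grained exponents to partition function to Legendre transform can be sketched numerically. The script below is a toy illustration rather than a validated estimator (scale range, grid of moments and path length are arbitrary choices); it recovers the expected spectrum of Brownian motion, which peaks at exponent 1/2 with value 1:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 2 ** 15
path = np.cumsum(rng.standard_normal(N)) / np.sqrt(N)    # Brownian: tau(q) = q/2 - 1

qs = np.linspace(-0.5, 4.0, 19)
js = np.arange(4, 13)                                    # n = 2**j subintervals
log_n = js * np.log(2.0)
logS = np.empty((len(qs), len(js)))
for c, j in enumerate(js):
    inc = np.abs(np.diff(path[::N >> j]))                # increments at scale 2**-j
    inc = inc[inc > 0]
    logS[:, c] = [np.log(np.sum(inc ** q)) for q in qs]

# partition function: log S_n(q) ~ -tau(q) * log n
tau = np.array([-np.polyfit(log_n, logS[r], 1)[0] for r in range(len(qs))])

# Legendre transform of tau
alphas = np.linspace(0.1, 1.0, 91)
f_l = np.array([np.min(qs * a - tau) for a in alphas])
peak = int(np.argmax(f_l))
```

For Brownian motion, `tau(q) = q/2 - 1`, so the Legendre spectrum is maximal at `alpha = 1/2` with `f_l = 1`, which the estimated `alphas[peak]` and `f_l[peak]` reproduce up to sampling noise.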

**Another colour in local regularity: jumps**

As noted above, apart from Hölder exponents and their generalizations,
at least one other type of irregularity may sometimes be observed in
certain real phenomena: discontinuities, which occur for instance
in financial logs and certain biomedical signals. In this frame, it is of
interest to supplement Hölder exponents and their extensions with (at least) one additional
index that measures the local intensity and size of jumps. This is a topic we
intend to pursue in full generality in the near future. So far, we have developed an approach
in the particular frame of *multistable processes*. We refer to section
for more details.

The second axis in the theoretical developments of the *Regularity* team aims at defining and studying stochastic processes for which various aspects of the local regularity may be prescribed.

**Multifractional Brownian motion**

One of the simplest stochastic processes for which some kind of control over the Hölder exponents is possible is probably fractional Brownian motion (fBm). This process was defined by Kolmogorov and further studied by Mandelbrot and Van Ness, followed by many authors. The so-called “moving average” definition of fBm reads as follows:

$$B_H(t) = \int_{-\infty}^{0} \left[ (t-u)^{H - 1/2} - (-u)^{H - 1/2} \right] dW(u) + \int_{0}^{t} (t-u)^{H - 1/2}\, dW(u),$$

where $W$ is a standard Brownian motion, $H \in (0,1)$ is the Hurst parameter, and a normalizing constant is omitted.

Although varying $H$ yields processes with different regularities, the regularity of a given fBm is everywhere the same along its trajectory: at each point, the pointwise Hölder exponent is almost surely equal to $H$. This constancy is a strong limitation for the modeling of phenomena whose regularity evolves in time or space.

It is possible to generalize fBm to obtain a Gaussian process for which the pointwise Hölder exponent
may be tuned at each point: the *multifractional Brownian motion (mBm)* is such
an extension, obtained by substituting the constant parameter $H$ with a *regularity function* $h : \mathbb{R}_+ \to (0,1)$.

mBm was introduced independently by two groups of authors:
on the one hand, Peltier and Lévy Véhel defined the mBm by replacing $H$ with $h(t)$ in the moving average representation of fBm above.

On the other hand, Benassi, Jaffard and Roux defined the mBm from the harmonizable representation of the
fBm, *i.e.*:

$$X(t) = \int_{\mathbb{R}} \frac{e^{it\xi} - 1}{|\xi|^{h(t) + 1/2}}\, d\widehat{W}(\xi),$$

where $\widehat{W}$ denotes the Fourier transform of the white noise $W$.

The Hölder exponents of the mBm are prescribed almost surely:
at each point $t$, the pointwise Hölder exponent is almost surely equal to $h(t)$, provided $h$ is itself more regular than the exponents it prescribes.

The fact that the local regularity of mBm
may be tuned *via* a functional parameter has made it a useful
model in various areas such as finance, biomedicine,
geophysics, image analysis, ....
A large number of studies have been devoted worldwide to its mathematical properties,
including in particular its local time. In addition,
there is now a rather strong body of work dealing with the estimation of its
functional parameter, *i.e.* its local regularity. See http://
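As an illustration, fBm itself can be synthesized exactly on a grid by factorizing its covariance; the snippet below uses a naive Cholesky approach (grid size, Hurst value and the small diagonal jitter are arbitrary choices, and far more efficient synthesis algorithms exist) and checks the prescribed variance $E[B_H(t)^2] = t^{2H}$:

```python
import numpy as np

def fbm_paths(H, n=256, n_paths=400, seed=3):
    """Exact synthesis of fractional Brownian motion on a regular grid
    of (0, 1], via a Cholesky factorization of the fBm covariance."""
    t = np.arange(1, n + 1) / n
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s ** (2 * H) + u ** (2 * H) - np.abs(s - u) ** (2 * H))
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))   # jitter for numerical safety
    rng = np.random.default_rng(seed)
    return t, L @ rng.standard_normal((n, n_paths))

t, X = fbm_paths(0.7)
var_end = float(np.var(X[-1]))                 # E[B_H(1)^2] = 1
var_mid = float(np.var(X[len(t) // 2 - 1]))    # E[B_H(0.5)^2] = 0.5**1.4 ~ 0.379
```

A crude mBm approximation can then be obtained by patching together, at each time $t$, the field value corresponding to exponent $h(t)$, which is essentially the field-based construction alluded to in the multistable section below.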

**Self-regulating processes**

We have recently introduced another class of stochastic models, inspired by mBm,
but where the local regularity, instead of being tuned “exogenously”, is
a function of the amplitude. In other words, at each point $t$, the pointwise Hölder exponent of the process $X$ verifies almost surely $\alpha_X(t) = g(X(t))$, where $g$ is a fixed deterministic function: such a process is called a *self-regulating process* (SRP).
The particular process obtained by adequately adapting mBm is called
the self-regulating multifractional process . Another instance is given by
modifying the Lévy construction of Brownian motion .
The motivation for introducing self-regulating processes is based on the following general fact: in nature, the local regularity of a phenomenon is often related to its amplitude.
An intuitive example is provided by natural terrains: in young mountains, regions
at higher altitudes are typically more irregular than regions at lower altitudes.
We have verified this fact experimentally on several digital elevation models
. Other natural phenomena displaying a relation between
amplitude and exponent include temperatures
records and RR intervals extracted from ECG .

To build the SRMP, one starts from a field of fractional Brownian motions $B(t, H)$, indexed by time $t$ and by the Hurst exponent $H$. One then considers the operator which maps a process $Z$ to the process $t \mapsto B(t, g(Z(t)))$, composed with an affine rescaling that keeps the values within a fixed interval. Under adequate conditions, this operator is contractive, and the SRMP is obtained as its unique fixed point. The construction carries over to higher dimensions, allowing for instance the synthesis of two-dimensional SRMP.

We believe that SRPs open a whole new and very promising area of research.
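A toy version of the modified Lévy construction conveys the idea: at each dyadic refinement level $n$, the midpoint displacement is scaled by $2^{-n\,g(\text{current value})}$ for a prescribed function $g$, so that high-amplitude regions become locally smoother when $g$ is increasing. This is only a heuristic sketch (the function `g` and all tuning choices below are ours), not the actual SRMP fixed-point construction:

```python
import numpy as np

def self_regulating_path(g, depth=12, seed=4):
    """Toy self-regulating signal via a modified midpoint-displacement
    (Lévy-style) construction: the displacement added at refinement
    level n is scaled by 2**(-n * g(value)), so the local roughness
    tracks g(amplitude)."""
    rng = np.random.default_rng(seed)
    z = np.zeros(2)
    for level in range(1, depth + 1):
        mid = 0.5 * (z[:-1] + z[1:])
        mid = mid + rng.standard_normal(mid.size) * 2.0 ** (-level * g(mid))
        merged = np.empty(z.size + mid.size)
        merged[0::2] = z
        merged[1::2] = mid
        z = merged
    return z

# an increasing g: higher amplitude -> larger exponent -> locally smoother
g = lambda x: 0.2 + 0.6 / (1.0 + np.exp(-3.0 * x))
z = self_regulating_path(g)
fine = np.abs(np.diff(z))
rough_hi = float(fine[z[:-1] > np.median(z)].mean())   # fine increments, high-amplitude half
rough_lo = float(fine[z[:-1] <= np.median(z)].mean())  # fine increments, low-amplitude half
```

On the synthesized path, fine-scale increments are markedly smaller where the amplitude is high, which is precisely the qualitative behaviour reported for young mountain terrains and RR intervals.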

**Multistable processes**

Non-continuous phenomena are commonly encountered in real-world
applications, *e.g.* financial records or EEG traces.
For such processes, the information brought
by the Hölder exponent must be supplemented by some measure of
the density and size of jumps. Stochastic processes with jumps,
and in particular Lévy processes, are currently an active area of research.

The simplest class of non-continuous Lévy processes is maybe that
of stable processes . These are mainly characterized by a scalar parameter
$\alpha \in (0, 2]$, usually called the *stability index*: $\alpha = 2$ corresponds to the Gaussian case, and the smaller $\alpha$, the more frequent and intense the jumps.

In line with our quest for the characterization and modeling of
various notions of local regularity, we have defined *multistable processes*.
These are processes which are
“locally” stable, but where
the stability index $\alpha$ is allowed to vary in time.

More formally, a multistable process $Y$ is a process which is,
at each time $t$, *tangent* to a stable process $Y'_t$ with index $\alpha(t)$, in the sense that

$$\lim_{r \to 0^+} \frac{Y(t + ru) - Y(t)}{r^{h(t)}} = Y'_t(u),$$

where the limit is understood either in finite dimensional
distributions or in the stronger sense of distributions.
Note that the tangent process is in general different at each time $t$.

One approach to defining multistable processes is similar to the one
developed for constructing mBm : we consider fields of stochastic processes $X(t, u)$, where, for each fixed $u$, $t \mapsto X(t, u)$ is a stable process with constant stability index $\alpha(u)$, and we obtain a multistable process by taking the “diagonal” $Y(t) = X(t, t)$.

A particular class of multistable processes, termed
“linear multistable multifractional
motions” (lmmm), is obtained in this way from a field of linear fractional stable motions , .

In fact, lmmm are somewhat more general than said above:
indeed, the couple of functions $(h(t), \alpha(t))$ may be prescribed simultaneously, so that both the local Hölder regularity and the local intensity of jumps are tuned at each point.
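To give a concrete feel for the role of the stability index, the standard Chambers-Mallows-Stuck sampler below draws symmetric $\alpha$-stable variables and contrasts the tails for a low and a high index. The final "multistable-flavoured" path, in which the index drifts along time, is only a naive illustration of the idea, not one of the rigorous constructions above:

```python
import numpy as np

def sym_stable(alpha, size, rng):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable variables."""
    U = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * U) / np.cos(U) ** (1.0 / alpha)
            * (np.cos((1.0 - alpha) * U) / W) ** ((1.0 - alpha) / alpha))

rng = np.random.default_rng(5)
heavy = sym_stable(0.8, 20000, rng)   # low index: frequent large jumps
mild = sym_stable(1.8, 20000, rng)    # high index: close to Gaussian
tail_heavy = int(np.sum(np.abs(heavy) > 5))
tail_mild = int(np.sum(np.abs(mild) > 5))

# toy path whose stability index drifts from 0.8 to 1.8 along time
alphas = np.linspace(0.8, 1.8, 4096)
steps = np.array([sym_stable(a, 1, rng)[0] for a in alphas])
toy = np.cumsum(steps * 4096 ** -0.5)
```

Since stable tails decay like $x^{-\alpha}$, large excursions are far more frequent for the low index, and the toy path visibly shifts from jumpy to diffusive behaviour along time.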

**Multiparameter processes**

In order to use stochastic processes to represent the variability of multidimensional phenomena, it is necessary to define extensions for indices in $\mathbb{R}^N$ ($N \geq 2$).

These works have highlighted the difficulty of giving satisfactory definitions for increment stationarity, Hölder continuity and covariance structure which do not depend closely on the structure of the indexing space.

A promising improvement in the definition of multiparameter extensions is the concept of *set-indexed processes*. A set-indexed process is a process whose indices are no longer “times” or “locations” but may be some compact connected subsets of a metric measure space. In the simplest case, this framework is a generalization of the classical multiparameter processes : usual multiparameter processes are set-indexed processes where the indexing subsets are simply the rectangles $[0, t]$ with $t \in \mathbb{R}^N_+$.

Set-indexed processes allow for greater flexibility, and should in particular be useful for the modeling of censored data. This situation occurs frequently in biology and medicine, since, for instance, data may not be constantly monitored. Censored data also appear in natural terrain modeling when data are acquired from sensors in presence of hidden areas. In these contexts, set-indexed models should constitute a relevant frame.

A set-indexed extension of fBm is the first step toward the modeling of
irregular phenomena within this more general frame. In , the so-called *set-indexed fractional Brownian motion (sifBm)* was defined as the mean-zero Gaussian process $\{\mathbf{B}^H_U\}_U$ with covariance

$$E\big[ \mathbf{B}^H_U\, \mathbf{B}^H_V \big] = \frac{1}{2} \left[ m(U)^{2H} + m(V)^{2H} - m(U \bigtriangleup V)^{2H} \right],$$

where $m$ is a measure on the underlying space and $\bigtriangleup$ denotes the symmetric difference between sets.

This process appears to be the only set-indexed process whose projection on increasing paths is a one-parameter fractional Brownian motion .
The construction also provides a way to define fBm's extensions on non-euclidean spaces, *e.g.* indices can belong to the unit hyper-sphere of $\mathbb{R}^N$.

In the specific case where the indexing collection is the family of rectangles of $\mathbb{R}^N$, the sifBm is called *multiparameter fractional Brownian motion (MpfBm)*. This process differs from the Lévy fractional Brownian motion and the fractional Brownian sheet, which are also multiparameter extensions of fBm (but do not derive from set-indexed processes).
The local behaviour of the sample paths of the MpfBm, as well as its self-similarity properties, have been studied in .
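The sifBm covariance is easy to probe numerically. With $m$ the Lebesgue measure and indexing rectangles $[0,t_1] \times [0,t_2]$ (the MpfBm case), the sketch below, whose grid and Hurst value are arbitrary choices, checks that the resulting covariance matrix is positive semi-definite when $H \leq 1/2$, the range for which the set-indexed construction is known to make sense in general:

```python
import numpy as np

H = 0.4                      # H <= 1/2 ensures positive definiteness here
pts = [(a / 4.0, b / 4.0) for a in range(1, 5) for b in range(1, 5)]

def m_rect(p):
    """Lebesgue measure of the rectangle [0, p1] x [0, p2]."""
    return p[0] * p[1]

def m_symdiff(p, q):
    """Measure of the symmetric difference of two such rectangles."""
    inter = min(p[0], q[0]) * min(p[1], q[1])
    return m_rect(p) + m_rect(q) - 2.0 * inter

cov = np.array([[0.5 * (m_rect(p) ** (2 * H) + m_rect(q) ** (2 * H)
                        - m_symdiff(p, q) ** (2 * H)) for q in pts] for p in pts])
min_eig = float(np.linalg.eigvalsh(cov).min())
```

In dimension one, $m([0,s] \bigtriangleup [0,t]) = |t - s|$, so the same formula reduces to the standard fBm covariance, consistent with the projection property mentioned above.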

The increment stationarity property for set-indexed processes, previously defined in the study of the sifBm, allows one to consider set-indexed processes whose increments are independent and stationary. This generalizes the definition of Bass-Pyke and Adler-Feigin for Lévy processes indexed by subsets of $\mathbb{R}^N$ .

Our theoretical works are motivated by and find natural applications to real-world problems in a general frame generally referred to as uncertainty management, that we describe now.

Over the past few decades, modeling has played an increasingly important part in complex systems design in various fields of industry, such as automobile, aeronautics and energy. Industrial design involves several levels of modeling: from behavioural models in preliminary design to finite-element models aiming at a sharp representation of physical phenomena. Nowadays, the fundamental challenge of numerical simulation is to design physical systems while reducing the recourse to experimentation.

As an example, at the early stage of conception in aeronautics, numerical simulation aims at exploring the design parameters space and setting the global variables such that target performances are satisfied. This iterative procedure needs fast multiphysical models. These simplified models are usually calibrated using high-fidelity models or experiments. At each of these levels, modeling requires control of uncertainties due to simplifications of models, numerical errors, data imprecisions, variability of surrounding conditions, etc.

One dilemma of design by numerical simulation is that many crucial choices are made very early, when uncertainties are at their maximum, and that these choices have a fundamental impact on the final performances.

Classically, coping with this variability is achieved through *model registration* by experimenting and adding fixed *margins* to the model response.
In view of technical and economical performance, it appears judicious to replace these fixed margins by a rigorous analysis and control of risk. This may be achieved through a probabilistic approach to uncertainties, that provides decision criteria adapted to the management
of unpredictability inherent to design issues.

From the particular case of aircraft design emerge several general aspects of the management of uncertainties in simulation. Probabilistic decision criteria, which translate decision making into mathematical/probabilistic terms, require the following three steps to be considered:

build a probabilistic description of the fluctuations of the model's parameters (*Quantification* of uncertainty sources),

deduce the implication of these distribution laws on the model's response (*Propagation* of uncertainties),

and determine the specific influence of each uncertainty source on the model's response variability (*Sensitivity Analysis*).
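The three steps can be illustrated on a toy model (the model, the input laws and the sample sizes below are placeholders for a real simulation chain); the sensitivity step uses a standard pick-freeze estimator of first-order Sobol indices:

```python
import numpy as np

def model(x1, x2):
    # toy "phenomenological model" standing in for an expensive simulation
    return x1 + 2.0 * x2

rng = np.random.default_rng(6)
n = 200000
x1 = rng.uniform(0, 1, n)            # Quantification: input distributions
x2 = rng.uniform(0, 1, n)
y = model(x1, x2)                    # Propagation: Monte Carlo sampling of the response
var_y = float(y.var())

def sobol_first(y, y_frozen):
    """Pick-freeze estimator of a first-order Sobol index:
    Var(E[y | x_i]) / Var(y), with y_frozen sharing only x_i with y."""
    return float((np.mean(y * y_frozen) - np.mean(y) ** 2) / y.var())

S1 = sobol_first(y, model(x1, rng.uniform(0, 1, n)))   # Sensitivity to x1
S2 = sobol_first(y, model(rng.uniform(0, 1, n), x2))   # Sensitivity to x2
```

For this linear model with uniform inputs, the exact indices are $S_1 = 0.2$ and $S_2 = 0.8$, which the estimator recovers; in a real design study the same scheme is applied to the (reduced-order) simulation code.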

The previous analysis now constitutes the framework of a general study of uncertainties. It is used in industrial contexts where uncertainties can be represented by *random variables* (unknown temperature of an external surface, physical quantities of a given material, ... at a given *fixed time*). However, in order for numerical models to describe a phenomenon with high fidelity, the relevant uncertainties must generally depend on time or space variables.
Consequently, one has to tackle the following issues:

*How to capture the distribution law of time (or space) dependent parameters,
without directly accessible data?*
The probability distribution of the continuous-time (or space) uncertainty sources must describe the links between variations at neighbouring times (or points).
The local and global regularity are important parameters of these laws, since they describe how the fluctuations at some time (or point) induce fluctuations at close times (or points).
The continuous equations representing the studied phenomena should help *to propose models for the law of the random fields*.
Let us notice that interactions between various levels of modeling might also be used to derive distributions of probability at the lowest one.

The navigation between the various natures of models needs a kind of *metric* which could *mathematically describe the notion of granularity or fineness* of the models.
Of course, the local regularity will not be totally absent from this mathematical definition.

All the various levels of conception, preliminary design or high-fidelity modelling, require *registrations by experimentation* to reduce model errors.
This *calibration* issue has been present in this frame for a long time, especially in a deterministic optimization context. The random modeling of uncertainty requires the definition of a systematic approach.
The difficulty in this specific context lies in statistical estimation with few data, and in the estimation of a function of continuous variables from a discrete set of values.

Moreover, a multi-physical context must be added to these questions. Complex system design most often sits at the interface between several disciplines. In that case, modeling relies on a coupling between several models of the various phenomena, and design becomes a *multidisciplinary optimization* problem. In this uncertainty context, the real challenge becomes robust optimization, in order to manage technical and economical risks (risk of non-satisfaction of technical specifications, cost control).

We participate in the uncertainties community through several collaborative research projects (ANR and Pôle SYSTEM@TIC), and also through our involvement in the MASCOT-NUM research group (GDR of CNRS). In addition, we are considering probabilistic models as phenomenological models to cope with uncertainties in the DIGITEO ANIFRAC project. As explained above, we focus on essentially irregular phenomena, for which irregularity is a relevant quantity to capture the variability (e.g. certain biomedical signals, terrain modeling, financial data, etc.). These will be modeled through stochastic processes with prescribed regularity.

The design of a complex (mechanical) system such as an aircraft, an automobile or a nuclear plant involves numerical simulation of several interacting physical phenomena: CFD and structural dynamics, thermal evolution of a fluid circulation, etc. For instance, these can represent the resolution of coupled partial differential equations using the finite element method. In the framework of uncertainty treatment, the studied “phenomenological model” is a chaining of different models representing the various physical phenomena involved. As an example, the pressure field on an aircraft wing results from both aerodynamic and structural mechanical phenomena. Let us consider the particular case of two models of partial differential equations coupled by limit conditions. The direct propagation of uncertainties is impossible, since it requires an exploration and hence many calls to costly models. As a solution, engineers usually build reduced-order models: the complex high-fidelity model is substituted with a computationally less costly one. The uncertainty propagation is then realized through the simplified model, taking into account the approximation error (see ).

Interactions between the various models are usually made explicit at the finest level (cf. Fig. ). How may this coupling be formulated when the fine structures of exchange have disappeared during model reduction? How can the interactions between models at different levels (in a multi-level modeling) be expressed? The ultimate question would be: how to choose the right level of modeling with respect to performance requirements?

In the multi-physical numerical simulation, two kinds of uncertainties then coexist: the uncertainty due to substitution of high-fidelity models with approximated reduced-order models, and the uncertainty due to the new coupling structure between reduced-order models.

According to the previous discussion, the treatment of uncertainty in a multi-physical and multi-level modeling involves a large range of issues, for instance the numerical resolution of PDE (which does not enter into the research topics of *Regularity* ). Our goal is to contribute to the theoretical arsenal that makes it possible to navigate among the different levels of modeling (and thus among the existing numerical simulations).
We will focus on the following three axes:

In the case of a phenomenon represented by two coupled partial differential equations whose resolution is represented by reduced-order models, how to define a probabilistic model of the coupling errors? In connection with our theoretical development, we plan to characterize the regularity of this error in order to quantify its distribution. This research axis is supported by an ANR grant (OPUS project).

The multi-level modeling assumes the ability to choose the right level of details for the models in adequacy to the goals of the study. In order to do that, a rigorous mathematical definition of the notion of *model fineness/granularity* would be very helpful. Again, a precise analysis of the fine regularity of stochastic models is expected to give elements toward a precise definition of granularity.
This research axis is supported by a Pôle SYSTEM@TIC grant (EHPOC project), and also by a collaboration with EADS.

Some fine characteristics of the phenomenological model may be used to define the probabilistic behaviour of its variability. The act of modeling a phenomenon can be seen as an interpolation issue between given observations. This interpolation can be driven by physical evolution equations or by a fine analytical description of the physical quantities. We are convinced that Hölder regularity is an essential parameter in that context, since it captures how variations at a given point induce variations at its neighbours. Stochastic processes with prescribed regularity (see above) have already been used to represent various fluctuating phenomena: Internet traffic, financial data, ocean floors. We believe that these models should be relevant to describe solutions of PDE perturbed by uncertain (random) coefficients or limit conditions. This research axis is supported by a Pôle SYSTEM@TIC grant (CSDL project).

The preliminary design of complex systems can be described as an exploration process of a so-called design space, generated by the global parameters. An interactive exploration, with a decisional visualization goal, needs reduced-order models of the physical phenomena involved. We are convinced that the local regularity of phenomena is a relevant quantity to drive these approximated models. Roughly speaking, in order to be representative, a model needs more information where the fluctuations are larger (and consequently, where the irregularity is higher).

In collaboration with Dassault Aviation, EDF and EADS, we study how the local regularity can provide a good quantification of the concept of *granularity* of a model, in order to select the level of fidelity adapted to the required precision.

Our work in that field can be divided into:

The definition and the study of stochastic partial differential equations driven by processes with prescribed regularity (that do not enter into the classical theory of stochastic integration).

The study of the evolution of the local regularity inside stochastic partial differential equations (SPDE). The stochastic 2-microlocal analysis should provide information about the local regularity of the solutions, as a function of the coefficients of the equations. The knowledge of the fine behaviour of the solutions of SPDE will provide important information in view of numerical simulations.

**ECG analysis and modelling**

ECG and signals derived from them are an important source of
information in the detection of various pathologies, including *e.g.* congestive heart failure, arrhythmia and sleep apnea. The fact that the
irregularity of ECG bears some information on the condition of the heart
is well documented (see *e.g.* the web resource http://

First, we use refined regularity characterizations, such as the regularization dimension, 2-microlocal analysis and advanced multifractal spectra for a more precise analysis of ECG data. This requires in particular to test current estimation procedures and to develop new ones.

Second, we build stochastic processes that mimic in a faithful way some features of the dynamics of ECG. For instance, the local regularity of RR intervals, estimated in a parametric way based on a modeling by an mBm, displays correlations with the amplitude of the signal, a feature that seems to have remained unobserved so far . In other words, RR intervals behave as SRP. We believe that modeling in a simplified way some aspects of the interplay between the sympathetic and parasympathetic systems might lead to an SRP, and explain both this self-regulating property and the reasons behind the observed multifractality of the records. This will open the way to understanding how these properties evolve under abnormal behaviour.

**Pharmacodynamics and patient drug compliance**

Poor adherence to treatment is a worldwide problem that threatens
efficacy of therapy, particularly in the case of chronic
diseases. Compliance to pharmacotherapy can range from partial to full adherence,
whereas classical pharmacokinetic computations assume perfect compliance, *i.e.*, drugs are administered at fixed times and at a fixed
dosage. However, the drug concentration-time curve is often influenced
by the random drug input generated by poor patient adherence behaviour,
inducing erratic therapeutic outcomes. Following work already
started in Montréal , , we consider stochastic processes induced by
started in Montréal , , we consider stochastic processes induced by
taking into account the random drug intake induced by various
compliance patterns. Such studies have been made possible by
technological progress, such as the “medication event monitoring
system”, which allows to obtain data describing the behaviour of
patients.

We use different approaches to study this problem: statistical methods where enough data are available, and model-based ones in the presence of a qualitative description of patient behaviour. In the latter case, piecewise deterministic Markov processes (PDP) seem a promising path. PDP are non-diffusion processes whose evolution follows a deterministic trajectory governed by a flow between random time instants, at which it undergoes a jump according to some probability measure. There is a well-developed theory for PDP, which studies stochastic properties such as the extended generator, the Dynkin formula and long-time behaviour. It is easy to cast a simplified model of non-compliance in terms of PDP. This has already allowed us to obtain certain properties of interest of the random drug concentration. In the simplest case of a Poisson distribution, we have obtained rather precise results that also point to a surprising connection with infinite Bernoulli convolutions. Statistical aspects remain to be investigated in the general case.
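A minimal sketch of the PDP viewpoint (all parameter values here are illustrative, not a calibrated pharmacokinetic model): a one-compartment drug concentration follows the deterministic flow $c' = -k_e c$ between intake times and jumps by one dose at each intake, the intake being taken or skipped according to a Bernoulli compliance pattern:

```python
import numpy as np

def concentration_path(rng, n_days=30, dose=1.0, ke=0.3,
                       p_take=0.8, steps_per_day=24):
    # One-compartment model: between intake times the concentration follows
    # the deterministic flow c' = -ke * c (ke in 1/day); at each daily
    # intake time a dose is added with probability p_take (a Bernoulli
    # non-compliance pattern), which is the jump part of the PDP.
    dt = 1.0 / steps_per_day
    c, path = 0.0, []
    for _ in range(n_days):
        if rng.random() < p_take:
            c += dose
        for _ in range(steps_per_day):
            c *= np.exp(-ke * dt)
            path.append(c)
    return np.array(path)

rng = np.random.default_rng(1)
paths = np.stack([concentration_path(rng) for _ in range(200)])
mean_trough = paths[:, -1].mean()   # mean end-of-treatment trough level
full_compliance = concentration_path(rng, p_take=1.0)
```

Monte Carlo over many such paths gives the distribution of the random concentration, whose troughs sit below the perfect-compliance level in proportion to the missed doses.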

FracLab was developed for two main purposes:

to propose a general platform allowing research teams to avoid re-coding basic and advanced techniques for the processing of signals based on (local) regularity;

to provide state-of-the-art algorithms, both to disseminate new methods in this area and to allow results to be compared on a common basis.

FracLab is a general purpose signal and image processing toolbox based on fractal, multifractal and local regularity methods. FracLab can be approached from two different perspectives:

(multi-)fractal and local regularity analysis: a large number of procedures make it possible to compute various quantities associated with 1D or 2D signals, such as dimensions, Hölder and 2-microlocal exponents, or multifractal spectra.

Signal/Image processing: Alternatively, one can use FracLab directly to perform many basic tasks in signal processing, including estimation, detection, denoising, modeling, segmentation, classification, and synthesis.

A graphical interface makes FracLab easy to use and intuitive. In addition, various wavelet-related tools are available in FracLab.

FracLab is free software. It mainly consists of routines developed in MATLAB or of C code interfaced with MATLAB. It runs under Linux, MacOS and Windows. In addition, a “stand-alone” version (*i.e.* one which does not require MATLAB to run) is available.

FracLab has been downloaded several thousand times in recent years by users all around the world. A few dozen laboratories seem to use it regularly, and there are more than two hundred registered users. Our ambition is to make it the standard fractal software for signal and image processing applications, and we have signs that this is starting to become the case. To date, its use has been acknowledged in more than two hundred research papers in areas such as astrophysics, chemical engineering, financial modeling, fluid dynamics, internet and road traffic analysis, image and signal processing, geophysics, biomedical applications and computer science, as well as in mathematical studies in analysis and statistics (see http://

We have produced this year a major release of FracLab (version 2.1).

*In collaboration with Sylvain Corlay (Paris 6 University).*

We have considered the following model, which is an extension of the fractional Hull and White model proposed in : under the risk-neutral measure, the forward price of a risky asset is the solution of the S.D.E.

where

Using functional quantization techniques, it is possible to numerically compute implied forward start volatilities for this model. Using an adequate

*In collaboration with Prof. Ely Merzbach (Bar Ilan university, Israel).*

In , the class of set-indexed Lévy processes is considered using the stationarity property defined for the set-indexed fractional Brownian motion in .
The general Ivanoff-Merzbach framework makes it possible to consider standard properties of stochastic processes (e.g. martingale and Markov properties) in the set-indexed context.
Processes are indexed by a collection

A set-indexed process $X=\{{X}_{U};\phantom{\rule{0.277778em}{0ex}}U\in \mathcal{A}\}$ is a *set-indexed Lévy process* if the following conditions hold

the increments of

then

In contrast to the previous works of Adler and Feigin (1984) on the one hand, and Bass and Pyke (1984) on the other hand, the increment stationarity property makes it possible to obtain explicit expressions for the finite-dimensional distributions of a set-indexed Lévy process. From these, we obtained a complete characterization in terms of Markov properties.

Among the various definitions for Markov property of a SI process, we considered the

where $\mathcal{Q}$ is called a *transition system* if
the following conditions are satisfied:

For all

For all

A transition system

*spatially homogeneous* if for all

* $m$-homogeneous* if

i.e.

A set-indexed process $X$ is said to be *Markov* if

where

Balan-Ivanoff (2002) proved that any SI process with independent increments is a $\mathcal{Q}$-Markov process.

**Theorem**
*Let $X=\{{X}_{U};\phantom{\rule{0.277778em}{0ex}}U\in \mathcal{A}\}$ be a set-indexed process with definite increments. The two following assertions are equivalent:*

This result is strengthened in the following characterization of set-indexed Lévy processes as Markov processes with homogeneous transition systems.

**Theorem**
*Let $X=\{{X}_{U};\phantom{\rule{0.277778em}{0ex}}U\in \mathcal{A}\}$ be a set-indexed process with definite increments and satisfying the stochastic continuity property.*

*The two following assertions are equivalent:*

*Consequently, if $\mathcal{Q}$ is a transition system which is both spatially homogeneous and $m$-homogeneous, then there exists a set-indexed process $X$ which is a $\mathcal{Q}$-Markov process.*

where

**Theorem 0.1**
Let

Then, the local and pointwise Hölder exponents of

and if Assumption

Consequently, since the collection

A classical result states that any (multiparameter) stochastic process has a separable modification, thus ensuring the measurability property of the sample paths. We extend this result to set-indexed processes.

The indexing collection is assumed to be *second-countable*, i.e. there exists a countable subset

A process $X$ is said to be *separable* if there exists an at most countable set

This definition differs from the one found in , where the space is “linear”, in that this author considers the previous equation only when

**Theorem 0.2 (Doob's separability theorem)**
Any

If

and

Then, a process is said to be *$\mathcal{C}$-Markov* with respect to a filtration

for all

The

A

and

Similarly to the classical Markov theory, it is proved in that the initial distribution

Projections on elementary flows are Markovian;

Conditional independence of natural filtrations;

Strong Markov property.

The class of set-indexed Lévy processes defined and studied in offers examples of

where

Another non-trivial example of

where

In the particular case of multiparameter processes, corresponding to the indexing collection

This ongoing work focuses on the fine regularity of one-parameter Lévy processes. The main idea of this study is to use the framework of stochastic 2-microlocal analysis (introduced and developed in ) to refine the sample path results obtained in .

The latter describes entirely the multifractal spectrum of Lévy processes, i.e. the Hausdorff geometry of the level sets, or *iso-Hölder sets*, of

The multifractal spectrum is itself defined as the localized Hausdorff dimension of the previous sets, i.e.

where the Blumenthal-Getoor exponent
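For orientation, we recall as background the spectrum computed by Jaffard (1999) for a Lévy process $X$ without Brownian component and with Blumenthal-Getoor exponent $\beta \in (0,2)$ (the precise statement requires mild assumptions on the Lévy measure):

```latex
% Multifractal spectrum of a Levy process without Brownian component
% (Jaffard, 1999), with Blumenthal--Getoor exponent \beta \in (0,2):
d_X(h) \;=\; \dim_{\mathcal{H}} E_h \;=\;
\begin{cases}
  \beta h & \text{if } h \in [0, 1/\beta], \\
  -\infty & \text{if } h > 1/\beta,
\end{cases}
\qquad
E_h \;=\; \bigl\{ t \ge 0 : \alpha_X(t) = h \bigr\},
```

where $\alpha_X(t)$ denotes the pointwise Hölder exponent of $X$ at $t$.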

Since classical multifractal analysis focuses on the pointwise exponent, it is natural from our point of view to integrate the 2-microlocal frontier into this description. More precisely, we focus on the dichotomy between usual and unusual regularity, corresponding to the sets

The collection

Then, our main result states that sample paths of a Lévy process

Furthermore, the collection of sets

These results clearly extend those obtained in , since we know that the pointwise exponent is completely characterized by the 2-microlocal frontier. Moreover, they also show that, from a Hausdorff dimension point of view, the common regularity is a 2-microlocal frontier with a slope equal to one.

Self-similar processes with stationary increments (SSSI processes) have been studied for a long time, owing to their importance both in theory and in practice. Such processes appear as limits in various renormalisation procedures. In applications, they occur in fields such as hydrology, biomedicine and image processing. The simplest SSSI processes are Brownian motion and, more generally, Lévy stable motions. Apart from these cases, the best-known such process is probably fractional Brownian motion (fBm). A construction of SSSI processes that generalizes fBm to higher-order Wiener chaoses was proposed in . These processes read

where

In , we define a class of such processes by the following multiple Wiener-Itô integral representation:

where

Almost surely,

From this representation, we get several results about this class of processes. Namely:

There exists a strictly positive random variable

There exists a strictly positive random variable

Using an estimate from , we compute the uniform almost sure pointwise Hölder exponent of

We get the following result:

Almost surely,

where the coordinates are independent copies of the process

Almost surely,

And,

where
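As background for the chaos construction above, one may keep in mind the classical first-chaos (moving-average) representation of fBm and its standard order-$k$ Hermite-type generalization, recalled here with the usual exponents (the normalizing constants $C_H$, $c_{H,k}$ are left unspecified):

```latex
% First Wiener chaos: moving-average representation of fBm, H in (0,1):
B_H(t) \;=\; C_H \int_{\mathbb{R}}
  \Bigl[ (t-s)_+^{H-1/2} - (-s)_+^{H-1/2} \Bigr] \, dW(s).

% Order-k chaos (Hermite-type process, H in (1/2,1); the prime on the
% integral denotes exclusion of the diagonals x_i = x_j):
Z_H^{(k)}(t) \;=\; c_{H,k} \int_{\mathbb{R}^k}' \!\! \int_0^t
  \prod_{i=1}^{k} (s - x_i)_+^{-\left(\frac12 + \frac{1-H}{k}\right)}
  \, ds \, dW(x_1)\cdots dW(x_k).
```

Both processes are $H$-self-similar with stationary increments; for $k=1$ the second representation reduces, up to normalization, to the first.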

*In collaboration with D. La Torre, University of Milan.*

We study certain economic growth models where we add a source of randomness to make the evolution equations more realistic. We have studied two particular models:

An augmented Uzawa-Lucas growth model where technological progress
is modelled as the solution of a stochastic differential equation driven by a Lévy or an additive process.
This allows for a more faithful description of reality by taking into account discontinuities in the evolution of the level of technology. In detail, we consider a closed economy in which there is a single good, which is produced by combining physical capital

where

We assume that the level of technology evolves according to the following stochastic differential equation:

where

and

for

With a CIES utility function, the optimal inter-temporal decision problem can be formulated as

where

We have been able to solve this program under some simplifying assumptions. Numerical simulations allow one to assess precisely the effect of (tempered) multistable noise on the model.
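A minimal simulation sketch of such a noisy technology level (the jump-diffusion below, with Gaussian jump marks, is a simple stand-in for the tempered multistable noise of the actual model; all parameter values are illustrative):

```python
import numpy as np

def technology_path(rng, a0=1.0, mu=0.02, sigma=0.05,
                    jump_rate=0.5, jump_scale=0.1, T=50.0, n=5000):
    # Euler scheme for dA = A (mu dt + sigma dW + dJ), where J is a
    # compound Poisson process with centred Gaussian marks -- a simple
    # jump-diffusion stand-in for the Levy-driven technology level.
    dt = T / n
    a = np.empty(n + 1)
    a[0] = a0
    for i in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))
        dj = rng.normal(0.0, jump_scale, size=rng.poisson(jump_rate * dt)).sum()
        a[i + 1] = a[i] * (1.0 + mu * dt + sigma * dw + dj)
    return a

rng = np.random.default_rng(7)
paths = np.stack([technology_path(rng) for _ in range(100)])
mean_terminal = paths[:, -1].mean()   # roughly exp(mu * T), since the noise is centred
```

Averaging many such trajectories makes the effect of the jump component on the growth path visible, which is the kind of assessment the numerical simulations mentioned above perform on the full model.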

Stochastic demographic jump shocks in a multi-sector growth model with physical and human capital accumulation.
This model allows one to take into account sudden changes in population size, due for instance to wars or natural
catastrophes. The laws of motion of physical capital

with initial conditions

We assume that the population size evolves according to the following stochastic differential equation:

with initial condition

Here again, we are able to solve an optimisation program under some simplifying assumptions. This sheds light on the effect of demographic shocks on macroeconomic growth.

CSDL (Complex Systems Design Lab) project of the Pôle de Compétitivité SYSTEM@TIC PARIS-REGION (11/2009-10/2012). The goal of the project is the development of a scientific platform for decisional visualization in the preliminary design of complex systems. Industrial partners include Dassault Aviation, EADS, EDF, MBDA and Renault. Academic partners include ECP, Ecole des Mines de Paris, ENS Cachan, Inria and Supelec.

Erick Herbin is member of the CNRS Research Groups:

GDR Mascot Num, devoted to stochastic analysis methods for codes and numerical treatment;

GDR Math-Entreprise, devoted to mathematical modeling of industrial issues.

Regularity collaborates with Bar Ilan University on theoretical developments around the set-indexed fractional Brownian motion and set-indexed Lévy processes (invitations of Erick Herbin to Israel during five months in 2006, 2007, 2008, 2009 and 2011, and invitations of Prof. Ely Merzbach to Ecole Centrale Paris in 2008, 2009, 2010 and 2011). The PhD thesis of Alexandre Richard is co-supervised by Erick Herbin and Ely Merzbach.

Erick Herbin was invited to the Mathematics Colloquium (Bar Ilan University, Israel) in July 2012. Talk: "Hausdorff dimension of the graph of Gaussian processes".

Regularity collaborates with Michigan State University (Prof. Yimin Xiao) on the study of fine regularity of multiparameter fractional Brownian motion (invitation of Erick Herbin at East Lansing in 2010).

Regularity collaborates with St Andrews University (Prof. Kenneth Falconer) on the study of multistable processes.

Regularity collaborates with Acadia University (Prof. Franklin Mendivil) on the study of fractal strings, certain fractal sets, and the regularization dimension.

Regularity collaborates with Milan University (Prof. Davide La Torre) on the study of certain economic growth models. A joint project has just been selected within the framework of the Galilée program.

Professors Ely Merzbach from Bar Ilan University and Franklin Mendivil from Acadia University have visited the team this year.

Ankush GOYAL (from May 2012 until Jul 2012)

Subject: Stochastic calculus with multistable Lévy motion and applications in finance

Institution: IIT Delhi (India)

Jacques Lévy Véhel is associate editor of the journal *Fractals*.

Jacques Lévy Véhel was invited for two weeks at the program *Stochastic Analysis* organized by the Bernoulli
Center, Lausanne.

Erick Herbin has been head of the Mathematics Department at Ecole Centrale Paris since 2011.

Erick Herbin is in charge of the “Mathematical Modeling and Numerical Simulation” Program in the Applied Mathematics option of Ecole Centrale Paris.

Erick Herbin is in charge of the Probability course at Ecole Centrale Paris (20h).

Erick Herbin is in charge of the Advanced Probability course at Ecole Centrale Paris (30h).

Erick Herbin and Jacques Lévy Véhel are in charge of the Brownian Motion and Stochastic Calculus course at Ecole Centrale Paris (30h).

Erick Herbin gives tutorials on Real Analysis and Integration at Ecole Centrale Paris (10h).

Jacques Lévy Véhel teaches a course on Wavelets and Fractals at Ecole Centrale Nantes (8h).

Paul Balança and Alexandre Richard have been teaching assistants at Ecole Centrale Paris since October 2010:

Paul Balança gives tutorials on Probability, Real Analysis and Integration at Ecole Centrale Paris (20h).

Alexandre Richard gives tutorials on Probability and Statistics at Ecole Centrale Paris (20h).

Paul Balança and Alexandre Richard give tutorials on Advanced Probability at Ecole Centrale Paris (17h).

Paul Balança, Erick Herbin and Alexandre Richard supervise several students' research projects in the field of Mathematics at Ecole Centrale Paris.

PhD : Joachim Lebovits, Stochastic Calculus With Respect to Multifractional Brownian Motion and Applications to Finance, Université de Paris 6, defended on January 25, 2012, supervised by J. Lévy Véhel and M. Yor.

PhD in progress : Benjamin Arras, Self-similar processes in higher order chaoses, started in September 2011, supervised by J. Lévy Véhel.

PhD in progress : Paul Balança, Stochastic 2-microlocal analysis of SDEs, started in October 2010, supervised by Erick Herbin.

PhD in progress : Alexandre Richard, Regularity of set-indexed processes and construction of a set-indexed process with varying local regularity , started in October 2010, supervised by Erick Herbin and E. Merzbach.

J. Lévy Véhel has written articles for Interstices and the “le saviez-vous” page of the Inria web site.