Many phenomena of interest are analyzed and controlled through graphs or n-dimensional images. Often, these graphs have an
*irregular aspect*, whether the studied phenomenon is of natural or artificial origin. In the first class, one may cite natural landscapes, most biological signals and images (EEG, ECG, MR
images, ...), and temperature records. In the second class, prominent examples include financial logs and TCP traces.

Such irregular phenomena are usually not adequately described by purely deterministic models, and a probabilistic ingredient is often added. Stochastic processes make it possible to take into account, on a firm theoretical basis, the numerous microscopic fluctuations that shape the phenomenon.

In general, it is wrong to view irregularity as an epiphenomenon that can be conveniently dealt with by introducing randomness. In many situations, and in particular in some of the examples mentioned above, irregularity is a core ingredient that cannot be removed without destroying the phenomenon itself. In some cases, irregularity is even a necessary condition for proper functioning. A striking example is the ECG: an ECG is inherently irregular, and, moreover, in a mathematically precise sense, an *increase* in its regularity is strongly correlated with a *degradation* of the heart's condition.

In fact, in various situations, irregularity is a crucial feature that can be used to assess the behaviour of a given system. For instance, irregularity may be the result of two or more sub-systems acting concurrently to achieve some kind of equilibrium. Examples of this abound in nature (*e.g.* the sympathetic and parasympathetic systems in the regulation of the heart). For artifacts such as financial logs and TCP traffic, irregularity is in a sense an unwanted feature, since it typically makes regulation more complex. It is, however, a necessary one. For instance, efficiency in financial markets requires a constant flow of information among agents, which manifests itself through permanent fluctuations of prices: irregularity simply reflects the evolution of this information.

The aim of *Regularity* is to develop a coherent set of methods for modeling such “essentially irregular” phenomena, with a view to managing the uncertainties entailed by their irregularity.

Indeed, essential irregularity makes phenomena more difficult to study in terms of description, modeling, prediction and control. It introduces *uncertainties* both in the measurements and in the dynamics. It is, for instance, obviously easier to predict the short-time behaviour of a smooth (*e.g.* continuously differentiable) function than that of a nowhere differentiable one.

The modeling of essentially irregular phenomena is an important challenge, with an emphasis on understanding the sources and functions of this irregularity. Probabilistic tools are well-adapted to this task, provided one can design stochastic models for which the regularity can be measured and controlled precisely. Two points deserve special attention:

First, the study of regularity has to be *local*. Indeed, in most applications, one will want to act on a system based on local temporal or spatial information. For instance, detection of arrhythmias in ECG or of crashes in financial markets should be performed in “real time”, or, even better, ahead of time. In this sense, regularity is a *local* indicator of the *local* health of a system.

Second, although we have used the term “irregularity” in a generic and somewhat vague sense, it seems obvious that, in real-world phenomena, irregularity comes in many colours, and a rigorous analysis should distinguish between them. As an example, at least two kinds of irregularity are present in financial logs: the local “roughness” of the records, and the local density and height of jumps. These correspond to two different concepts of regularity (in technical terms, Hölder exponents and the local index of stability), and they each contribute in a different manner to financial risk.

In view of the above, the *Regularity* team focuses on the design of methods that:

define and study precisely various relevant measures of local regularity,

make it possible to build stochastic models versatile enough to mimic the rapid variations of the different kinds of regularity observed in real phenomena,

make it possible to estimate these regularities as precisely and rapidly as possible, so as to alert systems in charge of control.

Our aim is to address the three items above through the design of mathematical tools in the field of probability (and, to a lesser extent, statistics), and to apply these tools to uncertainty management as described in the following section. We note that we do not intend to address the problem of controlling phenomena based on their regularity, which would naturally constitute a fourth item in the list above. While we strongly believe that generic tools may be designed to measure and model regularity, and that these tools may be used to analyze real-world applications, in particular in the field of uncertainty management, control requires application-specific tools that we do not wish to develop.

The research topics of the *Regularity* team can be roughly divided into two strongly interacting axes, corresponding to two complementary ways of studying regularity:

development of tools for characterizing, measuring and estimating various notions of local regularity, with a particular emphasis on the stochastic frame,

definition and fine analysis of stochastic models for which some aspects of local regularity may be prescribed.

These two aspects are detailed in the sections below.

**Fractional Dimensions**

Although the main focus of our team is on characterizing *local* regularity, it is on occasion interesting to use a *global* index of regularity. Fractional dimensions provide such an index. In particular, the *regularization dimension*, which was defined in , is well adapted to the study of stochastic processes, as its definition makes it easy to build robust estimators. Since its introduction, the regularization dimension has been used by various teams worldwide in many different applications, including the characterization of certain stochastic processes, statistical estimation, the study of mammographies or galactograms for breast carcinoma detection, ECG analysis for the study of ventricular arrhythmia, encephalitis diagnosis from EEG, human skin analysis, discrimination between the natures of radioactive contaminations, analysis of porous media textures, well-log data analysis, agro-alimentary image analysis, road profile analysis, remote sensing, mechanical systems assessment, and analysis of video games (see
http://
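The regularization dimension itself is estimated by measuring how the length of smoothed versions of the graph grows as the smoothing scale decreases. As a minimal illustration of the idea of a global fractional index, the following sketch implements instead the closely related classical box-counting dimension of the graph of a 1-D signal (the function name and scale choices are ours, not FracLab's):

```python
import numpy as np

def box_counting_dimension(y, scales=(4, 8, 16, 32, 64)):
    """Estimate the box-counting dimension of the graph of a 1-D signal.

    The graph is covered by an s-by-s grid on the unit square; for each
    column we count the grid boxes spanned by the signal, then regress
    log(count) against log(s). A smooth curve gives an estimate near 1,
    while a very rough one approaches 2.
    """
    y = np.asarray(y, dtype=float)
    y = (y - y.min()) / (np.ptp(y) + 1e-12)   # normalize to [0, 1]
    n = len(y)
    counts = []
    for s in scales:
        count = 0
        for j in range(s):
            seg = y[j * n // s:(j + 1) * n // s]
            if seg.size == 0:
                continue
            hi = min(int(seg.max() * s), s - 1)
            lo = min(int(seg.min() * s), s - 1)
            count += hi - lo + 1              # boxes met in this column
        counts.append(count)
    slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
    return slope
```

On a straight line the estimate is close to 1, while on white noise it approaches 2, as expected for such a global roughness index.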

**Hölder exponents**

The simplest and most popular measures of local regularity are the pointwise and local Hölder exponents. For a stochastic process $X$ with continuous paths, the pointwise Hölder exponent at $t$ is defined as
$$\alpha_X(t,\omega) = \sup\left\{\alpha \ge 0 : \limsup_{h \to 0} \frac{|X(t+h,\omega) - X(t,\omega)|}{|h|^{\alpha}} = 0\right\},$$
and the local Hölder exponent as
$$\tilde{\alpha}_X(t,\omega) = \sup\left\{\alpha \ge 0 : \limsup_{u,v \to t} \frac{|X(u,\omega) - X(v,\omega)|}{|u - v|^{\alpha}} < \infty\right\}.$$
Although these quantities are in general random, we will, as is customary, omit the dependency on $\omega$.

The random functions $t \mapsto \alpha_X(t)$ and $t \mapsto \tilde{\alpha}_X(t)$ are called the pointwise and local Hölder functions of $X$.

The pointwise Hölder exponent is a very versatile tool, in the sense that the set of pointwise Hölder functions of continuous functions is quite large (it coincides with the set of lower limits of sequences of continuous functions ). In this sense, the pointwise exponent is often a more precise tool (*i.e.* it varies in a more rapid way) than the local one, since local Hölder functions are always lower semi-continuous. This is why, in particular, it is used as the basic ingredient in multifractal analysis (see section ). For certain classes of stochastic processes, and most notably Gaussian processes, it has the remarkable property of assuming, at each point, an almost sure value . SRP, mBm, and processes of this kind (see sections and ) rely on the sole use of the pointwise Hölder exponent to prescribe the regularity.

However, the pointwise exponent has some drawbacks. For instance, it is blind to oscillating behaviours: a cusp and a chirp with the same exponent are not distinguished, although their local geometries differ markedly. Another, related, drawback of the pointwise exponent is that it is not stable under integro-differentiation, which sometimes makes its use complicated in applications. In both respects, the local exponent provides a useful complement to the pointwise one.
Both exponents have proved useful in various applications, ranging from image denoising and segmentation to TCP traffic characterization. Applications require precise estimation of these exponents.
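In practice, such exponents are estimated from discrete data by regressing log-oscillations against log-scale. The following is only a minimal sketch of this oscillation-based approach for the pointwise exponent (the window radii and the regression range are illustrative choices, not a tuned estimator):

```python
import numpy as np

def pointwise_holder_estimate(y, t, radii=(2, 4, 8, 16, 32)):
    """Crude oscillation-based estimate of the pointwise Holder exponent.

    osc_r(t) = sup_{|u-t| <= r} |y(u) - y(t)| should scale like r**alpha
    near a point of exponent alpha, so alpha is read off as the slope of
    log(osc) against log(r).
    """
    y = np.asarray(y, dtype=float)
    osc = []
    for r in radii:
        lo, hi = max(0, t - r), min(len(y), t + r + 1)
        osc.append(np.abs(y[lo:hi] - y[t]).max() + 1e-12)
    slope, _ = np.polyfit(np.log(radii), np.log(osc), 1)
    return slope
```

Applied at the cusp of $t \mapsto |t - 1/2|^{0.3}$, the estimate recovers the exponent 0.3.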

**Stochastic 2-microlocal analysis**

Neither the pointwise nor the local exponents give a complete characterization of the local regularity, and, although their joint use somewhat improves the situation, it is far from yielding the complete picture.

A fuller description of local regularity is provided by the so-called *2-microlocal analysis*, introduced by J.M. Bony . In this frame, regularity at each point is now specified by two indices, which makes the analysis and estimation tasks more difficult. More precisely, a function $f$ belongs to the *2-microlocal space* $C^{s,s'}_{x_0}$ if its Littlewood-Paley decomposition $(\Delta_j f)_j$ satisfies
$$|\Delta_j f(x)| \le C\, 2^{-js} \left(1 + 2^{j}|x - x_0|\right)^{-s'}$$
for all $j$ and all $x$ in a neighbourhood of $x_0$. The set of couples $(s, s')$ such that $f$ belongs to $C^{s,s'}_{x_0}$ is summarized by the *2-microlocal spectrum*. This spectrum provides a wealth of information on the local regularity.

In , we have laid some foundations for a stochastic version of 2-microlocal analysis. We believe this will provide a fine analysis of the local regularity of random processes in a direction different from the one detailed, for instance, in . We have defined random versions of the 2-microlocal spaces, and given almost sure conditions for continuous processes to belong to such spaces. More precise results have been obtained for Gaussian processes, and a preliminary investigation of the 2-microlocal behaviour of Wiener integrals has been performed.

**Multifractal analysis of stochastic processes**

A direct use of the local regularity is often fruitful in applications. This is for instance the case in RR analysis or terrain modeling. However, in some situations it is interesting to supplement or replace it by a more global approach known as *multifractal analysis* (MA). The idea behind MA is to group together all points with the same regularity (as measured by the pointwise Hölder exponent) and to measure the “size” of the sets thus obtained , , . There are mainly two ways to do so, a geometrical and a statistical one.

In the geometrical approach, one defines the *Hausdorff multifractal spectrum* of a process or function $X$ as
$$f_h(\alpha) = \dim_H \{ t : \alpha_X(t) = \alpha \},$$
where $\dim_H$ denotes the Hausdorff dimension.

The statistical path to MA is based on the so-called *large deviation multifractal spectrum*:
$$f_g(\alpha) = \lim_{\varepsilon \to 0} \limsup_{n \to \infty} \frac{\log N_n^{\varepsilon}(\alpha)}{\log n},$$
where:
$$N_n^{\varepsilon}(\alpha) = \#\left\{ k : \alpha - \varepsilon \le \alpha_n^k \le \alpha + \varepsilon \right\}$$
and $\alpha_n^k$ is the coarse-grained exponent associated with the interval $I_n^k = [k/n, (k+1)/n)$, *i.e.*:
$$\alpha_n^k = \frac{\log |Y_n^k|}{-\log n}.$$

Here, $Y_n^k$ is some quantity measuring the variation of $X$ on $I_n^k$, for instance its increment $X((k+1)/n) - X(k/n)$ or its oscillation.
The large deviation spectrum is typically easier to compute and to estimate than the Hausdorff one. In addition, it often gives more relevant information in applications.

Under very mild conditions (*e.g.*, if the support of $f_g$ is bounded), $f_g$ may be computed through a Legendre transform, yielding the *Legendre multifractal spectrum*. To do so, one basically interprets the spectrum through the scaling function
$$\tau(q) = \liminf_{n \to \infty} \frac{\log \sum_k |Y_n^k|^q}{-\log n},$$
with the convention $|Y_n^k|^q = 0$ when $Y_n^k = 0$.

The Legendre multifractal spectrum of $X$ is then defined as
$$f_l(\alpha) = \inf_q \left( q\alpha - \tau(q) \right).$$

To see the relation between $f_g$ and $f_l$, note that $f_l$ is concave by construction: one always has $f_l \ge f_g$, and, under mild assumptions, $f_l$ is the concave hull of $f_g$. When the equality $f_g = f_l$ holds, one says that the *weak multifractal formalism* holds, *i.e.* the large deviation spectrum may be computed through a Legendre transform. When, in addition, $f_h = f_g = f_l$, one speaks of the *strong multifractal formalism*.
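The chain “partition sums, then scaling function, then Legendre transform” can be sketched numerically as follows, using box oscillations as the quantities $Y_n^k$. This is only an illustrative implementation with arbitrary scale and moment grids, not FracLab's estimator:

```python
import numpy as np

def legendre_spectrum(y, qs=np.linspace(-5, 5, 41), scales=(2, 4, 8, 16, 32, 64)):
    """Sketch of the Legendre multifractal spectrum from box oscillations.

    tau(q) is fitted from the scaling of the partition sums
    S_eps(q) = sum_i |osc of y on the i-th box of size eps|**q ~ eps**tau(q),
    and the spectrum is the Legendre transform f(alpha) = inf_q(q*alpha - tau(q)),
    computed here via alpha = tau'(q).
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    taus = []
    for q in qs:
        logS = []
        for s in scales:
            osc = np.array([np.ptp(y[i * n // s:(i + 1) * n // s]) for i in range(s)])
            osc = osc[osc > 0]                      # convention: 0**q := 0
            logS.append(np.log((osc ** q).sum()))
        slope, _ = np.polyfit(np.log(1.0 / np.array(scales)), logS, 1)
        taus.append(slope)
    taus = np.array(taus)
    alphas = np.gradient(taus, qs)                  # alpha(q) = tau'(q)
    return alphas, qs * alphas - taus               # (alpha, f_l(alpha))
```

On a monofractal signal the spectrum degenerates as expected: for a smooth increasing function, all exponents concentrate near 1 and $f_l \approx 1$ there.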

Multifractal spectra subsume a lot of information about the distribution of the regularity, which has proved useful in various situations. A most notable example is the strong correlation reported in several recent works between a narrowing of the multifractal spectrum of the ECG and certain pathologies of the heart , . Let us also mention the multifractality of TCP traffic, which has been both observed experimentally and proved on simplified models of TCP , .

**Another colour in local regularity: jumps**

As noted above, apart from Hölder exponents and their generalizations, at least one other type of irregularity may sometimes be observed in real phenomena: discontinuities, which occur for instance in financial logs and certain biomedical signals. In this frame, it is of interest to supplement Hölder exponents and their extensions with (at least) an additional index measuring the local intensity and size of jumps. This is a topic we intend to pursue in full generality in the near future. So far, we have developed an approach in the particular frame of *multistable processes*. We refer to section for more details.

The second axis in the theoretical developments of the *Regularity* team aims at defining and studying stochastic processes for which various aspects of the local regularity may be prescribed.

**Multifractional Brownian motion**

One of the simplest stochastic processes for which some kind of control over the Hölder exponents is possible is probably fractional Brownian motion (fBm). This process was defined by Kolmogorov and further studied by Mandelbrot and Van Ness, followed by many authors. The so-called “moving average” definition of fBm reads as follows:
$$B_H(t) = \int_{-\infty}^{0} \left[ (t-u)^{H-1/2} - (-u)^{H-1/2} \right] \mathbb{W}(du) + \int_{0}^{t} (t-u)^{H-1/2}\, \mathbb{W}(du),$$
where $\mathbb{W}$ denotes the real white noise, $H \in (0,1)$ is the Hurst parameter, and a normalizing constant is omitted.

Although varying $H$ makes it possible to adjust the regularity of fBm, this regularity is the same all along the path: almost surely, the pointwise Hölder exponent of fBm equals $H$ at every point. fBm is thus not versatile enough for phenomena whose regularity varies in time or space.
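Since fBm is a centered Gaussian process with covariance $E[B_H(t)B_H(s)] = \frac{1}{2}(t^{2H} + s^{2H} - |t-s|^{2H})$, it can be simulated exactly, for instance by a Cholesky factorization of the covariance matrix. The sketch below is the simple $O(n^3)$ version (circulant embedding would be the efficient alternative); the function name and grid are ours:

```python
import numpy as np

def fbm_cholesky(n, H, seed=0):
    """Exact simulation of fractional Brownian motion on [0, 1].

    Builds the fBm covariance matrix on a regular grid and applies its
    Cholesky factor to a standard Gaussian vector. O(n^3): fine for
    illustration only.
    """
    t = np.linspace(1.0 / n, 1.0, n)
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s ** (2 * H) + u ** (2 * H) - np.abs(s - u) ** (2 * H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))   # jitter for stability
    rng = np.random.default_rng(seed)
    return np.concatenate([[0.0], L @ rng.standard_normal(n)])
```

By construction the marginal at $t = 1$ has unit variance, whatever the value of $H$.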

It is possible to generalize fBm to obtain a Gaussian process for which the pointwise Hölder exponent may be tuned at each point: the *multifractional Brownian motion (mBm)* is such an extension, obtained by substituting the constant parameter $H$ with a *regularity function* $t \mapsto h(t)$ ranging in $(0,1)$.

mBm was introduced independently by two groups of authors: on the one hand, Peltier and Lévy Véhel defined the mBm from the moving average representation of fBm, replacing $H$ with $h(t)$. On the other hand, Benassi, Jaffard and Roux defined the mBm from the harmonizable representation of fBm, *i.e.*:
$$X(t) = \int_{\mathbb{R}} \frac{e^{it\xi} - 1}{|\xi|^{h(t)+1/2}}\, \widehat{\mathbb{W}}(d\xi),$$
where $\widehat{\mathbb{W}}$ denotes the Fourier transform of the white noise (again up to a normalizing constant).
The Hölder exponents of the mBm are prescribed almost surely: at each point $t$, the pointwise Hölder exponent is almost surely equal to $h(t) \wedge \alpha_h(t)$, where $\alpha_h(t)$ is the pointwise Hölder exponent of the function $h$ at $t$; in particular, when $h$ is smoother than the process, the regularity at $t$ is simply $h(t)$.
The fact that the local regularity of mBm may be tuned *via* a functional parameter has made it a useful model in various areas such as finance, biomedicine, geophysics and image analysis. A large number of studies have been devoted worldwide to its mathematical properties, including in particular its local time. In addition, there is now a rather strong body of work dealing with the estimation of its functional parameter, *i.e.* its local regularity. See
http://

**Self-regulating processes**

We have recently introduced another class of stochastic models, inspired by mBm, but where the local regularity, instead of being tuned “exogenously”, is a function of the amplitude. In other words, at each point $t$, the pointwise Hölder exponent of the process $Z$ satisfies $\alpha_Z(t) = g(Z(t))$ almost surely, where $g$ is a fixed deterministic function: such a process is called a *self-regulating process* (SRP). The particular process obtained by adapting mBm adequately is called the self-regulating multifractional process . Another instance is given by modifying the Lévy construction of Brownian motion . The motivation for introducing self-regulating processes is based on the following general fact: in nature, the local regularity of a phenomenon is often related to its amplitude. An intuitive example is provided by natural terrains: in young mountains, regions at higher altitudes are typically more irregular than regions at lower altitudes. We have verified this fact experimentally on several digital elevation models . Other natural phenomena displaying a relation between amplitude and exponent include temperature records and RR intervals extracted from ECG .

To build the SRMP, one starts from a field of fractional Brownian motions $B(t,H)$ indexed by time and by the Hurst exponent. The process is then obtained as the fixed point of a map which, to a function $Z$, associates the affine rescaling between prescribed bounds of $t \mapsto B(t, g(Z(t)))$; this fixed point exists and is unique under mild assumptions on $g$. An example of a two-dimensional SRMP, with a function $g$ chosen to mimic natural terrain, is displayed in the figure.

We believe that SRP open a whole new and very promising area of research.
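As a toy illustration of the self-regulating idea (and only an illustration: this is not the SRMP construction), one can modify the Lévy midpoint-displacement construction of Brownian motion so that the size of the perturbation at dyadic scale $2^{-k}$ depends on the current amplitude through a function $g$; all names and the particular $g$ below are ours:

```python
import numpy as np

def self_regulating_midpoint(levels=12, g=lambda z: 0.2 + 0.6 / (1.0 + z ** 2), seed=0):
    """Toy 'self-regulating' midpoint-displacement sketch (not the SRMP).

    As in the Levy construction of Brownian motion, midpoints are
    perturbed at successive dyadic scales, but the perturbation at a
    point is scaled by 2**(-k * g(z)) where z is the current amplitude:
    where g is small (here, at large amplitudes) the path is rougher.
    """
    rng = np.random.default_rng(seed)
    y = np.array([0.0, 0.0])
    for k in range(1, levels + 1):
        mid = 0.5 * (y[:-1] + y[1:])
        mid += rng.standard_normal(len(mid)) * 2.0 ** (-k * g(mid))
        out = np.empty(2 * len(y) - 1)
        out[0::2], out[1::2] = y, mid     # interleave old points and midpoints
        y = out
    return y
```

With this choice of $g$, high-amplitude regions receive relatively larger fine-scale perturbations, mimicking the "young mountains" behaviour described above.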

**Multistable processes**

Non-continuous phenomena are commonly encountered in real-world applications, *e.g.* financial records or EEG traces. For such processes, the information brought by the Hölder exponent must be supplemented by some measure of the density and size of jumps. Stochastic processes with jumps, and in particular Lévy processes, are currently an active area of research.

The simplest class of non-continuous Lévy processes is perhaps that of stable processes . These are mainly characterized by a parameter $\alpha \in (0,2]$ called the *stability index* ($\alpha = 2$ corresponds to the Gaussian case): the smaller $\alpha$, the heavier the tails of the distribution and the wilder the jumps.
In line with our quest for the characterization and modeling of various notions of local regularity, we have defined *multistable processes*. These are processes which are “locally” stable, but where the stability index $\alpha$ is now a function of time. More formally, a multistable process $Y$ is a process which is, at each time $u$, tangent to a stable process with index $\alpha(u)$: for some exponent $h(u)$,
$$\lim_{r \to 0} \frac{Y(u + rt) - Y(u)}{r^{h(u)}} = Y'_u(t),$$
where the tangent process $Y'_u$ is $\alpha(u)$-stable and the limit is understood either in finite dimensional distributions or in the stronger sense of distributions.
One approach to defining multistable processes is similar to the one developed for constructing mBm : we consider fields of stochastic processes $X(t, \alpha)$, indexed by time and by the stability index, and obtain a multistable process by taking the “diagonal” $Y(t) = X(t, \alpha(t))$.
A particular class of multistable processes, termed “linear multistable multifractional motions” (lmmm), takes the form of a moving-average representation driven by a multistable random measure, in which both the regularity function $h$ and the stability function $\alpha$ vary along the path , . In fact, lmmm are somewhat more general than said above: the couple of functions $(h, \alpha)$ may be prescribed simultaneously, so that lmmm display at each point both a tunable Hölder regularity and a tunable jump intensity.

**Multiparameter processes**

In order to use stochastic processes to represent the variability of multidimensional phenomena, it is necessary to define extensions with indices ranging in $\mathbb{R}^N_+$ or in more general sets.

These works have highlighted the difficulty of giving satisfactory definitions of increment stationarity, Hölder continuity and covariance structure that do not depend closely on the structure of the indexing set.

A promising improvement in the definition of multiparameter extensions is the concept of *set-indexed processes*. A set-indexed process is a process whose indices are no longer “times” or “locations” but may be certain compact connected subsets of a metric measure space. In the simplest case, this framework is a generalization of classical multiparameter processes : usual multiparameter processes are set-indexed processes where the indexing subsets are simply the rectangles $[0, t]$, $t \in \mathbb{R}^N_+$.
Set-indexed processes allow for greater flexibility, and should in particular be useful for the modeling of censored data. This situation occurs frequently in biology and medicine, since, for instance, data may not be constantly monitored. Censored data also appear in natural terrain modeling when data are acquired from sensors in presence of hidden areas. In these contexts, set-indexed models should constitute a relevant frame.

A set-indexed extension of fBm is the first step toward the modeling of irregular phenomena within this more general frame. In , the so-called *set-indexed fractional Brownian motion (sifBm)* was defined as the mean-zero Gaussian process $\{\mathbf{B}^{H}_{U}\}_{U}$ with covariance
$$E\big[\mathbf{B}^{H}_{U}\,\mathbf{B}^{H}_{V}\big] = \tfrac{1}{2}\left[m(U)^{2H} + m(V)^{2H} - m(U \bigtriangleup V)^{2H}\right],$$
where $m$ is a measure on the indexing space, $\bigtriangleup$ denotes the symmetric difference of sets, and $H \in (0, 1/2]$.

This process appears to be the only set-indexed process whose projections on increasing paths are one-parameter fractional Brownian motions . The construction also provides a way to define extensions of fBm on non-Euclidean spaces, *e.g.* with indices belonging to the unit hyper-sphere.

In the specific case where the indexing collection consists of the rectangles $[0, t]$ of $\mathbb{R}^N_+$, the resulting process is called the *multiparameter fractional Brownian motion (MpfBm)*. This process differs from the Lévy fractional Brownian motion and from the fractional Brownian sheet, which are also multiparameter extensions of fBm (but do not derive from set-indexed processes). The local behaviour of the sample paths of the MpfBm has been studied in , together with its self-similarity index and the regularity of its sample paths.

The increment stationarity property for set-indexed processes, previously defined in the study of the sifBm, makes it possible to consider set-indexed processes whose increments are independent and stationary. This generalizes the definitions of Bass-Pyke and Adler-Feigin for Lévy processes indexed by subsets of $\mathbb{R}^N$.

Our theoretical works are motivated by, and find natural applications to, real-world problems in a general frame referred to as uncertainty management, which we describe now.

Over the past few decades, modeling has played an increasing role in complex systems design in various fields of industry, such as automobile, aeronautics and energy. Industrial design involves several levels of modeling, from behavioural models in preliminary design to finite-element models aiming at sharply representing physical phenomena. Nowadays, the fundamental challenge of numerical simulation is to design physical systems while saving on experimentation steps.

As an example, at the early stage of conception in aeronautics, numerical simulation aims at exploring the design parameter space and setting the global variables so that target performances are satisfied. This iterative procedure needs fast multiphysics models. These simplified models are usually calibrated using high-fidelity models or experiments. At each of these levels, modeling requires control of the uncertainties due to simplifications of models, numerical errors, data imprecision, variability of surrounding conditions, etc.

One dilemma of design by numerical simulation is that many crucial choices are made very early, when uncertainties are at their maximum, and that these choices have a fundamental impact on the final performances.

Classically, coping with this variability is achieved through *model registration*: one experiments and adds fixed *margins* to the model response. In view of technical and economical performance, it appears judicious to replace these fixed margins by a rigorous analysis and control of risk. This may be achieved through a probabilistic approach to uncertainties, which provides decision criteria adapted to the management of the unpredictability inherent to design issues.

From the particular case of aircraft design emerge several general aspects of the management of uncertainties in simulation. Probabilistic decision criteria, which translate decision making into mathematical/probabilistic terms, require the following three steps to be considered:

build a probabilistic description of the fluctuations of the model's parameters (*Quantification* of uncertainty sources),

deduce the implications of these distribution laws on the model's response (*Propagation* of uncertainties),

and determine the specific influence of each uncertainty source on the model's response variability (*Sensitivity Analysis*).
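The three steps above can be sketched, in their crudest Monte Carlo form, as follows. The squared-correlation sensitivity index used here is a simple stand-in for the Sobol indices one would compute in practice, and all names are illustrative:

```python
import numpy as np

def propagate(model, dists, n=10000, seed=0):
    """Monte Carlo sketch of the quantification/propagation/sensitivity chain.

    dists: dict name -> sampler(rng, n) encoding each input's uncertainty
    (quantification); the response distribution is estimated by sampling
    (propagation); a crude per-input sensitivity index is the squared
    correlation between input and response (sensitivity analysis).
    """
    rng = np.random.default_rng(seed)
    samples = {k: d(rng, n) for k, d in dists.items()}
    y = model(**samples)
    sens = {k: float(np.corrcoef(v, y)[0, 1] ** 2) for k, v in samples.items()}
    return float(y.mean()), float(y.std()), sens
```

For a linear model $y = a + 3b$ with standard Gaussian inputs, the response standard deviation is $\sqrt{10}$ and the second input accounts for 90% of the variance, which the sketch recovers.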

The previous analysis now constitutes the framework of a general study of uncertainties. It is used in industrial contexts where uncertainties can be represented by *random variables* (the unknown temperature of an external surface, physical quantities of a given material, etc., at a given *fixed time*). However, for numerical models to describe a phenomenon with high fidelity, the relevant uncertainties must generally depend on time or space variables. Consequently, one has to tackle the following issues:

*How to capture the distribution law of time (or space) dependent parameters, without directly accessible data?* The probability distribution of the continuous-time (or space) uncertainty sources must describe the links between variations at neighbouring times (or points). The local and global regularity are important parameters of these laws, since they describe how fluctuations at some time (or point) induce fluctuations at close times (or points). The continuous equations representing the studied phenomena should help *to propose models for the law of the random fields*. Let us note that interactions between the various levels of modeling might also be used to derive probability distributions at the lowest one.

Navigating between the various natures of models requires a kind of *metric* that could *mathematically describe the notion of granularity or fineness* of the models. Of course, local regularity will not be absent from this mathematical definition.

All the various levels of conception, from preliminary design to high-fidelity modeling, require *registration by experimentation* to reduce model errors. This *calibration* issue has long been present in this frame, especially in a deterministic optimization context. The random modeling of uncertainty requires the definition of a systematic approach. The difficulties in this specific context are: statistical estimation with few data, and estimation of a function of continuous variables using only a discrete set of values.

Moreover, a multi-physics context must be added to these questions. Complex system design is most often located at the interface between several disciplines. In that case, modeling relies on a coupling between several models of the various phenomena, and design becomes a *multidisciplinary optimization* problem. In this uncertainty context, the real challenge is robust optimization, in order to manage technical and economical risks (risk of non-satisfaction of technical specifications, cost control).

We participate in the uncertainties community through several collaborative research projects (ANR and Pôle SYSTEM@TIC), and also through our involvement in the MASCOT-NUM research group (GDR of CNRS). In addition, we are considering probabilistic models as phenomenological models to cope with uncertainties in the DIGITEO ANIFRAC project. As explained above, we focus on essentially irregular phenomena, for which irregularity is a relevant quantity to capture the variability (e.g. certain biomedical signals, terrain modeling, financial data, etc.). These will be modeled through stochastic processes with prescribed regularity.

The design of a complex (mechanical) system such as an aircraft, an automobile or a nuclear plant involves the numerical simulation of several interacting physical phenomena: CFD and structural dynamics, the thermal evolution of a fluid circulation, etc. For instance, these can represent the resolution of coupled partial differential equations using the finite element method. In the framework of uncertainty treatment, the studied “phenomenological model” is a chaining of different models representing the various physical phenomena involved. As an example, the pressure field on an aircraft wing is the result of both aerodynamic and structural mechanical phenomena. Let us consider the particular case of two models of partial differential equations coupled by boundary conditions. The direct propagation of uncertainties is impossible, since it requires an exploration and hence many calls to costly models. As a solution, engineers usually build reduced-order models: the complex high-fidelity model is substituted with a less CPU-costly model. The uncertainty propagation is then realized through the simplified model, taking into account the approximation error (see ).

Interactions between the various models are usually made explicit at the finest level (cf. Fig. ). How may this coupling be formulated when the fine structures of exchange have disappeared during model reduction? How can the interactions between models at different levels be expressed (in multi-level modeling)? The ultimate question is: how to choose the right level of modeling with respect to performance requirements?

In the multi-physical numerical simulation, two kinds of uncertainties then coexist: the uncertainty due to substitution of high-fidelity models with approximated reduced-order models, and the uncertainty due to the new coupling structure between reduced-order models.

According to the previous discussion, uncertainty treatment in multi-physics and multi-level modeling raises a large range of issues, for instance the numerical resolution of PDEs (which does not enter into the research topics of *Regularity*). Our goal is to contribute to the theoretical arsenal that makes it possible to move among the different levels of modeling (and hence among the existing numerical simulations). We will focus on the following three axes:

In the case of a phenomenon represented by two coupled partial differential equations whose resolution is represented by reduced-order models, how to define a probabilistic model of the coupling errors? In connection with our theoretical development, we plan to characterize the regularity of this error in order to quantify its distribution. This research axis is supported by an ANR grant (OPUS project).

Multi-level modeling assumes the ability to choose the right level of detail for the models, in adequacy with the goals of the study. To do so, a rigorous mathematical definition of the notion of *model fineness/granularity* would be very helpful. Again, a precise analysis of the fine regularity of stochastic models is expected to give elements toward such a definition. This research axis is supported by a Pôle SYSTEM@TIC grant (EHPOC project), and also by a collaboration with EADS.

Some fine characteristics of the phenomenological model may be used to define the probabilistic behaviour of its variability. The action of modeling a phenomenon can be seen as an interpolation issue between given observations. This interpolation can be driven by physical evolution equations or by a fine analytical description of the physical quantities. We are convinced that Hölder regularity is an essential parameter in this context, since it captures how variations at a given point induce variations at its neighbours. Stochastic processes with prescribed regularity (see section ) have already been used to represent various fluctuating phenomena: Internet traffic, financial data, ocean floors. We believe that these models should be relevant to describe solutions of PDEs perturbed by uncertain (random) coefficients or boundary conditions. This research axis is supported by a Pôle SYSTEM@TIC grant (CSDL project).

**ECG analysis and modeling**

ECG and signals derived from it are an important source of information for the detection of various pathologies, including *e.g.* congestive heart failure, arrhythmia and sleep apnea. The fact that the irregularity of the ECG bears information on the condition of the heart is well documented (see *e.g.* the web resource
http://

First, we use refined regularity characterizations, such as the regularization dimension, 2-microlocal analysis and advanced multifractal spectra for a more precise analysis of ECG data. This requires in particular to test current estimation procedures and to develop new ones.

Second, we build stochastic processes that mimic in a faithful way some features of the dynamics of ECG. For instance, the local regularity of RR intervals, estimated parametrically based on an mBm model, displays correlations with the amplitude of the signal, a feature that seems to have remained unobserved so far . In other words, RR intervals behave as SRP. We believe that modeling in a simplified way some aspects of the interplay between the sympathetic and parasympathetic systems might lead to an SRP, and explain both this self-regulating property and the reasons behind the observed multifractality of the records. This would open the way to understanding how these properties evolve under abnormal behaviour.

**Pharmacodynamics and patient drug compliance**

Poor adherence to treatment is a worldwide problem that threatens the efficacy of therapy, particularly in the case of chronic diseases. Classical pharmacokinetic models assume perfect compliance, *i.e.* that drugs are administered at a fixed dosage. However, the drug concentration-time curve is often influenced by the random drug input generated by poor patient adherence, inducing erratic therapeutic outcomes. Following work already started in Montréal , , we consider the stochastic processes induced by taking into account the random drug intake caused by various compliance patterns. Such studies have been made possible by technological progress, such as the “medication event monitoring system”, which makes it possible to obtain data describing the behaviour of patients.

We use different approaches to study this problem: statistical methods where enough data are available, and model-based ones in the presence of a qualitative description of the patient's behaviour. In the latter case, piecewise deterministic Markov processes (PDP) seem a promising path. PDP are non-diffusion processes whose evolution follows a deterministic trajectory governed by a flow between random time instants, at which they undergo a jump according to some probability measure . There is a well-developed theory for PDP, which studies stochastic properties such as the extended generator, the Dynkin formula and long-time behaviour. It is easy to cast a simplified model of non-compliance in terms of PDP. This has already allowed us to obtain certain properties of interest of the random drug concentration . In the simplest case of a Poisson distribution, we have obtained rather precise results that also point to a surprising connection with infinite Bernoulli convolutions , , . Statistical aspects remain to be investigated in the general case.
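A minimal sketch of such a PDP: between intakes, the concentration follows the deterministic one-compartment flow $C' = -k\,C$, and intakes occur at the jump times of a Poisson process. All parameter values and names below are illustrative, not taken from the cited studies:

```python
import numpy as np

def simulate_pdp_concentration(rate=1.0, dose=1.0, k_elim=0.5, t_end=200.0, seed=0):
    """Piecewise deterministic sketch of drug concentration under random intake.

    Deterministic flow: exponential elimination between intakes.
    Random jumps: intake times form a Poisson process of the given rate,
    and each intake instantaneously adds `dose` to the concentration.
    Returns the (time, concentration) samples just after each intake.
    """
    rng = np.random.default_rng(seed)
    t, c = 0.0, 0.0
    times, concs = [t], [c]
    while t < t_end:
        dt = rng.exponential(1.0 / rate)   # waiting time to next intake
        t += dt
        c *= np.exp(-k_elim * dt)          # deterministic decay along the flow
        c += dose                          # jump at the intake instant
        times.append(t)
        concs.append(c)
    return np.array(times), np.array(concs)
```

A quick sanity check: by mass balance, the long-run average concentration is rate x dose / k_elim (here 2), so post-intake values fluctuate around 3.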

FracLab was developed for two main purposes:

propose a general platform allowing research teams to avoid re-coding basic and advanced techniques for the processing of signals based on (local) regularity,

provide state-of-the-art algorithms, both to disseminate new methods in this area and to allow results to be compared on a common basis.

FracLab is a general purpose signal and image processing toolbox based on fractal, multifractal and local regularity methods. FracLab can be approached from two different perspectives:

(multi-) fractal and local regularity analysis: A large number of procedures make it possible to compute various quantities associated with 1D or 2D signals, such as dimensions, Hölder and 2-microlocal exponents or multifractal spectra.

Signal/Image processing: Alternatively, one can use FracLab directly to perform many basic tasks in signal processing, including estimation, detection, denoising, modeling, segmentation, classification, and synthesis.

A graphical interface makes FracLab easy to use and intuitive. In addition, various wavelet-related tools are available in FracLab.
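To give a flavour of the regularity-analysis side, here is a minimal, self-contained sketch (in Python rather than MATLAB, and not FracLab's actual estimator) of the basic idea behind pointwise Hölder exponent estimation: the oscillation of an $\alpha$-Hölder function around a point behaves like $r^{\alpha}$, so the exponent can be read off as a log-log slope.

```python
import math

def holder_exponent(f, t0, radii):
    """Rough estimate of the pointwise Hölder exponent of f at t0.

    The oscillation osc(r) = sup_{|u - t0| <= r} |f(u) - f(t0)| behaves
    like r^alpha for a function that is alpha-Hölder at t0; we fit the
    slope of log osc(r) against log r by least squares.  This is a
    naive sketch of the principle, not FracLab's actual algorithm.
    """
    xs, ys = [], []
    for r in radii:
        n = 200  # sample [t0 - r, t0 + r] on a fine grid
        osc = max(abs(f(t0 + r * (2 * k / n - 1)) - f(t0)) for k in range(n + 1))
        xs.append(math.log(r))
        ys.append(math.log(osc))
    # least-squares slope of log-oscillation against log-radius
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

radii = [2.0 ** (-k) for k in range(3, 10)]
alpha = holder_exponent(lambda t: abs(t) ** 0.5, 0.0, radii)  # true exponent 0.5
```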

FracLab is free software. It mainly consists of routines developed in MATLAB, or in C interfaced with MATLAB. It runs under Linux, MacOS and Windows environments. In addition, a
“stand-alone” version (
*i.e.* one which does not require MATLAB to run) is available.

FracLab has been downloaded several thousand times in recent years by users all around the world. A few dozen laboratories seem to use it regularly, with more than two hundred
registered users. Our ambition is to make it the standard fractal software for signal and image processing applications. We have signs that this is starting to become the case. To date, its
use has been acknowledged in more than two hundred research papers in areas as varied as astrophysics, chemical engineering, financial modeling, fluid dynamics, internet and road traffic
analysis, image and signal processing, geophysics, biomedical applications, and computer science, as well as in mathematical studies in analysis and statistics (see
http://

The purpose of this work is to build a stochastic calculus with respect to multifractional Brownian motion (mBm), with a view to applications in finance and particularly to stochastic volatility models. We use an approach based on white noise theory.
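For reference, a standard harmonizable representation of mBm, on which white-noise approaches are usually built, is (with $h$ the regularity function and $C(\cdot)$ a normalizing constant; the notation may differ slightly from the one used in this work):

```latex
B^{h}(t) \;=\; \frac{1}{C\big(h(t)\big)} \int_{\mathbb{R}}
  \frac{e^{\,it\xi} - 1}{|\xi|^{\,h(t)+\frac{1}{2}}}\,
  \widetilde{W}(\mathrm{d}\xi), \qquad t \ge 0,
```

where $\widetilde{W}$ is a complex-valued white noise; taking $h$ constant recovers fractional Brownian motion.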

where

Hence we define the integral with respect to mBm of any process

where

for functions with sub-exponential growth, and where the last equality holds in

Once this stochastic calculus with respect to mBm is defined, we can solve differential equations arising in mathematical finance.

The results of this part may be found in . We assume that, under the risk-neutral measure, the forward price of a risky asset is the solution of the S.D.E.

where

where

Since the solution of the previous S.D.E. is not explicit for

*In collaboration with Prof. Ely Merzbach (Bar Ilan University, Israel).*

In
, the class of set-indexed Lévy processes is considered using the
stationarity property defined for the set-indexed fractional Brownian motion in
. Following Ivanoff-Merzbach's definitions of an indexing
collection

a set-indexed process
*set-indexed Lévy process* if the following conditions hold:

the increments of

In contrast to previous works by Adler and Feigin (1984) on the one hand, and Bass and Pyke (1984) on the other hand, the increment stationarity property makes it possible to obtain explicit expressions for the finite-dimensional distributions of a set-indexed Lévy process. From these, we obtained a complete characterization in terms of Markov properties.

The question of continuity is more complex in the set-indexed setting than for real-parameter stochastic processes. For instance, the set-indexed Brownian motion can fail to be continuous for some indexing collections. We consider a weaker form of continuity, which concerns the possibility of point jumps.

The
*point mass jump* of a set-indexed function

and for each
*pointwise-continuous* if

**Theorem**
*Let
$\{{X}_{U};\phantom{\rule{0.277778em}{0ex}}U\in \mathcal{A}\}$be a set-indexed Lévy process with Gaussian increments. Then for any
${U}_{max}\in \mathcal{A}$such that
$m\left({U}_{max}\right)<+\infty $, the sample paths of
$X$are almost surely pointwise-continuous inside
${U}_{max}$, i.e.*


In the general case, for all

for all

**Theorem**
*Let
$(\sigma ,\gamma ,\nu )$ be the generating triplet of the SI Lévy process
$X$.*

*Then
$X$can be decomposed as*

*where*

*where
${N}_{U}$is defined in (
) and the last term of (
) converges uniformly in
$U\subset {U}_{max}$(for any given
${U}_{max}\in \mathcal{A}$) as
$\u03f5\downarrow 0$,*

*and the processes
${X}^{\left(0\right)}$and
${X}^{\left(1\right)}$are independent.*

*In collaboration with Prof. Ely Merzbach (Bar Ilan University, Israel).*

In the set-indexed framework of Ivanoff and Merzbach (
), stochastic processes can be indexed not only by

Once this condition is established, we investigate the definition of Hölder coefficients for SI processes. From the real-parameter case, the most straightforward are the local (and
pointwise) Hölder exponents around

When the processes are Gaussian, a deterministic counterpart to this exponent is defined as it is in the real-parameter framework. For all

Given the particular structure of

On specific subclasses

and this definition is proved to be independent of

The last exponent which is studied is the exponent of pointwise continuity:

for all

All these results are finally applied to the SIfBm and the SI Ornstein-Uhlenbeck process.

The 2-microlocal frontier gives a more complete picture of the regularity than classical pointwise and local Hölder exponents, which are widely used in the literature. Furthermore, it is stable under the action of (pseudo-)differential operators.

Our main goal was therefore to extend this result to any stochastic integral

where

where for any process

where

with the usual convention

As the previous result is based on the Dubins-Schwarz representation theorem, it can easily be extended to characterize the regularity of time-changed multifractional Brownian motions. In this
case, we obtain a similar equation where

Using this last equality, we can obtain the regularity of the stochastic integral

In the particular case of an integration with respect to a Brownian motion

if

if

unless

Based on this last characterization, we were able to study the regularity of stochastic diffusions. In particular, we illustrated our approach with the square of

This year, we concentrated on the following points:

Define a new type of multistable processes called tempered multistable processes.

Study the short time and long time behaviors of tempered multistable processes.

Compare the multistable Lévy processes defined by finite-dimensional distributions (characteristic functions), Poisson representation and series representation.

The idea behind the construction of tempered multistable measures and processes comes from the paper . The interest of such processes is that they may be chosen to have moments of all orders. In addition, they are martingales. This will make it possible to construct stochastic (partial) differential equations driven by tempered multistable measures, which may be used to describe certain physical phenomena.

The characteristic function of a tempered multistable process

We have investigated the long-time and short-time behaviours of this process:

Short time behavior:

Let

Then when

in the sense of finite-dimensional distributions, where

and

Long time behavior:

Let

in the sense of finite-dimensional distributions, where

Let us now describe our work on the multistable Lévy motion. For

There also exists a Poisson representation of the multistable Lévy process

where

Finally, the series representation of multistable Lévy motion

where

We have proved that these three definitions yield the same process in law.
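To make the series representation concrete, here is a rough, truncated LePage-type sketch (the truncation level is arbitrary and the normalising constant is omitted; this is an illustration of the construction, not the exact representation from the paper):

```python
import random

def multistable_levy_path(alpha, n_points=100, n_terms=2000, seed=0):
    """Truncated LePage-type series sketch of a (multi)stable Lévy
    motion on [0, 1].

    Gamma_i are the arrival times of a unit-rate Poisson process, U_i
    are uniform jump locations, and the signs are Rademacher.  The
    stability index alpha(.) is evaluated at each jump location, so
    different regions of the path behave with different indices.
    """
    rng = random.Random(seed)
    gamma, jumps = 0.0, []
    for _ in range(n_terms):
        gamma += rng.expovariate(1.0)             # Poisson arrival Gamma_i
        u = rng.random()                          # jump location U_i
        sign = 1.0 if rng.random() < 0.5 else -1.0
        jumps.append((u, sign * gamma ** (-1.0 / alpha(u))))
    ts = [k / (n_points - 1) for k in range(n_points)]
    return [sum(size for (u, size) in jumps if u <= t) for t in ts]

# index decreasing from 1.8 to 1.2: the path gets jumpier as t grows
path = multistable_levy_path(lambda t: 1.8 - 0.6 * t)
```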

*In collaboration with Prof. Franklin Mendivil (Acadia University, Canada).*

We have extended the definition of fractal strings originally proposed in and modified in to deal with the local behaviour of fractal sets. This makes it possible to analyze the pointwise oscillatory properties of locally self-similar sets.

We have also analyzed in detail the structure of a set built by "stacking" Cantor sets with continuously varying dimensions (see figure
). The resulting set, called the "Christiane's hair" set, or CH set, displays a number
of interesting properties. Each "strand of hair" is a

*In collaboration with P.E. Lévy Véhel (University of Nice-Sophia-Antipolis and Banque Postale).*

In the past two years, we have developed models for investigating the probability distribution of drug concentration in the case of non-compliance. We have focused on two aspects of practical relevance: the *variability* of the concentration and the *regularity* of its probability distribution. A first article, in a series of three, considers the case of multi-intravenous dosing using the simplest possible law to model random drug intake, *i.e.* a homogeneous Poisson distribution. A second article considers the more realistic multi-oral model, and deals with the complications brought by the first-order kinetics, which are essentially technical. Finally, in the third, we place ourselves in a powerful mathematical frame, known as *Piecewise Deterministic Markov processes* (PDMP), that allows us to deal with general drug intake schedules, going beyond the homogeneous Poisson case. We use a PDMP to model the drug concentration in the case of multiple intravenous doses. In this particular model, we consider that the dose administration regimen is modeled by a non-homogeneous Poisson process whose jump rate is controlled by means of a Markov chain. In this sense our PDMP model is a generalization of the continuous models studied in the first two articles. In the following we detail our PDMP model and the results obtained in the multi-IV case.
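As a toy illustration of a dosing regimen whose Poisson intake rate is modulated by a Markov chain, the following sketch simulates the dose times (all parameter values and the two-state "compliant/non-compliant" interpretation are illustrative placeholders, not values from the cited work):

```python
import random

def mmpp_dosing_times(T, rates, q, seed=0):
    """Dose times of a Markov-modulated Poisson process on [0, T].

    A two-state continuous-time Markov chain (state 0 = "compliant",
    state 1 = "non-compliant") switches at rate q; while in state i,
    doses occur at Poisson rate rates[i] (assumed > 0 here).
    """
    rng = random.Random(seed)
    t, state = 0.0, 0
    times = []
    while t < T:
        # competing exponential clocks: next dose vs. next regime switch
        t_dose = rng.expovariate(rates[state])
        t_switch = rng.expovariate(q)
        if t_dose < t_switch:
            t += t_dose
            if t < T:
                times.append(t)
        else:
            t += t_switch
            state = 1 - state
    return times

# intake is frequent while compliant, rare while non-compliant
doses = mmpp_dosing_times(T=50.0, rates=(1.0, 0.2), q=0.1, seed=1)
```

Feeding such dose times into the decay/jump dynamics of the concentration gives sample paths of the PDMP described below.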

**The model setting**

Inspired by the PDMP models in the literature, we consider a stochastic drug dosing regimen defined as follows.

Let us consider

The patient takes a dose

The time dose

We consider that these doses translate into immediate increases of the concentration by the value

We define

The process

with

**The characteristic function of the concentration**

The characteristic function

**Variability of the concentration**

where

**The distribution of limit concentration**

The characteristic function

Thus, the random variables

**Variability of the limit concentration**

We denote by

**Regularity of the limit concentration**

The characteristic function

where

This result will allow us to describe in detail aspects of the limit distribution that are important for assessing the efficacy of therapy.

*In collaboration with Dassault Aviation, EADS, EDF.*

The preliminary design of complex systems can be described as an exploration process of a so-called design space, generated by the global parameters. An interactive exploration, with a decisional visualization goal, needs reduced-order models of the involved physical phenomena. We are convinced that the local regularity of phenomena is a relevant quantity to drive these approximated models. Roughly speaking, in order to be representative, a model needs more information where the fluctuations are most important (and consequently, where irregularity is highest).
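A toy sketch of this idea: refine a sampling grid only where the signal fluctuates the most. The `adaptive_grid` helper and its threshold rule are hypothetical illustrations, not the project's actual method.

```python
def adaptive_grid(f, a, b, n_coarse=16, threshold=0.2):
    """One refinement pass of a coarse grid on [a, b].

    Midpoints are inserted wherever the increment |f(x_{i+1}) - f(x_i)|
    exceeds `threshold`: where the signal fluctuates (is less regular),
    the model gets more sample points.  A toy illustration of
    regularity-driven refinement.
    """
    xs = [a + (b - a) * k / n_coarse for k in range(n_coarse + 1)]
    refined = [xs[0]]
    for x0, x1 in zip(xs, xs[1:]):
        if abs(f(x1) - f(x0)) > threshold:
            refined.append((x0 + x1) / 2.0)   # refine the irregular region
        refined.append(x1)
    return refined

# a signal that is smooth away from 0.5 and rough near it
grid = adaptive_grid(lambda x: abs(x - 0.5) ** 0.3, 0.0, 1.0)
```

Iterating such passes concentrates the degrees of freedom of a reduced-order model where the local regularity is lowest.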

In collaboration with Dassault Aviation, EDF and EADS, we study how local regularity can provide a good quantification of the concept of
*granularity* of a model, in order to select the level of fidelity adapted to a required precision.

Our work in this field falls into two parts:

The definition and study of stochastic partial differential equations driven by processes with prescribed regularity (which do not fall within the classical theory of stochastic integration).

The study of the evolution of the local regularity of solutions of stochastic partial differential equations (SPDE). Stochastic 2-microlocal analysis should provide information about the local regularity of the solutions as a function of the coefficients of the equations. Knowledge of the fine behaviour of the solution of an SPDE will provide important information with a view to numerical simulation.

Academic and industrial collaborations are supported by CSDL (Complex Systems Design Lab) project of the Pôle de Compétitivité SYSTEM@TIC PARIS-REGION (11/2009-10/2012). Among the involved industrial partners, we can mention Dassault Aviation, EADS, EDF, MBDA and Renault. The goal of the project is the development of a scientific platform of decisional visualization for preliminary design of complex systems.

The Regularity team collaborates with Supelec (Hana Baili) and with the Department of Mathematics at the University of Nantes (Anne Philippe) within the framework of the DIGITEO ANIFRAC project.

Regularity participates in the CSDL project of the Pôle de Compétitivité SYSTEM@TIC PARIS-REGION. The academic partners involved are ECP, Ecole des Mines de Paris, ENS Cachan, INRIA, Supelec.

The Regularity team collaborates with Bar Ilan University on theoretical developments around the set-indexed fractional Brownian motion and set-indexed Lévy processes (invitations of Erick Herbin to Israel for five months in total in 2006, 2007, 2008, 2009 and 2011, and invitations of Prof. Ely Merzbach to Ecole Centrale Paris in 2008, 2009, 2010 and 2011). The PhD thesis of Alexandre Richard is supervised jointly by Erick Herbin and Ely Merzbach.

The Regularity team collaborates with Michigan State University (Prof. Yimin Xiao) on the study of fine regularity of multiparameter fractional Brownian motion (invitation of Erick Herbin at East Lansing in 2010).

The Regularity team collaborates with St Andrews University (Prof. Kenneth Falconer) on the study of multistable processes.

The Regularity team collaborates with Acadia University (Prof. Franklin Mendivil) on the study of fractal strings.

Ely Merzbach, from Bar Ilan University (Israel), visited the team for one month. Franklin Mendivil, from Acadia University (Canada), visited the team for one month.

Paul Balança attended the conference
*Journées de Probabilités 2011* in Nancy and gave a presentation on 2-microlocal analysis, mainly focused on results from
.

Alexandre Richard attended the conference
*Journées de Probabilités 2011* in Nancy and gave a presentation on Hölder regularity for set-indexed processes, mainly focused on results from
.

Joachim Lebovits was invited to give a lecture at the mathematics department of the University of Vienna (Austria). He gave a presentation at the 35th Conference on Stochastic Processes and their Applications in Oaxaca (Mexico).

Jacques Lévy Véhel gave an invited lecture at EPFL (Switzerland).

Erick Herbin was invited to the Israel Mathematical Union 2011 Annual Meeting (Bar-Ilan University, Israel). Talk: "Some recent advances on stochastic 2-microlocal analysis for stochastic processes".

Erick Herbin was invited to the Geometric Functional Analysis & Probability Seminar (Weizmann Institute of Science, Israel) in July, 2011. Talk: "Several characterisations of the set-indexed Lévy processes".

Erick Herbin is member of the IMdR Work Group "Uncertainty and industry".

Erick Herbin is member of the CNRS Research Group GDR Mascot Num, devoted to stochastic analysis methods for codes and numerical treatment.

Erick Herbin is reviewer for Mathematical Reviews (AMS).

Jacques Lévy Véhel is associate editor of the journal Fractals.

Erick Herbin is Director of the Mathematics Department at Ecole Centrale Paris.

Erick Herbin is in charge of the Probability Course at Ecole Centrale Paris (20h).

Erick Herbin is in charge of the Random Modeling Course at Ecole Centrale Paris (30h).

Erick Herbin and Jacques Lévy Véhel are in charge of the Brownian Motion and Stochastic Calculus Course at Ecole Centrale Paris (30h).

Jacques Lévy Véhel gives a course on wavelets and fractals at Ecole Centrale Nantes (8h).

Erick Herbin gives travaux dirigés on Real and Complex Analysis at Ecole Centrale Paris (10h).

Erick Herbin is in charge of the Numerical Simulation Program in the Applied Mathematics option of Ecole Centrale Paris.

Erick Herbin supervises several students' research projects in the field of Mathematics at Ecole Centrale Paris.

Paul Balança gives travaux dirigés on Probability (L3) at Ecole Centrale Paris (9h).

Paul Balança gives travaux dirigés on Real and Complex Analysis (L3) at Ecole Centrale Paris (9h).

Paul Balança gives travaux dirigés on Random Modeling (M1) at Ecole Centrale Paris (20h).

Joachim Lebovits gives travaux dirigés on Real and Complex Analysis (L3) at Ecole Centrale Paris (9h).

Joachim Lebovits gives travaux dirigés on Probability (L3) at Ecole Centrale Paris (9h).

Joachim Lebovits gives travaux dirigés on financial mathematics (M1) at Ecole Centrale Paris (15h).

Joachim Lebovits gives travaux dirigés on stochastic calculus (M2) at Ecole Centrale Paris (15h).

Joachim Lebovits supervises students' research projects on financial mathematics at Ecole Centrale Paris.

Alexandre Richard gives travaux dirigés on Probability (L3) at Ecole Centrale Paris (9h).

Alexandre Richard gives travaux dirigés on Statistics (L3) at Ecole Centrale Paris (9h).

Alexandre Richard gives travaux dirigés on Random Modeling (M1) at Ecole Centrale Paris (20h).

Alexandre Richard supervises students' research projects on probability at Ecole Centrale Paris (approx. 10h).

Alexandre Richard supervises students' research projects on economic modelling of the cost and efficiency of a hip resurfacing technique at Ecole Centrale Paris (approx. 15h).

Benjamin Arras gives travaux dirigés on Probability (L3) at Ecole Centrale Paris (9h).

Benjamin Arras gives travaux dirigés on Real and Complex Analysis (L3) at Ecole Centrale Paris (9h).

Benjamin Arras gives travaux dirigés on stochastic calculus (M2) at Ecole Centrale Paris (15h).