Many phenomena of interest are analyzed and controlled
through graphs or n-dimensional images. Often, these graphs have
an *irregular aspect*, whether the studied phenomenon is of natural
or artificial origin. In the first class, one may cite
natural landscapes, most biological signals and images (EEG, ECG, MR images, ...),
and temperature records. In the second class, prominent examples include financial logs and TCP traces.

Such irregular phenomena are usually not adequately described by purely deterministic models, and a probabilistic ingredient is often added. Stochastic processes make it possible to take into account, on a firm theoretical basis, the numerous microscopic fluctuations that shape the phenomenon.

In general, it is wrong to view
irregularity as an epiphenomenon that is
conveniently dealt with by introducing randomness. In many situations, and
in particular in some of the examples
mentioned above, irregularity is a core
ingredient that cannot be removed without destroying the
phenomenon itself. In some cases, irregularity is even a
necessary condition for proper functioning.
A striking example is that of ECG: an ECG is inherently irregular, and, moreover, in a mathematically precise
sense, an *increase* in its regularity is strongly correlated with a *degradation* of the heart's condition.

In fact, in various situations, irregularity is a crucial feature that can be used
to assess the behaviour of a given system. For instance,
irregularity may be the result of two or more sub-systems that
act in a concurrent way to achieve some kind of equilibrium.
Examples of this abound in nature
(*e.g.* the sympathetic and parasympathetic systems in the regulation of the heart). For artificial systems, such as financial logs and TCP traffic, irregularity is in a sense
an unwanted feature, since it typically makes regulation more complex. It is
again, however, a necessary one. For instance, efficiency in financial markets requires a constant flow of information among agents, which manifests itself
through permanent fluctuations of the prices: irregularity just reflects the evolution of this information.

The aim of *Regularity* is to develop a coherent set of methods for modeling such “essentially
irregular” phenomena, with a view to managing the uncertainties entailed by their irregularity.

Indeed, essential irregularity makes phenomena more difficult to study in terms of their description,
modeling, prediction and control. It introduces *uncertainties* both in
the measurements and the dynamics. It is, for instance, obviously easier to predict the short-time
behaviour of a smooth (*e.g.* continuously differentiable) signal than that of a highly irregular one.

J. Lévy Véhel was a finalist at the 2013 Humies competition in Amsterdam.

The modeling of essentially irregular phenomena is an important challenge, with an emphasis on understanding the sources and functions of this irregularity. Probabilistic tools are well-adapted to this task, provided one can design stochastic models for which the regularity can be measured and controlled precisely. Two points deserve special attention:

First, the study of regularity has to be *local*. Indeed, in most applications, one will want to act on a system based on local temporal or spatial information. For instance, detection of arrhythmias in ECG
or of crashes in financial markets should be performed in “real time”, or, even better, ahead of time. In this sense, regularity is a *local* indicator of the *local* health of a system.

Second, although we have used the term “irregularity” in a generic and somewhat vague sense, it seems obvious that, in real-world phenomena, regularity comes in many colors, and a rigorous analysis should distinguish between them. As an example, at least two kinds of irregularities are present in financial logs: the local “roughness” of the records, and the local density and height of jumps. These correspond to two different concepts of regularity (in technical terms, Hölder exponents and local index of stability), and they both contribute in a different manner to financial risk.

In view of the above, the *Regularity* team focuses on the design of methods that:

define and study precisely various relevant measures of local regularity,

make it possible to build stochastic models versatile enough to mimic the rapid variations of the different kinds of regularity observed in real phenomena,

make it possible to estimate these regularities as precisely and rapidly as possible, so as to alert systems in charge of control.

Our aim is to address the three items above through the design of mathematical tools in the field of probability (and, to a lesser extent, statistics), and to apply these tools to uncertainty management as described in the following section. We note here that we do not intend to address the problem of controlling the phenomena based on regularity, which would naturally constitute a fourth item in the list above. Indeed, while we strongly believe that generic tools may be designed to measure and model regularity, and that these tools may be used to analyze real-world applications, in particular in the field of uncertainty management, it is clear that, when it comes to control, application-specific tools are required, which we do not wish to develop.

The research topics of the *Regularity* team can be roughly divided into two strongly interacting axes, corresponding to two complementary ways of studying regularity:

development of tools to characterize, measure and estimate various notions of local regularity, with a particular emphasis on the stochastic frame,

definition and fine analysis of stochastic models for which some aspects of local regularity may be prescribed.

These two aspects are detailed in sections and below.

**Fractional Dimensions**

Although the main focus of our team is on characterizing *local*
regularity, on occasions, it is interesting to use a *global*
index of regularity. Fractional dimensions provide such an index.
In particular, the *regularization dimension*, which was defined
in , is well adapted to the study of stochastic processes, as its
definition makes it easy to build robust estimators.
Since its introduction, regularization dimension has been used by various teams
worldwide in many different applications including the characterization of certain stochastic
processes, statistical estimation,
the study of mammograms or galactograms for breast
carcinoma detection,
ECG analysis for the study of ventricular arrhythmia,
encephalitis diagnosis from EEG, human skin analysis,
discrimination between the nature of radioactive contaminations,
analysis of porous media textures,
well-logs data analysis,
agro-alimentary image analysis, road profile analysis, remote sensing,
mechanical systems assessment, analysis of video games, ...(see http://
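
As a rough illustration of how such an estimator can be built, the following sketch (in Python rather than FracLab's MatLab; the test signal, scale choices and boundary trimming are purely illustrative) smooths the graph at several scales, measures the length of each regularized version, and reads a dimension off a log-log regression:

```python
import numpy as np

def reg_dim(y, scales):
    """Estimate the regularization dimension of a 1-D sampled signal:
    smooth the graph at several scales, measure the length of each
    regularized graph, and regress log-length against log-scale."""
    n = len(y)
    x = np.linspace(0.0, 1.0, n)
    lengths = []
    for s in scales:
        half = int(4 * s) + 1
        t = np.arange(-half, half + 1)
        k = np.exp(-t ** 2 / (2.0 * s ** 2))
        k /= k.sum()                      # unit-mass Gaussian kernel, std s
        ys = np.convolve(y, k, mode="same")
        core = slice(200, n - 200)        # trim zero-padding edge effects
        lengths.append(np.hypot(np.diff(x[core]), np.diff(ys[core])).sum())
    # L(s) ~ s^{-beta} as s -> 0, and dim_R = 1 + beta
    slope, _ = np.polyfit(np.log(scales), np.log(lengths), 1)
    return 1.0 - slope

rng = np.random.default_rng(0)
bm = np.cumsum(rng.standard_normal(4096)) / 64.0  # Brownian path, dim_R = 3/2
d = reg_dim(bm, scales=[2, 4, 8, 16, 32])
```

On a Brownian-like path the estimate should come out close to the theoretical value 3/2; a robust implementation would choose the scales adaptively and fit only the linear part of the log-log plot.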

**Hölder exponents**

The simplest and most popular measures of local
regularity are the pointwise
and local Hölder exponents. For a stochastic process $X$, the pointwise Hölder exponent at $t$ may be defined as

$$\alpha_X(t, \omega) = \sup\left\{\alpha : \limsup_{h \to 0} \frac{|X(t+h, \omega) - X(t, \omega)|}{|h|^{\alpha}} = 0\right\},$$

and the local Hölder exponent as

$$\tilde{\alpha}_X(t, \omega) = \lim_{\rho \to 0}\, \sup\left\{\alpha : \sup_{s, u \in B(t, \rho)} \frac{|X(s, \omega) - X(u, \omega)|}{|s - u|^{\alpha}} < \infty\right\}.$$

Although these quantities are in general random, we will omit, as is customary,
the dependency in $\omega$.

The random functions $t \mapsto \alpha_X(t)$ and $t \mapsto \tilde{\alpha}_X(t)$ are called the pointwise and local Hölder functions of $X$.

The pointwise Hölder exponent is a very versatile
tool, in the sense that the set of pointwise Hölder functions of
continuous functions is quite large (it coincides with the set of
lower limits of sequences of continuous functions ). In this sense,
the pointwise exponent is often a more precise tool
(*i.e.* it varies in a more rapid way)
than the local one, since local Hölder functions are always lower semi-continuous.
This is why, in particular, it is
the exponent that is used as a basic ingredient in multifractal
analysis (see section ). For certain classes of stochastic
processes, and most notably Gaussian processes, it has the remarkable
property that, at each point, it assumes an almost sure value .
SRP, mBm, and processes of this kind (see sections and
) rely on the sole use
of the pointwise Hölder exponent for prescribing the regularity.

However,

Another, related, drawback of the pointwise exponent is that it is
not stable under integro-differentiation, which sometimes makes
its use complicated in applications. Again, the local exponent provides
here a useful complement to the pointwise one.

Both exponents have proved useful in various applications, ranging from image denoising and segmentation to TCP traffic characterization. Applications require precise estimation of these exponents.
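
A minimal oscillation-based estimator of the pointwise exponent can be sketched as follows (Python, with illustrative radii and test signal; a production estimator would add regression diagnostics or wavelet-based variants):

```python
import numpy as np

def pointwise_holder(y, i, radii):
    """Oscillation-based estimate of the pointwise Hölder exponent at index i:
    osc_r = max - min of y over a window of radius r around i, and the
    exponent is the slope of log osc_r versus log r as r -> 0."""
    oscs = [y[max(0, i - r): i + r + 1].max() - y[max(0, i - r): i + r + 1].min()
            for r in radii]
    slope, _ = np.polyfit(np.log(radii), np.log(oscs), 1)
    return slope

rng = np.random.default_rng(1)
n = 32768
bm = np.cumsum(rng.standard_normal(n))    # Brownian path: exponent 1/2 a.s.
radii = [8, 16, 32, 64, 128, 256]
est = np.median([pointwise_holder(bm, i, radii)
                 for i in range(2000, n - 2000, 2000)])
```

Pointwise estimates are noisy; the median over several locations, as above, should concentrate near the almost sure value 1/2 for Brownian motion.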

**Stochastic 2-microlocal analysis**

Neither the pointwise nor the local exponents give a complete characterization of the local regularity, and, although their joint use somewhat improves the situation, it is far from yielding the complete picture.

A fuller description of local regularity is provided by the
so-called *2-microlocal analysis*, introduced by J.M. Bony
. In this frame, regularity
at each point is now specified by two indices, which makes the analysis
and estimation tasks more difficult. More precisely,
a function *2-microlocal space*

for all *2-microlocal
spectrum*. This spectrum provides a wealth of information on the local
regularity.

In , we have laid some foundations for a stochastic version of 2-microlocal analysis. We believe this will provide a fine analysis of the local regularity of random processes in a direction different from the one detailed for instance in . We have defined random versions of the 2-microlocal spaces, and given almost sure conditions for continuous processes to belong to such spaces. More precise results have also been obtained for Gaussian processes. A preliminary investigation of the 2-microlocal behaviour of Wiener integrals has been performed.

**Multifractal analysis of stochastic processes**

A direct use of the local regularity is often fruitful in applications.
This is for instance the case in RR analysis or terrain
modeling. However, in some situations,
it is interesting to supplement or replace it by a more global
approach known as *multifractal analysis* (MA). The idea behind
MA is to group together all points with same regularity (as measured
by the pointwise Hölder exponent) and to measure the “size” of
the sets thus obtained , , . There are mainly two ways to do so, a geometrical
and a statistical one.

In the geometrical approach, one defines the
*Hausdorff multifractal spectrum* of a process or function

The statistical path to MA is based on the so-called
*large deviation multifractal spectrum*:

where:

and *i.e.*:

Here,

The large deviation spectrum is typically easier to compute and to estimate than the Hausdorff one. In addition, it often gives more relevant information in applications.
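
A crude numerical version of the large deviation spectrum, based on coarse-grained exponents computed on dyadic boxes, might look as follows (a sketch with illustrative box and bin counts, not FracLab's estimator):

```python
import numpy as np

def large_deviation_spectrum(y, n_boxes, n_bins=15):
    """Crude large-deviation spectrum estimate: compute a coarse-grained
    exponent log(osc) / log(1/n_boxes) on each box of the time-normalized
    signal, then set f(alpha) ~ log N(alpha) / log n_boxes, where N(alpha)
    counts the boxes carrying exponents near alpha."""
    n = len(y)
    box = n // n_boxes
    oscs = np.array([y[k * box:(k + 1) * box + 1].max()
                     - y[k * box:(k + 1) * box + 1].min()
                     for k in range(n_boxes)])
    alphas = np.log(np.maximum(oscs, 1e-15)) / np.log(1.0 / n_boxes)
    counts, edges = np.histogram(alphas, bins=n_bins)
    keep = counts > 0
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers[keep], np.log(counts[keep]) / np.log(n_boxes)

rng = np.random.default_rng(2)
n = 1 << 16
bm = np.cumsum(rng.standard_normal(n)) / np.sqrt(n)  # time-normalized Brownian
a, f = large_deviation_spectrum(bm, n_boxes=256)
peak = a[np.argmax(f)]
```

For a monofractal signal such as Brownian motion, the estimated spectrum should concentrate around a single exponent (here 1/2), with a peak value of f close to 1.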

Under very mild conditions (for instance, if
the support of *Legendre multifractal spectrum*. To do so,
one basically interprets the spectrum

with the convention

The Legendre multifractal spectrum of

To see the relation between

where *weak multifractal
formalism* holds, *i.e.* *strong
multifractal formalism*.

Multifractal spectra subsume a lot of information about the distribution of the regularity, that has proved useful in various situations. A most notable example is the strong correlation reported recently in several works between the narrowing of the multifractal spectrum of ECG and certain pathologies of the heart , . Let us also mention the multifractality of TCP traffic, that has been both observed experimentally and proved on simplified models of TCP , .

**Another colour in local regularity: jumps**

As noted above, apart from Hölder exponents and their generalizations,
at least one other type of irregularity may sometimes be observed in
certain real phenomena: discontinuities, which occur for instance
on financial logs and certain biomedical signals. In this frame, it is of
interest to supplement Hölder exponents and their extensions with (at least) an additional
index that measures the local intensity and size of jumps. This is a topic we
intend to pursue in full generality in the near future. So far, we have developed an approach
in the particular frame of *multistable processes*. We refer to section
for more details.

The second axis in the theoretical developments of the *Regularity* team aims at defining and studying stochastic processes for which various aspects of the local regularity may be prescribed.

**Multifractional Brownian motion**

One of the simplest stochastic processes for which some kind of control over the Hölder exponents is possible is probably fractional Brownian motion (fBm). This process was defined by Kolmogorov and further studied by Mandelbrot and Van Ness, followed by many authors. The so-called “moving average” definition of fBm reads as follows:

where

Although varying

It is possible to generalize fBm to obtain a Gaussian process for which the pointwise Hölder exponent
may be tuned at each point: the *multifractional Brownian motion (mBm)* is such
an extension, obtained by substituting the constant parameter *regularity function*

mBm was introduced independently by two groups of authors:
on the one hand, Peltier and Lévy Véhel defined the mBm

On the other hand, Benassi, Jaffard and Roux defined the mBm from the harmonizable representation of the
fBm, *i.e.*:

where

The Hölder exponents of the mBm are prescribed almost surely:
the pointwise Hölder exponent is

The fact that the local regularity of mBm
may be tuned *via* a functional parameter has made it a useful
model in various areas such as finance, biomedicine,
geophysics, image analysis, ....
A large number of studies have been devoted worldwide to its mathematical properties,
including in particular its local time. In addition,
there is now a rather strong body of work dealing with the estimation of its
functional parameter, *i.e.* its local regularity. See http://
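
For completeness, here is a hedged sketch of exact fBm synthesis by Cholesky factorization of the covariance (function names and parameter choices are ours; mBm can then be approximated by letting the Hurst parameter vary slowly along the path):

```python
import numpy as np

def fbm_cholesky(n, H, rng):
    """Exact sampling of fBm at n equispaced points of (0, 1] via a Cholesky
    factorization of the covariance
    E[B_H(s) B_H(t)] = (s^{2H} + t^{2H} - |t - s|^{2H}) / 2.
    O(n^3) cost: fine for short traces; circulant embedding scales better."""
    t = np.arange(1, n + 1) / n
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s ** (2 * H) + u ** (2 * H) - np.abs(u - s) ** (2 * H))
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))  # jitter for stability
    return L @ rng.standard_normal(n)

path = fbm_cholesky(256, H=0.7, rng=np.random.default_rng(3))

# Empirical sanity check: Var(B_H(1)) = 1 whatever the value of H
ends = np.array([fbm_cholesky(64, 0.7, np.random.default_rng(k))[-1]
                 for k in range(400)])
```

The final empirical check simply verifies that the simulated marginal at time 1 has unit variance, as the covariance prescribes.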

**Self-regulating processes**

We have recently introduced another class of stochastic models, inspired by mBm,
but where the local regularity, instead of being tuned “exogenously”, is
a function of the amplitude. In other words, at each point *self-regulating process* (SRP).
The particular process obtained by adequately adapting mBm is called
the self-regulating multifractional process . Another instance is given by
modifying the Lévy construction of Brownian motion .
The motivation for introducing self-regulating processes is based on the following general fact: in nature, the local regularity of a phenomenon is often related to its amplitude.
An intuitive example is provided by natural terrains: in young mountains, regions
at higher altitudes are typically more irregular than regions at lower altitudes.
We have verified this fact experimentally on several digital elevation models
. Other natural phenomena displaying a relation between
amplitude and exponent include temperatures
records and RR intervals extracted from ECG .

To build the SRMP, one starts from a field of fractional Brownian motions

the affine rescaling between

where

An example of a two dimensional SRMP with function

We believe that SRPs open up a whole new and very promising area of research.

**Multistable processes**

Non-continuous phenomena are commonly encountered in real-world
applications, *e.g.* financial records or EEG traces.
For such processes, the information brought
by the Hölder exponent must be supplemented by some measure of
the density and size of jumps. Stochastic processes with jumps,
and in particular Lévy processes, are currently an active area of research.

The simplest class of non-continuous Lévy processes is perhaps that
of stable processes . These are mainly characterized by a parameter
*stability index* (

In line with our quest for the characterization and modeling of
various notions of local regularity, we have defined *multistable processes*.
These are processes which are
“locally” stable, but where
the stability index

More formally, a multistable process is a process which is,
at each time

where the limit is understood either in finite dimensional
distributions or in the stronger sense of distributions.
Note

One approach to defining multistable processes is similar to the one
developed for constructing mBm : we consider fields of stochastic processes

A particular class of multistable processes, termed
“linear multistable multifractional
motions” (lmmm) takes the following form , .
Let

where

In fact, lmmm are somewhat more general than stated above:
indeed, the couple
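
The following toy sketch illustrates the idea only: it sums independent symmetric stable increments whose index alpha(t) drifts along the path, which caricatures, without reproducing, the field-based construction described above (all names and parameter values are illustrative):

```python
import numpy as np

def sym_stable(alpha, size, rng):
    """Symmetric alpha-stable samples via the Chambers-Mallows-Stuck method
    (valid for alpha in (0, 2], alpha != 1)."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)
    w = rng.exponential(1.0, size)
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))

def toy_multistable(alpha_fn, n, seed=0):
    """Caricature of a multistable path: independent stable increments whose
    stability index alpha(t) drifts along the path, so jumps get lighter or
    heavier tailed as t evolves."""
    rng = np.random.default_rng(seed)
    inc = np.concatenate([sym_stable(alpha_fn(k / n), 1, rng)
                          for k in range(n)])
    return np.cumsum(inc)

path = toy_multistable(lambda t: 1.2 + 0.7 * t, 512, seed=4)

# Sanity check: for alpha = 2 the CMS formula returns N(0, 2) samples
gauss_std = sym_stable(2.0, 20000, np.random.default_rng(5)).std()
```

With the chosen alpha function, large jumps should be visibly more frequent near the beginning of the path (small alpha) than near its end.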

**Multiparameter processes**

In order to use stochastic processes to represent the variability of multidimensional phenomena, it is necessary to define extensions for indices in

These works have highlighted the difficulty of giving satisfactory definitions for increment stationarity, Hölder continuity and covariance structure which are not closely dependent on the structure of

A promising improvement in the definition of multiparameter extensions is the concept of *set-indexed processes*. A set-indexed process is a process whose indices are no longer “times” or “locations” but may be some compact connected subsets of a metric measure space. In the simplest case, this framework is a generalization of the classical multiparameter processes : usual multiparameter processes are set-indexed processes where the indexing subsets are simply the rectangles

Set-indexed processes allow for greater flexibility, and should in particular be useful for the modeling of censored data. This situation occurs frequently in biology and medicine, since, for instance, data may not be constantly monitored. Censored data also appear in natural terrain modeling when data are acquired from sensors in presence of hidden areas. In these contexts, set-indexed models should constitute a relevant frame.

A set-indexed extension of fBm is the first step toward the modeling of
irregular phenomena within this more general frame. In , the so-called *set-indexed fractional Brownian motion (sifBm)* was defined as the mean-zero Gaussian process

where

This process appears to be the only set-indexed process whose projection on increasing paths is a one-parameter fractional Brownian motion .
The construction also provides a way to define extensions of fBm on non-Euclidean spaces, *e.g.* indices can belong to the unit hyper-sphere of

In the specific case of the indexing collection *multiparameter fractional Brownian motion (MpfBm)*. This process differs from the Lévy fractional Brownian motion and the fractional Brownian sheet, which are also multiparameter extensions of fBm (but do not derive from set-indexed processes).
The local behaviour of the sample paths of the MpfBm has been studied in . The self-similarity index

The increment stationarity property for set-indexed processes, previously defined in the study of the sifBm, allows one to consider set-indexed processes whose increments are independent and stationary. This generalizes the definition of Bass-Pyke and Adler-Feigin for Lévy processes indexed by subsets of

Our theoretical works are motivated by and find natural applications to real-world problems in a general frame generally referred to as uncertainty management, that we describe now.

Over the past few decades, modeling has played an increasing role in complex systems design in various fields of industry such as automobile, aeronautics, energy, etc. Industrial design involves several levels of modeling: from behavioural models in preliminary design to finite-element models aiming to represent physical phenomena accurately. Nowadays, the fundamental challenge of numerical simulation is to design physical systems while saving on experimentation steps.

As an example, at the early stage of conception in aeronautics, numerical simulation aims at exploring the design parameters space and setting the global variables such that target performances are satisfied. This iterative procedure needs fast multiphysical models. These simplified models are usually calibrated using high-fidelity models or experiments. At each of these levels, modeling requires control of uncertainties due to simplifications of models, numerical errors, data imprecisions, variability of surrounding conditions, etc.

One dilemma of design by numerical simulation is that many crucial choices are made very early, when uncertainties are at their maximum, even though these choices have a fundamental impact on the final performances.

Classically, coping with this variability is achieved through *model registration* by experimenting and adding fixed *margins* to the model response.
In view of technical and economic performance, it appears judicious to replace these fixed margins by a rigorous analysis and control of risk. This may be achieved through a probabilistic approach to uncertainties, which provides decision criteria adapted to the management
of unpredictability inherent to design issues.

From the particular case of aircraft design emerge several general aspects of the management of uncertainties in simulation. Probabilistic decision criteria, which translate decision making into mathematical/probabilistic terms, require the following three steps to be considered:

build a probabilistic description of the fluctuations of the model's parameters (*Quantification* of uncertainty sources),

deduce the implication of these distribution laws on the model's response (*Propagation* of uncertainties),

and determine the specific influence of each uncertainty source on the model's response variability (*Sensitivity Analysis*).
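
These three steps can be sketched numerically on a toy model (the model, input distributions, and sample sizes below are purely illustrative; the sensitivity step uses the standard pick-freeze estimator of first-order Sobol indices):

```python
import numpy as np

# Hypothetical toy simulator: response y = f(x1, x2). Both the model and the
# input distributions are illustrative, not taken from any real design study.
def model(x1, x2):
    return x1 ** 2 + 0.1 * x2

def sobol_first(f, a, b, i):
    """First-order Sobol index of input i by the pick-freeze estimator:
    S_i = Cov(f(A), f(B with column i taken from A)) / Var(f(A))."""
    ya = f(*a)
    mixed = list(b)
    mixed[i] = a[i]
    return np.cov(ya, f(*mixed))[0, 1] / ya.var(ddof=1)

rng = np.random.default_rng(6)
N = 20000
draw = lambda: (rng.normal(1.0, 0.2, N), rng.normal(0.0, 1.0, N))  # 1. quantification
A, B = draw(), draw()

y = model(*A)                        # 2. propagation through the model
mean_y, var_y = y.mean(), y.var()

s1 = sobol_first(model, A, B, 0)     # 3. sensitivity analysis
s2 = sobol_first(model, A, B, 1)
```

For this additive-plus-quadratic toy model the first input dominates the output variance, which the estimated indices should reflect (s1 near 0.94, s2 near 0.06).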

The previous analysis now constitutes the framework of a general study of uncertainties. It is used in industrial contexts where uncertainties can be represented by *random variables* (unknown temperature of an external surface, physical quantities of a given material, ... at a given *fixed time*). However, in order for the numerical models to describe with high fidelity a phenomenon, the relevant uncertainties must generally depend on time or space variables.
Consequently, one has to tackle the following issues:

*How to capture the distribution law of time (or space) dependent parameters,
without directly accessible data?*
The probability distribution of the continuous-time (or space) uncertainty sources must describe the links between variations at neighbouring times (or points).
The local and global regularity are important parameters of these laws, since they describe how the fluctuations at some time (or point) induce fluctuations at close times (or points).
The continuous equations representing the studied phenomena should help *to propose models for the law of the random fields*.
Let us notice that interactions between various levels of modeling might also be used to derive distributions of probability at the lowest one.

The navigation between the various natures of models needs a kind of *metric* which could *mathematically describe the notion of granularity or fineness* of the models.
Of course, the local regularity will not be totally absent from this mathematical definition.

All the various levels of conception, preliminary design or high-fidelity modelling, require *registrations by experimentation* to reduce model errors.
This *calibration* issue has been present in this frame for a long time, especially in a deterministic optimization context. The random modeling of uncertainty requires the definition of a systematic approach.
The difficulty in this specific context is twofold: statistical estimation with few data, and estimation of a function of continuous variables using only a discrete set of values.

Moreover, a multi-physical context must be added to these questions. Complex system design is most often located at the interface between several disciplines. In that case, modeling relies on a coupling between several models for the various phenomena, and design becomes a *multidisciplinary optimization* problem. In this uncertainty context, the real challenge becomes robust optimization, in order to manage technical and economic risks (risk of non-satisfaction of technical specifications, cost control).

We participate in the uncertainties community through several collaborative research projects. As explained above, we focus on essentially irregular phenomena, for which irregularity is a relevant quantity to capture the variability (e.g. certain biomedical signals, terrain modeling, financial data, etc.). These will be modeled through stochastic processes with prescribed regularity.

**ECG analysis and modelling**

ECG and signals derived from them are an important source of
information in the detection of various pathologies, including *e.g.* congestive heart failure, arrhythmia and sleep apnea. The fact that the
irregularity of ECG bears some information on the condition of the heart
is well documented (see *e.g.* the web resource http://

First, we use refined regularity characterizations, such as the regularization dimension, 2-microlocal analysis and advanced multifractal spectra for a more precise analysis of ECG data. This requires in particular to test current estimation procedures and to develop new ones.

Second, we build stochastic processes that mimic in a faithful way some features of the dynamics of ECG. For instance, the local regularity of RR intervals, estimated in a parametric way based on an mBm model, displays correlations with the amplitude of the signal, a feature that seems to have remained unobserved so far . In other words, RR intervals behave as an SRP. We believe that modeling in a simplified way some aspects of the interplay between the sympathetic and parasympathetic systems might lead to an SRP, and explain both this self-regulating property and the reasons behind the observed multifractality of the records. This will open the way to understanding how these properties evolve under abnormal behaviour.

**Pharmacodynamics and patient drug compliance**

Poor adherence to treatment is a worldwide problem that threatens
efficacy of therapy, particularly in the case of chronic
diseases. Compliance to pharmacotherapy can range from *i.e.*, drugs are administered at a fixed
dosage. However, the drug concentration-time curve is often influenced
by the random drug input generated by the patient's poor adherence behaviour,
inducing erratic therapeutic outcomes. Following work already
started in Montréal , , we consider stochastic processes induced by
taking into account the random drug intake induced by various
compliance patterns. Such studies have been made possible by
technological progress, such as the “medication event monitoring
system”, which allows to obtain data describing the behaviour of
patients.

We use different approaches to study this problem: statistical methods where enough data are available, and model-based ones in the presence of a qualitative description of the patient's behaviour. In this latter case, piecewise deterministic Markov processes (PDP) seem a promising path. PDP are non-diffusion processes whose evolution follows a deterministic trajectory governed by a flow between random time instants, at which it undergoes a jump according to some probability measure . There is a well-developed theory for PDP, which studies stochastic properties such as the extended generator, the Dynkin formula, and long-time behaviour. It is easy to cast a simplified model of non-compliance in terms of PDP. This has already allowed us to obtain certain properties of interest of the random concentration of drug . In the simplest case of a Poisson distribution, we have obtained rather precise results that also point to a surprising connection with infinite Bernoulli convolutions , , . Statistical aspects remain to be investigated in the general case.
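
A minimal simulation of such a PDP might look as follows (a sketch under illustrative parameter values, not the model of the cited works): between intakes the concentration decays deterministically, and intakes occur at the jump times of a Poisson process.

```python
import numpy as np

def pdp_concentration(rate, ke, dose, horizon, dt=0.01, seed=7):
    """Toy piecewise deterministic model of drug concentration under
    imperfect compliance: between intakes the concentration follows the
    deterministic flow C' = -ke * C; intake instants form a Poisson process
    of intensity `rate`, each causing a jump C -> C + dose. All names and
    values are illustrative."""
    rng = np.random.default_rng(seed)
    n = int(horizon / dt)
    c = np.empty(n)
    c[0] = 0.0
    decay = np.exp(-ke * dt)
    for k in range(1, n):
        c[k] = c[k - 1] * decay              # deterministic elimination
        if rng.random() < rate * dt:         # random intake event
            c[k] += dose
    return c

c = pdp_concentration(rate=1.0, ke=0.5, dose=1.0, horizon=400.0)
long_run_mean = c[len(c) // 2:].mean()       # approaches rate * dose / ke
```

In the long run the mean concentration settles near rate·dose/ke (here 2.0); erratic compliance shows up as large excursions of the trajectory around that level.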

FracLab was developed for two main purposes:

propose a general platform allowing research teams to avoid re-coding basic and advanced techniques for the processing of signals based on (local) regularity.

provide state of the art algorithms allowing both to disseminate new methods in this area and to compare results on a common basis.

FracLab is a general purpose signal and image processing toolbox based on fractal, multifractal and local regularity methods. FracLab can be approached from two different perspectives:

(multi-) fractal and local regularity analysis: A large number of procedures allow one to compute various quantities associated with 1D or 2D signals, such as dimensions, Hölder and 2-microlocal exponents, or multifractal spectra.

Signal/Image processing: Alternatively, one can use FracLab directly to perform many basic tasks in signal processing, including estimation, detection, denoising, modeling, segmentation, classification, and synthesis.

A graphical interface makes FracLab easy to use and intuitive. In addition, various wavelet-related tools are available in FracLab.

FracLab is free software. It mainly consists of routines
developed in MatLab or C code interfaced with MatLab.
It runs under Linux, MacOS and Windows environments. In addition,
a “stand-alone” version (*i.e.* which does not require
MatLab to run) is available.

FracLab has been downloaded several thousand times in the last years
by users all around the world. A few dozen
laboratories seem to use it regularly, with more than four hundred registered users.
Our ambition is to make it the
standard fractal software for signal and image processing
applications. We have signs that this is starting to become
the case. To date, its use has been acknowledged in roughly three hundred
research papers in various areas such as astrophysics, chemical engineering,
financial modeling, fluid dynamics, internet and road traffic analysis, image and signal processing,
geophysics, biomedical applications, computer science, as well as in mathematical studies in analysis and
statistics (see http://

Last year, we produced a major release of FracLab (version 2.1). This year, we corrected a number of bugs.

From theoretical perspectives to more concrete applications, fractional Brownian motion (fBm) is a fruitful and rich mathematical object. From its stochastic analysis, initiated during the nineties, several theories of stochastic integration have emerged. Indeed, fBm is, in general, neither a semimartingale nor a Markov process. These theories rely on different properties of the stochastic integrator process and are thus of different natures. Despite the quite large number of these strategies, we can group them into two fundamentally distinct categories: the pathwise and the probabilistic approaches. The probabilistic one requires highly evolved stochastic analysis tools. Indeed, the Malliavin calculus as well as Hida's distribution theory have been used in order to define stochastic integration with respect to fractional Brownian motion ( , ) and more general Gaussian processes ( ). Moreover, fBm belongs to an important class of stochastic processes, namely, the Hermite processes. This class appears in non-central limit theorems for processes defined as integrals or partial sums of non-linear functionals of stationary Gaussian sequences with long-range dependence (see ). They admit the following representation for all

where

**Theorem**: Let

Then, we have in

where

with

Moreover, in the same setting, we obtain the following "isometry" result for the Rosenblatt noise integral of sufficiently "good" integrand processes:

**Theorem**: Let

where

where

Finally, in the last section of , we compare our approach to the one of . More specifically, we prove that the stochastic integral with respect to the Rosenblatt process built using Malliavin calculus coincides with the Rosenblatt noise integral when both of them exist.

**Proposition**: Let

Then,

where

Under the previous hypothesis, the local regularity of the mBm at

This result has been recently improved in , observing that the pointwise exponent can even be random under some assumptions on

Therefore, the main goal of this work was to obtain a more complete characterization of the geometry of the general mBm. We have first focused on the Hölder regularity of the sample paths, using for this purpose a deterministic representation of the fractional Brownian field:

where

where

We have also been able to obtain some uniform lower bounds on the 2-microlocal frontier, which are optimal under some mild assumptions on the Hurst function.

The second direction of our study has concerned the fractal dimension of the graph of the mBm. Interestingly, and contrary to the case of fBm, we have to distinguish the box and Hausdorff dimensions in our result. The former happens to be the easier one to study and is closely related to the geometry of

where

To study the Hausdorff dimension of the graph, we need a slightly different approach, which makes use of the parabolic Hausdorff dimension. We first define a *parabolic metric* and the corresponding *parabolic Hausdorff dimension* of

Studying the local Hausdorff dimension of the graph of the mBm, we have proved that with probability one

Even though this result might seem counter-intuitive, it can be checked that it induces the classical equality

Let

Denote by

The well-known Bernstein inequality (1946) states that, for all
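In one standard formulation (recalled here for reference; notation may differ from the elided display): for independent centred random variables \(X_1,\dots,X_n\) satisfying Bernstein's condition with parameter \(\varepsilon\), i.e. \(|\mathbb{E}[X_i^k]| \le \tfrac12 k!\, \varepsilon^{k-2}\, \mathbb{E}[X_i^2]\) for all \(k \ge 3\), one has

```latex
\mathbb{P}(S_n \ge x) \;\le\; \exp\!\left(-\frac{x^2}{2\,(\sigma^2 + \varepsilon x)}\right),
\qquad x \ge 0,
```

where \(S_n = X_1 + \dots + X_n\) and \(\sigma^2 = \sum_{i=1}^n \mathbb{E}[X_i^2]\).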

In the i.i.d. case, Cramér (1938) established a large deviation expansion under the condition

where

Bahadur and Rao (1960) proved the following sharp large deviation expansion, similar to (). Assume Cramér's condition. Then, for given

where
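For reference, in the non-lattice case the classical Bahadur-Rao expansion takes the following well-known form (our notation, which may differ from the original statement):

```latex
\mathbb{P}(S_n \ge na) \;=\; \frac{e^{-n\Lambda^*(a)}}{\tau_a\, \sigma_a \sqrt{2\pi n}}\,\bigl(1 + o(1)\bigr),
```

where \(\Lambda(\tau) = \log \mathbb{E}\,e^{\tau X_1}\) is the cumulant generating function, \(\Lambda^*(a) = \sup_\tau \{\tau a - \Lambda(\tau)\}\) its Legendre transform, \(\tau_a\) is defined by \(\Lambda'(\tau_a) = a\), and \(\sigma_a^2 = \Lambda''(\tau_a)\).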

We present an improvement on Bernstein's inequality. In particular, we establish a sharp large deviation expansion similar to the classical results of Cramér and of Bahadur and Rao. The following theorem is our main result.

**Theorem 0.1** Assume Bernstein's condition. Then, for all

where

with

and

and thus

Using structures of Abstract Wiener Spaces and their reproducing kernel Hilbert spaces, we define a fractional Brownian field indexed by a product space

The family of fBms with different Hurst parameters can thus be considered as a single Gaussian process indexed by

When looking at the

An important subclass of these processes is formed by processes restricted to indicator functions of subsets, besides the inherent interest of studying processes over an abstract space.

To define this field, we used fractional operators on the Wiener space

The advantage of this approach is to allow the transfer of techniques of calculus on the Wiener space to any other linearly isometric space with the same structure (those spaces are called Abstract Wiener Spaces). Using the separability and reproducing kernel property of the Cameron-Martin spaces built from the kernels

For fixed

where

Finally, we looked at the Hölder regularity of the fBf, when the

*In collaboration with K. Falconer, University of St Andrews.*

Self-stabilizing processes are càdlàg processes whose local intensity of jumps depends on their amplitude. We have investigated two paths to define such processes. The first one is based on a modification of the celebrated Lévy construction of Brownian motion.

The second one starts from a stochastic differential equation, and allows one to build Markov processes, a useful feature in applications such as financial modelling.

*In collaboration with R. Le Guével, University of Rennes.*

As a follow-up to the work in , we have computed the Hausdorff, large deviation, and Legendre multifractal spectra of multistable Lévy motion. It turns out that the shape of the Hausdorff multifractal spectrum is much more complex than could be expected from the corresponding spectrum of plain Lévy motion. Also, the large deviation spectrum reveals more information on the fine structure of the process than the Hausdorff one, a situation reminiscent of what has already been observed for the model we developed previously for TCP traffic.

*In collaboration with A. Echelard and A. Philippe, University of Nantes.*

We have shown that various geophysical signals, and in particular temperature records, can be modelled with self-regulating processes as introduced in . For this purpose, we have used an estimator of the self-regulating function proposed in . Such a modelling allows one to gain further insight on the fine structure of the evolution of temperatures.
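As a reminder, a self-regulating process \(Z\) is, schematically, one whose pointwise Hölder exponent at each point is a deterministic function of its value at that point:

```latex
\alpha_Z(t) \;=\; g\bigl(Z(t)\bigr) \quad \text{almost surely},
```

where \(g\) is the self-regulating function; the estimation task mentioned above is that of recovering \(g\) from a sample path. (This schematic formulation is ours; see the cited works for the precise definition.)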

*In collaboration with A. Echelard.*

We have proposed a new wavelet-based method for signal denoising that allows one to recover the local Hölder regularity of the original signal under weak assumptions . The algorithm is a modification of the well-known wavelet thresholding procedure, where "small" coefficients are not set to zero, but modified in a way governed by the behaviour of large-scale coefficients. This will have applications in the frame of our Tandem project on the analysis of radar images.

The Tandem Project is a consortium involving several industrial companies (e.g. Bull Amesys) and some research laboratories (e.g. CMAP). The aim is to detect landmines from 3D radar images.

Erick Herbin is a member of the CNRS Research Groups:

GDR Mascot Num, devoted to stochastic analysis methods for computer codes and numerical treatments;

GDR Math-Entreprise, devoted to mathematical modeling of industrial issues.

Regularity collaborates with Bar-Ilan University on theoretical developments around set-indexed fractional Brownian motion and set-indexed Lévy processes. The PhD thesis of Alexandre Richard is co-supervised by Erick Herbin and Ely Merzbach.

Regularity collaborates with Michigan State University (Prof. Yimin Xiao) on the study of fine regularity of multiparameter fractional Brownian motion.

Regularity collaborates with St Andrews University (Prof. Kenneth Falconer) on the study of multistable processes.

Regularity collaborates with Acadia University (Prof. Franklin Mendivil) on the study of fractal strings, certain fractal sets, and the study of the regularization dimension.

Regularity collaborates with Milan University (Prof. Davide La Torre) on the study of certain economic growth models.

Ely Merzbach (Bar-Ilan University) visited the team for one month.

Benjamin Arras attended the *Journées de Probabilités* in Orléans from 17 to 21 June 2013. In a talk, he presented his research on wavelet expansions and the Hölder regularity of stochastic processes belonging to Wiener chaoses.

Benjamin Arras attended the classes of the "Ecole de probabilités de Saint-Flour" from 10 to 21 July 2013.

Paul Balança attended the *Journées de Probabilités* in Orléans from 17 to 21 June 2013: presentation on 2-microlocal analysis and Lévy processes.
Paul Balança attended the *7th International Conference on Lévy Processes* in Wroclaw: poster on 2-microlocal analysis and Lévy processes.

Alexandre Richard attended the *Journées de Probabilités* in Orléans from 17 to 21 June 2013: presentation on fractional Brownian fields in abstract Wiener spaces.

Alexandre Richard attended the classes of the "Ecole de probabilités de Saint-Flour" from 10 to 21 July 2013.

Xiequan Fan is a reviewer for Mathematical Reviews (AMS).

Jacques Lévy Véhel is associate editor of the journal *Fractals*.

Licence: Erick Herbin, Probability course at Ecole Centrale Paris (20h).

Master: Erick Herbin, Advanced Probability course at Ecole Centrale Paris (30h).

Master: Erick Herbin and Jacques Lévy Véhel, Brownian Motion and Stochastic Calculus course at Ecole Centrale Paris (30h).

Master: Jacques Lévy Véhel, Wavelets and Fractals course at Ecole Centrale Nantes (8h).

Licence: Benjamin Arras, Analysis, Probability and PDE, 3x10 hours, L3, Ecole Centrale Paris.

Licence: Paul Balança, Analysis, Probability, 2x10 hours, L3, Ecole Centrale Paris.

Master: Benjamin Arras, Brownian motion and Stochastic Calculus, 20 hours, M2, Ecole Centrale Paris.

Master: Paul Balança, Advanced Probability, 18 hours, M1, Ecole Centrale Paris.

PhD in progress : Benjamin Arras, Self-similar processes in higher order chaoses, started in September 2011, supervised by J. Lévy Véhel.

PhD in progress : Paul Balança, Stochastic 2-microlocal analysis of SDEs, started in October 2010, supervised by Erick Herbin.

PhD in progress : Alexandre Richard, Regularity of set-indexed processes and construction of a set-indexed process with varying local regularity, started in October 2010, supervised by Erick Herbin and E. Merzbach.