Biology and medicine constitute a growing source of challenging problems in applied mathematics and computer science. As ever more complex issues are addressed, with measurement devices becoming more precise and data collections larger, there is a strong need for new models and methods, as well as for techniques that adapt precisely to the needs of patients, medical practitioners and biologists.

Many different approaches may be used to tackle these problems. The one we favour is based on the following ingredients: we consider biological and medical data as *signals or images*, which we *model in a stochastic framework*. Our emphasis in building the models and studying our signals is on *regularity analysis*. Let us briefly develop these points.

**Signal and image processing** is ubiquitous in the biological and medical fields, ranging from the analysis of EEG, ECG and all types of medical images (CT, MRI, ...) to the study of DNA sequences and pharmacodynamics. New or improved data acquisition methods are constantly being developed (*e.g.* digital mammography, functional MRI). They provide a wealth of information that calls for advanced processing techniques.

**Stochastic models**: most of the time, a successful signal analysis will use some knowledge based on a model of the data at hand. For instance, automated analysis of R-R intervals in ECG (used to detect various pathologies) is more efficient if one has at least a rough idea of the complex nonlinear interactions between the two competing regulations, sympathetic and parasympathetic. Modelling is a difficult task in these fields, and one must often introduce randomness to account for the many variables that cannot be described explicitly: stochastic processes are then natural and commonly used tools. In some cases, it is practical to use an approach based on population evolution, where the global behaviour emerges from the interaction of many individuals following simple rules. Furthermore, controlling (in an automatic or interactive way) complex stochastic models often requires sophisticated stochastic optimisation techniques such as evolutionary algorithms.

**Regularity analysis**: models obtained in this way typically exhibit strong local irregularity, mirroring that present in actual biological data. As a matter of fact, irregularity is not only omnipresent in these data, but it also often bears discriminant information: for instance, a healthy electrocardiogram is more irregular (in a precise mathematical sense) than a pathological one, with an increase in regularity strongly correlated with the severity of the pathology. Developing methods to measure, analyse and control the local regularity of the data is thus of great importance.

Biological and medical data analysis using stochastic signal analysis is a field where many teams contribute worldwide.

We focus on a model-based approach, where the models are either parametric stochastic processes or population-based evolutionary processes.

We mainly analyse the data based on local regularity. The relevance of this approach is attested by the already large number of studies dealing with this aspect (see *e.g.* Nature 399, 461-465, 1999). We have expertise in developing methods for (multi-)fractal signal analysis in a general frame, resting on firm theoretical bases and efficient algorithmic developments. In particular, we try to go beyond the way (multi-)fractal analysis is usually performed in these fields, *i.e.* measuring the regularity and using it to classify the data or detect pathologies. Our aim is to build models explaining the *sources* of multifractality. Very few such attempts have been made in biology and medicine. Understanding the mechanisms leading to multifractality is important to correctly interpret its functional purpose or, *e.g.*, its relation to various pathologies.

APIS also continues to develop free software, most notably FracLab (a Matlab/Scilab toolbox for 1D and 2D signal processing).

Our team has strong collaborations with IrCcyn in Nantes, with French universities: Orsay (LRI), Calais (LIL), Clermont-Ferrand, and with several foreign universities and research centers: University of St-Andrews (Scotland), CRM Montréal (Canada), Impan (Poland), University of California at Riverside, and Acadia University, Canada. The team is involved in the European organisation (former Network of Excellence) EVO*.

We have industrial contracts with Dassault Aviation, and we are involved in 4 ANR projects (REVES, COPRIN, OPUS and INCALIN) and in 2 projects of the System@tic cluster (XVISION, EHPOC).

We have undergone a major change in our activity this year, as we have moved from Inria Rocquencourt to Futurs Saclay. Our previous team COMPLEX has ceased to exist, and we have now centred our research on applications in biology and medicine. This year's activity report is thus a transitional one, where results from our former team COMPLEX are presented along with new ones belonging to APIS.

The research of APIS draws on two areas: fractal analysis and artificial evolution. We pursue theoretical developments in these fields motivated by the needs encountered in the modelling and processing of biological and medical data.

We briefly recall below some basic and advanced concepts in fractal analysis and evolutionary computation, and then indicate how these tools are put to use for our applications.

Fractal analysis was developed with a view to studying complex irregular objects. Numerous natural phenomena, in particular in physics, biology and medicine, have been shown to exhibit a fractal behaviour. The study of associated models, when available, has led to significant progress in the understanding and control of these phenomena, for instance in turbulence analysis, non-linear growth, chemical catalysis and wave propagation in irregular media.

Our emphasis is on the study of local regularity and multifractal analysis. These areas have both a rich theoretical content and many applications in signal/image analysis. Multifractality occurs when the local regularity exhibits wild temporal variations, so that the mean behaviour bears little information. A multifractal behaviour often emerges as the result of the complex interactions of a large number of elements, each of which acts in a relatively simple way. Such a situation is typical in biology.

Many teams worldwide contribute to local regularity and multifractal analysis, both on the theoretical and applied level. Successful applications include ECG, EEG and DNA sequences analysis, turbulence modelling, road and Internet traffic description and financial data analysis.

It is worth noting the following dichotomy about multifractal phenomena: “natural” multifractality is always a positive quality that constitutes an efficient answer to some functional constraints. Examples include the organisation of the blood and air flows in the lungs, the geometry of tree branches and many more. In contrast, multifractality of artifacts often constitutes an unwanted complication: for instance, it worsens the behaviour of queues in TCP traffic and makes financial asset management more complex.

As mentioned previously, a precise view of the mechanisms leading to multifractality is important if one wants to understand the purposes it serves and how it is modified in response to external changes or in case of abnormal behaviour. Multifractal models are largely yet to be developed in the biological and medical fields, a gap APIS is trying to fill. Progress is also needed in the definition of new and refined ways of measuring local regularity, adapted to the characterisation of biological signals. Finally, we address the question of the numerical estimation of (multi-)fractal quantities, an area where there is still much room for improvement.

The transposition of Darwin's theory into computers consists in roughly imitating, with programs, the capability of a population of living organisms to adapt to its environment through selection/reproduction mechanisms. For about forty years, various stochastic optimisation methods have been based on this principle. *Artificial Darwinism* or *evolutionary algorithms* is a generic name for these techniques, among which *genetic algorithms* are the best known.

The common component of these techniques is a *population* (representing, for example, points of a search space) that evolves under the action of stochastic operators. Evolution is usually organised into *generations* and mimics natural genetics in a very simplified way. The engine of this evolution is made of *selection* – based on a measurement of the quality of an individual with respect to the problem to be solved – and of *genetic operators*, usually *mutation* and *crossover* (or *recombination*), which produce the individuals of a new generation.
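The components above – population, selection, genetic operators – can be sketched in a few lines. The following minimal genetic algorithm on bit strings is purely illustrative (the function names and the toy “one-max” fitness are ours, not part of the report):

```python
import random

def evolve(fitness, n_bits=16, pop_size=30, p_m=0.02, p_c=0.7, generations=60):
    """Minimal canonical GA: tournament selection, one-point crossover,
    bit-flip mutation. Returns the best individual of the final population."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Selection: binary tournament based on fitness.
            a, b = random.sample(pop, 2)
            return list(a if fitness(a) >= fitness(b) else b)
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            if random.random() < p_c:            # one-point crossover
                cut = random.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                # bit-flip mutation with probability p_m per gene
                nxt.append([1 - g if random.random() < p_m else g for g in child])
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

# Toy fitness: number of 1-bits ("one-max").
random.seed(0)
best = evolve(lambda bits: sum(bits))
```

On this trivial landscape the population quickly concentrates on the all-ones string; the interesting questions studied by the team arise when the fitness landscape is irregular.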

Under certain conditions, which tune the relative importance of the various components of the evolution mechanism, it has been mathematically proved that such a complex stochastic process converges toward the desired result, i.e. a limit distribution concentrated on the global optimum of the considered search space.

Artificial Darwinism can be considered as a model of complex natural behaviour: a population of interacting agents obeys simple rules and produces a global output that is often unpredictable or irregular (sometimes proved to be fractal) – particularly if the selective pressure, i.e. the function that is optimised, is irregular or imprecisely known.

Population-based and evolutionary algorithms are investigated in APIS as models of complex biological processes, as a means to “explain” observed irregularity. We also consider these techniques in a more classical way in other applications (agro-alimentary or medical ones), that is, as powerful stochastic search tools: irregular data and signal analysis often requires the control and optimisation of complex models (parameter optimisation, inverse problem resolution). A precise understanding and control of these optimisation algorithms on irregular problems, as well as the design of schemes adapted to biology, are topics we explore in APIS.

Studying the local regularity of signals provides relevant tools for their processing. It leads, for example, to a set of methods for irregular image analysis, including edge detection, texture segmentation, denoising and interpolation. Edges, for instance, are found as points with low local regularity.
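As an illustration of this idea, a pointwise Hölder exponent can be estimated by regressing the logarithm of the oscillation in nested windows against the logarithm of the window radius; low slopes then flag edges. This is only a sketch of the oscillation-based approach (the function name and window choices are ours):

```python
import math

def local_holder(signal, i, radii=(1, 2, 4, 8, 16)):
    """Estimate the pointwise Hoelder exponent at index i: least-squares slope
    of log(oscillation) against log(radius) over nested windows."""
    xs, ys = [], []
    for r in radii:
        w = signal[max(0, i - r): i + r + 1]
        osc = max(w) - min(w)
        if osc > 0:
            xs.append(math.log(r))
            ys.append(math.log(osc))
    if len(xs) < 2:
        return float('inf')  # locally constant signal
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

On a linear ramp the estimate is close to 1 (smooth point), while at a step discontinuity the oscillation does not shrink with the radius and the estimate is close to 0, which is how such a detector finds edges.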

Such tools have proved to be efficient for analysing biological data, which are often highly irregular. This irregularity contains relevant information, and precise theoretical notions have to be defined to measure its various aspects. Associated estimation methods must also be developed for practical data analysis.

Once the local regularity has been defined and measured, it may be used through a multifractal analysis: the general idea is to characterise the “size” of the sets of points with a given regularity. This may be done in a geometrical way, by computing a Hausdorff dimension, or in a statistical fashion, through an estimation of the probability, at a given resolution, of finding a point with a given regularity. The graphs plotting the dimensions or probabilities as functions of the regularity are called *multifractal spectra*. These subsume a lot of information about the distribution of the regularity, which has proved useful in various situations. A most notable example is the strong correlation reported recently in several works between the narrowing of the multifractal spectrum of ECG and certain pathologies of the heart. Further work in this area that we are pursuing includes the definition of multifractal spectra specifically adapted to measuring relevant features of biological data, and the development of more robust estimation methods.

Stochastic modelling is a well-adapted tool in the biological and medical fields because it is generally impossible to describe accurately all the parameters and interactions that come into play. Our objective is to define and study stochastic processes whose *regularity*, *dependence* and *jump* properties match those of biological data. Strong, long-term dependence has been shown to be present in, *e.g.*, DNA sequences, among many other data. Such correlations need to be accounted for by specific stochastic models. Jumps are present, for instance, in epileptic EEG traces, and designing processes whose jump intensity can be tuned precisely is necessary for faithful modelling.

A simple process that allows one to control the local regularity is the *multifractional Brownian motion* (mBm), which generalises the well-known fractional Brownian motion (fBm). The local regularity of mBm may be tuned *via* a functional parameter. We have started to check that this is a useful feature for biological and medical data modelling: indeed, most medical images (*e.g.* echographies, mammographies) are highly textured, with different tissues exhibiting different textures and thus different sets of Hölder exponents. We also apply this kind of model to ECG data.
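For illustration, fBm (the constant-H special case of mBm) can be synthesised exactly on a grid by a Cholesky factorisation of its covariance; letting H vary along the path would then give a crude, non-exact mBm-like texture. This sketch assumes NumPy and is not the synthesis method used in FracLab:

```python
import numpy as np

def fbm_cholesky(n, H, seed=0):
    """Exact fBm synthesis on n grid points of (0,1] via Cholesky factorisation
    of the covariance r(s,t) = (|s|^2H + |t|^2H - |t-s|^2H) / 2."""
    t = np.arange(1, n + 1) / n
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (np.abs(s) ** (2 * H) + np.abs(u) ** (2 * H)
                 - np.abs(s - u) ** (2 * H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))   # tiny jitter for stability
    rng = np.random.default_rng(seed)
    return L @ rng.standard_normal(n)
```

Since the increments have standard deviation of order dt^H, a path with H = 0.1 is visibly much rougher than one with H = 0.9, which is exactly the feature exploited when different tissues are modelled with different exponents.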

More complex processes are needed if one wants to also account for jumps in the traces. We are currently building a family of stochastic processes where both the local regularity and the local intensity of jumps may be prescribed at each point. These *multistable multifractional processes* (mmp) are expected to provide relevant models for EEG and other phenomena displaying a time-varying discontinuity structure. Work on mmp in the near future will focus on path synthesis, extension to higher dimensions, parameter estimation and the study of their multifractal structure.

Another area of research that seems promising is that of
*set-indexed processes*. A set-indexed process is a process whose parameter is no longer “time” or “location” but may be a compact connected set. This allows for greater flexibility, and
should in particular be useful for the modelling of censored data. This situation occurs frequently in biology and medicine, since, for instance, data may not be constantly monitored. We have
recently defined set-indexed fractal processes, whose properties we are currently investigating.

For a few decades, modelling has played an increasing part in complex system design in various fields of industry such as automobile, aeronautics, energy, etc. Nowadays, the challenge of numerical simulation is to design physical systems while saving on experimentation steps. Industrial design involves several levels of modelling: from behavioural models in preliminary design to finite-element models aiming to represent physical phenomena as sharply as possible. At each of these levels, modelling requires control of the uncertainties due to model simplifications, numerical errors, data imprecision, variability of surrounding conditions, etc. In a classical view, this variability is coped with through *model registration* by experimentation and a fixed *margin* added to the model response.

For the sake of technical and economic performance, it appears judicious to include this margin definition in a more rigorous framework of risk control. In other words, a probabilistic vision of uncertainties should provide decision criteria adapted to the management of the unpredictability inherent in design issues.

The first requirement analysis in terms of uncertainty management led to a deployment strategy for reliability methods. Relying on a probabilistic decision criterion, it is composed of the three following steps:

build a probabilistic description of the fluctuations of the model's parameters (*Quantification* of uncertainty sources),

deduce the implication of these distribution laws on the model's response (*Propagation* of uncertainties),

and determine the specific influence of each uncertainty source on the model's response variability (*Sensitivity Analysis*).
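On a toy model, the three steps above can be sketched with plain Monte Carlo; the first-order sensitivity below uses a squared correlation, which is exact only for linear models. All names and the example model are ours, for illustration:

```python
import random

def propagate(model, samplers, n=20000, seed=1):
    """Quantification: input laws given by `samplers`; Propagation: Monte Carlo
    sampling of the model response; Sensitivity: variance share of each input
    (squared correlation with the output; exact for linear models)."""
    random.seed(seed)
    xs = [[s() for s in samplers] for _ in range(n)]
    ys = [model(x) for x in xs]
    my = sum(ys) / n
    var_y = sum((y - my) ** 2 for y in ys) / (n - 1)
    sens = []
    for j in range(len(samplers)):
        xj = [x[j] for x in xs]
        mj = sum(xj) / n
        cov = sum((a - mj) * (y - my) for a, y in zip(xj, ys)) / (n - 1)
        var_j = sum((a - mj) ** 2 for a in xj) / (n - 1)
        sens.append(cov ** 2 / (var_j * var_y))
    return my, var_y, sens

# Toy model y = 3*x1 + x2 with x1 ~ N(0,1), x2 ~ N(0,2):
# analytically Var(y) = 9 + 4 = 13, sensitivities 9/13 and 4/13.
mean, var, (s1, s2) = propagate(lambda x: 3 * x[0] + x[1],
                                [lambda: random.gauss(0, 1),
                                 lambda: random.gauss(0, 2)])
```

The limitation discussed below is visible here: the scheme calls the model n times, which is only viable when the model is fast.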

The three following points are prerequisites for deployment of the previous analysis:

How can the distribution law of the model's parameters be captured when no data are directly accessible? Accessing them through another modelling chain may require the development of inverse methods for the quantification of sources.

We are preferentially interested in generic propagation methods, *i.e.* methods that are not specifically tied to the considered model. The usual methods are effective when the model's complexity allows many evaluations (fast models). However, at the finest modelling levels, we often face CPU-costly models (aerodynamics, structural mechanics, ...).

All the various levels of conception, from preliminary design to sharp modelling, require registration against experiments to reduce model errors. This issue has long been present in this frame, and it now calls for the definition of a systematic approach, particularly in the statistical uncertainty context.

Moreover, a multi-physical context must be added to these difficulties. Complex system design is most often located at the interface between several disciplines. In that case, modelling relies on a coupling between several models for the various phenomena, and design becomes a *multidisciplinary optimisation* problem. In this uncertainty context, the real challenge becomes robust optimisation, in order to manage technical and economical risks (risk of non-satisfaction of technical specifications, cost control).

Evolutionary algorithms have been successfully used in optimisation problems related to genome structure and sequence alignments, protein folding, biological data analysis, feature selection, and many other problems related to molecular biology. Evolutionary image and signal processing are also very dynamic research areas. Both topics are well represented in the evolutionary computation domain: many workshops, conference sessions at all major international conferences (CEC, GECCO, EVO*), special issues of journals, and books are dedicated to bio-informatics and signal/image processing. However, there is still a strong demand for theory, especially regarding the behaviour on “difficult” optimisation landscapes (like the one related to cochlear implant fitting, the HEVEA project), and for the design of new schemes. Having more efficient and more adaptive algorithms opens the way to new applications.

Our research in this domain is organised in the three following directions:

**The behaviour of simple evolutionary models on irregular functions:**

Our approach to this question is based on regularity analysis. A first analysis has established quantitative results using a very simplified model of genetic algorithm (the canonical GA). If the function to be optimised, *f*, is supposed to be the sampling, at a given resolution, of a Hölder function (this hypothesis is always valid, even if it happens that the corresponding one-dimensional underlying function *F* does not reflect in a simple way the behaviour of the fitness function), we have shown that an adequate tuning of the parameters *l*, *p_{m}* (mutation probability) and *p_{c}* (crossover probability) tends to facilitate the job of the genetic algorithm, and subsequently improves its performance. This analysis also yields an *a posteriori* validation procedure for the optimisation results of a genetic algorithm.

Then, by considering local Hölder exponents and another simplified model, the (1+1)-ES (i.e. a population size of 1 and a mutation-only algorithm) on a continuous search space, we have proposed a mutation that adapts to the local regularity of the function.
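As a reference point for such adaptive schemes, here is a minimal (1+1)-ES with the classical 1/5th-success rule for step-size control — a simple stand-in for the regularity-driven adaptation mentioned above, not the team's scheme. The code and the sphere test function are illustrative only:

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, iters=2000, seed=2):
    """(1+1)-ES minimising f: a single parent, isotropic Gaussian mutation,
    and the 1/5th-success rule for step-size adaptation."""
    random.seed(seed)
    x, fx, successes = list(x0), f(x0), 0
    for k in range(1, iters + 1):
        y = [xi + random.gauss(0, sigma) for xi in x]
        fy = f(y)
        if fy < fx:                      # keep the offspring only if it improves
            x, fx = y, fy
            successes += 1
        if k % 50 == 0:                  # adapt the step size every 50 trials
            sigma *= 1.5 if successes / 50 > 0.2 else 1 / 1.5
            successes = 0
    return x, fx

# Sphere function: global optimum at the origin.
x, fx = one_plus_one_es(lambda v: sum(t * t for t in v), [5.0, -3.0])
```

On this smooth landscape the 1/5th rule works well; on irregular landscapes a fixed success-rate target is exactly what local-regularity information can improve upon.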

Our current work is centered on adaptive operators, in order to build and evaluate genetic operators that adapt to local regularity of the function to be optimised, which is to our mind a key point for applications in biology where irregularity has a discriminant meaning.

**Cooperation-coevolution and Parisian approach:**

Within an evolutionary algorithm (EA) it is often possible to formulate the resolution of a problem as a collective task: the searched solution is then built from the whole evolved population, and no longer from just the best individual of the final population of an EA. The *Parisian approach*, proposed and developed in the FRACTALES and COMPLEX teams since 1999, is based on the capability of EAs to drive a population into quasi-optimal areas of the search space. The idea is thus to build a search landscape and an evolution process where the whole population (or at least a large part of it) represents the searched solution. Individuals collectively correspond to a potential solution to the considered problem. A population is a society that jointly builds the searched solution: this is cooperative co-evolution.

Theoretical analysis, specification and tuning of such algorithms are of course even more complex than for the classical evolutionary approaches, and population diversity becomes a crucial factor. Additionally, it is not always simple to split a problem into interconnected sub-problems adapted to a Parisian approach, but the computational gain is very important. This has been made evident in applications like the “fly algorithm” in stereo-vision or in 3D tomographic reconstruction.

The study of evolution models on functions of controlled regularity can be extended to the Parisian model. Local irregularity obviously has an important influence on population diversity. These are points we are currently exploring. Our studies are also based on test functions specific to cooperation-coevolution that we recently proposed.

We address new applications in biology with Parisian cooperative-coevolution methods. We also explore hybridisations with non-evolutionary cooperative methods like Ant Colony Optimisation (ACO). In the framework of our collaboration with the University of Nottingham (Gabriela Ochoa, Edmund Burke), we consider an application of these techniques to the protein folding problem.

**Interactive evolution:**

This research topic is growing rapidly. It is actually a convenient means to optimise what cannot be expressed using a mathematical formula or an algorithm, and what depends on a subjective judgment. The first works in this domain were oriented toward artistic creation.

For instance, we developed an interactive multifractal image denoising application (currently available in FracLab): multifractal denoising is indeed a versatile and efficient technique, but it depends on parameters whose setting is not simple. Additionally, the evaluation of a good denoising depends strongly on the end-user as well as the application framework. The signal-to-noise ratio is unable to reflect all the subtle components of a human expert's appreciation of a denoising result. The artificial evolution framework makes it possible to introduce human evaluation into the algorithmic loop, and to cope with the irregularity (or even inconsistency) of human judgment.

Of course, efficient interactions (limited by user fatigue) and diversity control are important in this framework. This led us to experiment with an interactive Parisian model, with an application to data retrieval (for the Novartis-Pharma company).

We do think that interactive evolution has a large field of application in biology and medicine, especially when sophisticated data models are involved (an example related to cochlear implant optimisation has been developed within the HEVEA project), and we intend to develop methods able to adapt to the irregular subjective judgment of one or several end-users.

In collaboration with IrCcyn, INRA.

Multifractal analysis makes it possible to build efficient algorithms for the classification, segmentation and denoising of irregular images. It is thus well adapted to the study of highly textured images such as those encountered in biology and medicine.

A typical application we are currently involved in is a collaboration with INRA-Nantes within a VANAM project, funded by the Pays-de-Loire region. It consists in analysing the enzymatic decomposition of sugar beet pulp. Agro-alimentary industries produce an important quantity of vegetal waste, and current research is oriented toward biological methods for its decomposition and re-use. This requires an assessment of the action of various enzymes. We have shown that the regularity of pulp images is highly correlated with the degradation state. This should make it easy to measure the degradation power of a given enzyme or combination of enzymes.

This project is funded by the ANR and associates fundamental laboratories (CREA, INRIA, LIP6), specialised laboratories (INRA) and technical centres (ITFF, INBP) performing research on the selected food technologies (cheese ripening and bread making).

The competitive challenges that agro-alimentary industries are facing are related to the quality and sustainability of alimentary products. The aim of the INCALIN project is to build decision support tools for better managing product quality and, thereby, manufacturing processes. The causal relationships between, on the one hand, ingredients and physico-chemical and microbiological characteristics and, on the other hand, sensory and nutritional properties, depending on successive process operations, are still ill-known in some food technologies (uncertainty of the processes).

The approach developed deals with different types of knowledge in a multimodal context (know-how of operator-experts in terms of formal or informal reasoning, scientific explanation and modelling of the phenomenon, databases). Among the fragmented knowledge available, the know-how of the operator-experts is probably a key element that should be taken into account. Evolutionary techniques are exploited here to instantiate complex interaction models (Bayesian networks are used in a first experiment) using incomplete expert data sets. Additionally, interactive evolution is considered to take expert judgments into account within an industrial control system.

In collaboration with Pierre Collet (Université de Strasbourg) and Pierrik Legrand (Université de Bordeaux).

This is a collaboration with the ENT department of Avicenne Hospital (Professor Frachet), the Calais university (LIL) and the company INNOTECH, about the interactive optimisation of cochlear implants (a research ministry project, HEVEA = “Handicap : Etude et Valorisation de l'Ecologie Auditive”). Our participation in this project is based on our competences in interactive evolution as well as in the regularity analysis of signals.

The aim of the HEVEA project is to contribute to cochlear implant fitting techniques. Cochlear implants allow totally deaf people to hear again provided their auditory nerve and cochlea are still functional: a computer processes sounds picked up by a microphone in order to stimulate the auditory nerve directly through several electrodes inserted inside the cochlea. As one can imagine, there are hundreds of parameters that can be tuned, while at the same time the patient has to learn to “hear” using the new information provided to his auditory nerve. The tuning of such a device is thus extremely complex, and highly dependent on the patient. This process is currently done “by hand” by medical practitioners, and looks like an optimisation process based on “trial and error.” It is so delicate that sometimes no satisfactory fitting can be found for some patients. Additionally, many users of cochlear implants or hearing aids find that the parameter setting of their device is not perfectly adapted to all the situations of their everyday life: different fittings are obviously needed in different environments.

The aim is actually twofold: one goal is to help the expert find good fittings using an interactive evolutionary algorithm, and another is to integrate into the processor a small signal analysis software module able to recognise the sound environment and automatically select a fitting accordingly, among a set of available fittings corresponding to different situations. The approach developed in the HEVEA project is to use IEAs to produce different parameter settings adapted to a set of various sound environments. A basic IEA (developed on a PDA), a sampling module, and a classification module (based on irregularity time-frequency analysis) have been developed. Tests have been performed on volunteer patients, with satisfying results.

The study continues on this topic, and a new interactive evolution algorithm is being designed for the tuning of other implant parameters. Additionally, we are currently negotiating with the partners of this study and an implant manufacturer to participate in the development of the next generation of implants (having more electrodes and a more sophisticated signal distribution device on them).

In Nuclear Medicine diagnosis, radioactive substances are administered to patients. The concentration of radioactivity in the body is then estimated from the radiation detected by gamma cameras. In order to get an accurate estimation, a three-dimensional tomography is built from two-dimensional scintigraphic images. Some parasitic effects due to scattering and absorption then have to be corrected. Existing analytical and statistical methods are costly and require heavy computation. We are currently developing a Parisian Evolution Strategy in order to reduce the computing cost of the reconstruction without degrading the quality of the results. Our approach derives from the Fly algorithm, which proved successful on real-time stereo image sequence processing.

ECG and signals derived from them are an important source of information in the detection of various pathologies, including *e.g.* congestive heart failure and sleep apnea. The fractality of these data has been reported in numerous works over the past years. Several fractal parameters, such as the box dimension, the local regularity and the multifractal spectrum, have been found to correlate well with the condition of the heart in certain situations. We participate in this research area in two ways. First, we use refined local regularity characterisations, such as 2-microlocal analysis, and advanced multifractal spectra for a more precise analysis of ECG data. This requires testing current estimation procedures and developing new ones. Our preliminary studies show that the local regularity of RR intervals, estimated in a parametric way based on an mBm model, displays correlations with the amplitude of the signal, a feature that seems to have remained unobserved so far. Second, we have started to build stochastic processes that mimic, in a much simplified way, some aspects of the sympathetic and parasympathetic systems, and for which we hope it will be possible to compute the theoretical local regularity and multifractal spectrum. This may help to elucidate the profound reasons behind the observed multifractality of the traces, and how it evolves under abnormal behaviour.

In collaboration with Pierre Emmanuel Lévy Véhel (Université de Paris 6) and Fahima Nekka (Université de Montréal).

Poor adherence to treatment is a worldwide problem that threatens the efficacy of therapy, particularly in the case of chronic diseases. Compliance to pharmacotherapy can range from 5% to 90%. This fact renders clinically tested therapies less effective in ambulatory settings. Increasing the effectiveness of adherence interventions has been placed by the World Health Organisation at the top of the list of the most urgent needs for the health system. In collaboration with the pharmacy faculty of the Université de Montréal, we shall consider the problem of compliance within the context of multiple dosing. The analysis of multiple-dosing drug concentrations with common deterministic models is usually based on a full-compliance assumption, i.e., drugs are administered at a fixed dosage. However, the drug concentration-time curve is often influenced by the random drug input generated by the patient's poor adherence behaviour, inducing erratic therapeutic outcomes. Following work already started in Montréal, we consider the stochastic processes induced by taking into account the random drug intake caused by various compliance patterns. Such studies have been made possible by technological progress, such as the “medication event monitoring system”, which makes it possible to obtain data describing the behaviour of patients.

The deterministic model describing the evolution of drug concentration can be considered as a “black box.” The efficacy of the drug is usually an output of this model when the intake is realized as prescribed. To obtain a robust drug efficacy, we need to cope with fluctuations in the behaviour of patients. With that in view, we propose:

To model the uncertainty of intake with a probability distribution. As in the generic approach to be studied in the EHPOC project, we will face different situations: statistical methods when data are available, or a model-based approach when only a qualitative description of the patient's behaviour exists.

To measure the influence of this uncertainty on drug efficacy, by propagating the uncertainty through the “black box” model. If the level of modelling allows the use of Monte Carlo methods, a direct probabilistic description of the efficacy can be deduced. The quantity and schedule of drug intake could then be adapted to ensure a given efficacy at a given confidence level. However, when the complex phenomena involved in the evolution of concentrations are described precisely in the “black box” model, Monte Carlo methods are often impractical because they are too expensive in CPU time. The collaborations with COPRIN and OPUS will consider reduced-order models, i.e. statistical models of the complex one. In that case, the uncertain efficacy will result from both the uncertainty of intake and the model error.

FracLab is a general purpose signal and image processing toolbox based on fractal and multifractal methods. FracLab can be approached from two different perspectives:

Fractal analysis: a large number of procedures allow the computation of various fractal quantities associated with 1D or 2D signals, such as dimensions, Hölder exponents or multifractal spectra.

Signal processing: Alternatively, one can use FracLab directly to perform many basic tasks in signal processing, including estimation, detection, denoising, modelling, segmentation, classification, and synthesis.

FracLab is not intended to process "fractal" signals (whatever meaning is given to this word), but rather to apply fractal tools to the study of irregular but otherwise arbitrary signals. A graphical interface makes FracLab easy to use and intuitive. In addition, various wavelet-related tools are available in FracLab.

FracLab is free software. It mainly consists of routines developed in Matlab, or C code interfaced with Matlab and Scilab (a free scientific software package for numerical computations developed at INRIA). It runs under Linux and Windows environments.

The development of FracLab continued in 2007, and the new version (2.04) now runs under Linux (32 and 64 bits), Windows (32 and 64 bits) and OS X (Intel) environments, either as a Matlab toolbox or as a standalone executable that does not require Matlab.

The new FracLab website is visited by roughly 1500 unique visitors every month, coming from all around the world, mainly the USA (17%) and China (12%). We have also added interactive demos showing how to get started with FracLab and use it.

Taking all its flavors into account, FracLab has been downloaded around 4000 times this year. Moreover, FracLab is referenced on more and more websites and forums, which brings many visitors. We are also cited on the official Mathworks website

http://

as a solution to calculate fractal dimensions.

In collaboration with Ely Merzbach (Bar-Ilan University, Israel).

Fractional Brownian motion has been extended by Herbin and Merzbach (2006) to indices that are subsets of a measure metric space. The set-indexed fractional Brownian motion (sifBm) of parameter H ∈ (0, 1/2] is defined as the zero-mean Gaussian process B^H = {B^H_U; U ∈ A} such that

E[B^H_U B^H_V] = 1/2 [ m(U)^{2H} + m(V)^{2H} - m(U Δ V)^{2H} ],

where A is a class of subsets of the measure metric space satisfying some assumptions, and Δ denotes the symmetric difference.

When H = 1/2, the process B^{1/2} is the well-known set-indexed Brownian motion, also called white noise, defined by

E[B_U B_V] = m(U ∩ V).

When the indexing collection A consists of the intervals [0, t] of R_+, the above equation reduces to

E[B^H_s B^H_t] = 1/2 ( s^{2H} + t^{2H} - |t - s|^{2H} ).

This shows that {B^H_{[0,t]}; t ≥ 0} is the classical one-dimensional fractional Brownian motion.

Projections of the set-indexed fractional Brownian motion on increasing paths were considered in . For any continuous increasing function, the resulting one-parameter process is a time-changed fractional Brownian motion. Conversely, the sifBm was proved to be the only set-indexed process satisfying this property.

Moreover, this characterization by flows made it possible to understand the limitation 0 < H ≤ 1/2 in the definition of the sifBm, as opposed to 0 < H < 1 for the one-parameter fBm: the set-indexed fractional Brownian motion can only be defined for H > 1/2 if the indexing collection is totally ordered (see ).

When the indexing collection consists of the rectangles [0, t] of R^N_+, the process B^H is a multiparameter process, called the multiparameter fractional Brownian motion (MpfBm). In and , the MpfBm is compared to the two classical extensions of fBm: the Lévy fractional Brownian field and the fractional Brownian sheet. As a restriction of the sifBm, the projection of the MpfBm on any increasing path is a time-changed fBm, and this property does not hold for the two previous processes. In that respect, the MpfBm can be considered a good definition for a multiparameter extension of fractional Brownian motion. It deserves a deeper study with a view to modelling multiparameter phenomena.

Like one-parameter fractional Brownian motion, the set-indexed fractional Brownian motion satisfies increment stationarity and self-similarity properties.

In , we proved a stationarity property for the set-indexed fractional Brownian motion

where denotes the set of elements with .

This property is weaker than the classical increment stationarity property of one-parameter fBm, since it only concerns the marginal distributions of the process. In , the stationarity definition is strengthened to involve the whole distribution of the process. It is proved that for any integer n, for all  and for all increasing sequences  and  in ,

This so-called m-stationarity property of the increments is considered the right generalization of increment stationarity for one-parameter processes, since the projection of a stationary process on any flow is a one-parameter process with stationary increments.

To consider a self-similarity property for set-indexed processes, we need to assume that the indexing collection is provided with the action of a group G such that

where  is a surjective function.

In , the set-indexed fractional Brownian motion is proved to be self-similar, i.e.

Moreover, it is proved in  that the sifBm is the only Gaussian process satisfying the two properties of increment stationarity and self-similarity. Let us emphasize that this complete characterization of the process could be obtained thanks to the new stationarity property; in , such a result could not be proved. This provides another important justification for the definition of the sifBm.

In order to simulate a sifBm, we chose the collection of 2D rectangles A = {[0, u_1] × [0, u_2]}. The measure of the symmetric difference of two sets U = [0, u_1] × [0, u_2] and V = [0, v_1] × [0, v_2] is then:

m(U Δ V) = u_1 u_2 + v_1 v_2 - 2 min(u_1, v_1) min(u_2, v_2)

and:

E[B^H_U B^H_V] = 1/2 [ (u_1 u_2)^{2H} + (v_1 v_2)^{2H} - m(U Δ V)^{2H} ].
This leads to a simple synthesis algorithm using the Cholesky decomposition. The drawback of this method is that it is limited to 90 × 90 trajectories and, with such a collection, H is restricted to the interval (0, 1/2]. Moreover, this process is defined on sets, and not at every point; therefore, the representation of a sifBm as an image, where the value of the pixel at the point (u_1, u_2) is the value of B^H_U on the set U = [0, u_1] × [0, u_2], may not be suitable. Figure  represents the sifBm for various values of H.
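The Cholesky synthesis just described can be sketched in a few lines. This is a minimal illustration, not the implementation actually used: it assumes the covariance E[B_U B_V] = (m(U)^{2H} + m(V)^{2H} - m(U Δ V)^{2H})/2 with m the Lebesgue measure, and the grid size, jitter and function names are illustrative choices.

```python
import numpy as np

def sifbm_rectangles(n, H, seed=0):
    """Synthesise a sifBm on the rectangles [0,u1]x[0,u2] of a regular
    n x n grid of (0,1]^2, by Cholesky factorisation of its covariance."""
    rng = np.random.default_rng(seed)
    u = np.linspace(1.0 / n, 1.0, n)
    U1, U2 = np.meshgrid(u, u, indexing="ij")
    u1, u2 = U1.ravel(), U2.ravel()
    mu = u1 * u2                                  # m(U) = u1 * u2
    # m(U Δ V) = u1*u2 + v1*v2 - 2*min(u1,v1)*min(u2,v2)
    msym = (mu[:, None] + mu[None, :]
            - 2.0 * np.minimum(u1[:, None], u1[None, :])
                  * np.minimum(u2[:, None], u2[None, :]))
    # E[B_U B_V] = (m(U)^2H + m(V)^2H - m(UΔV)^2H) / 2
    cov = 0.5 * (mu[:, None]**(2 * H) + mu[None, :]**(2 * H) - msym**(2 * H))
    L = np.linalg.cholesky(cov + 1e-8 * np.eye(n * n))  # tiny jitter
    return (L @ rng.standard_normal(n * n)).reshape(n, n)

field = sifbm_rectangles(20, 0.4)
```

The O(n^6) cost of the factorisation is what limits the method to small grids, as noted above.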

In collaboration with Florian de Vuyst (École Centrale Paris), Gilles Fleury and Emmanuel Vazquez (Supélec).

In the numerical simulation context, uncertainty management involves a propagation step, which quantifies the impact of a given probability distribution of the inputs on the response of a given deterministic model.

More precisely, the variable of interest Z is linked to the uncertain variable X by the relation

Z = G(X)

where G is a given deterministic function. The uncertainties on the input variable X are described by its probability distribution. Consequently, the model response Z is also a random variable.

Usually, the industrial stakes of numerical simulation are represented by a probabilistic criterion, such as a confidence interval or the probability of exceeding a threshold. In the latter case, we aim to estimate

p = P(Z > z_min)

or to prove that this quantity is lower than a given level. If this level is very small, Monte Carlo simulation requires a large number of calls to the model G, which can make the computation intractable. One solution to overcome this obstacle is the use of a reduced-order model, i.e. an approximation of G. For uncertainty propagation purposes, the qualification of this model becomes essential, and the approximation must be a probabilistic model of G. Candidates can be found in the field of statistical learning, such as Kriging or other kernel methods, but stochastic processes with prescribed regularity could be another field of investigation.
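The cost issue can be made concrete with a crude Monte Carlo estimator of p = P(Z > z_min). The black box G, the input sampler and the threshold below are toy assumptions, not an industrial model; the sketch only illustrates that the confidence half-width shrinks like 1/sqrt(n), so small probabilities force many calls to G.

```python
import random, math

def mc_threshold_probability(G, sample_X, z_min, n=100_000, seed=1):
    """Crude Monte Carlo estimate of p = P(G(X) > z_min), together with
    a 95% confidence half-width; each sample costs one call to G."""
    rng = random.Random(seed)
    hits = sum(G(sample_X(rng)) > z_min for _ in range(n))
    p = hits / n
    return p, 1.96 * math.sqrt(p * (1 - p) / n)

# toy black box: Z = X^2 with X standard normal; exact p = P(|X| > 2) ≈ 0.0455
p, half_width = mc_threshold_probability(lambda x: x * x,
                                         lambda rng: rng.gauss(0.0, 1.0), 4.0)
```

Replacing G by a cheap reduced-order model leaves the estimator unchanged but makes large n affordable, at the price of an additional model error, as discussed above.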

A PhD thesis is about to begin on the possible contribution of this kind of Gaussian stochastic model to the uncertainty propagation issue.

In collaboration with CEA, Dassault Aviation, EDF, EADS.

A general methodology has been defined to manage uncertainties in the numerical simulation context. An intensive collaboration with the R&D entities of industrial companies has led to a common view of the problem. In particular, a continuing education program has been elaborated under the labels IMdR (Institut de Maîtrise des Risques), SMAI (Société de Mathématiques Appliquées et Industrielles), SFdS (Société Française de Statistique) and TERATEC.

The uncertainty management methodology has been applied in a robust design context. At the early stage of aircraft design, the models involved are very simplified, and the geometric and environmental variables are not completely determined. The prescribed performances of the designed aircraft are therefore uncertain and considered as random variables. In , the general methodology is applied to define the global variables of an aircraft satisfying a range performance with a given probability. This study should be improved by the use of reduced-order models to take into account a finer representation of complex physical phenomena.

In collaboration with Pierre-Emmanuel Lévy Véhel (Université de Paris 6) and Fahima Nekka (Université de Montréal).

Poor adherence to treatment is a worldwide problem that threatens the efficacy of therapy, particularly in the case of chronic diseases. Compliance to pharmacotherapy can range from 5% to 90%. This fact renders clinically tested therapies less effective in ambulatory settings. Increasing the effectiveness of adherence interventions has been placed by the World Health Organisation at the top of the list of the most urgent needs for the health system. In collaboration with the faculty of pharmacy of the Université de Montréal, we are considering the problem of compliance within the context of multiple dosing. Analysis of multiple-dosing drug concentrations with common deterministic models is usually based on the assumption of full patient compliance, i.e. that drugs are administered at a fixed dosage. However, the drug concentration-time curve is often influenced by the random drug input generated by poor patient adherence, inducing erratic therapeutic outcomes. Following work already started in Montréal, we are studying the stochastic processes induced by taking into account the random drug intake resulting from various compliance patterns. Such studies have been made possible by technological progress, such as the “medication event monitoring system”, which makes it possible to obtain data describing the behaviour of patients.

In the preliminary study conducted this year, we used the simplest possible law to model random drug intake, *i.e.* a Poisson law. In other words, the instants of drug intake are assumed to follow a Poisson process. At time t, the drug concentration thus reads:

C(t) = (D / V_d) Σ_{i=0}^{N(t)} e^{-k_e (t - T_i)}

where D, V_d, k_e are constants, and T_0 = 0, T_1, ..., T_{N(t)} denote the Poisson random instants of drug intake up to time t. We are interested in the long-term behaviour, *i.e.* C := C(∞). Using Campbell's theorem, one gets that the characteristic function of C is , where  is a positive constant. Under certain conditions, this allows one to show that the distribution of C is singular with respect to the Lebesgue measure and displays a multifractal behaviour.
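The model lends itself to direct simulation; the sketch below draws realisations of C(t) for a Poisson intake process using the one-compartment superposition written above. The parameter values (dose, volume, elimination rate, intake rate) are purely illustrative.

```python
import random, math

def concentration(t, rate, D, V_d, k_e, rng):
    """One realisation of C(t) = (D/V_d) * sum_i exp(-k_e (t - T_i)),
    where T_0 = 0 and T_1 < T_2 < ... are the points of a Poisson
    process with the given rate on (0, t]."""
    times, s = [0.0], rng.expovariate(rate)
    while s < t:
        times.append(s)
        s += rng.expovariate(rate)
    return (D / V_d) * sum(math.exp(-k_e * (t - ti)) for ti in times)

rng = random.Random(42)
# illustrative values: one intake per day on average, observed after 30 days
samples = [concentration(30.0, 1.0, 100.0, 50.0, 0.3, rng) for _ in range(2000)]
mean_c = sum(samples) / len(samples)
# Campbell's theorem gives E[C(∞)] = (D/V_d) * rate / k_e ≈ 6.67 here
```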

Moreover, an intriguing feature appears when one discretizes time: assume that drug intake, instead of occurring at arbitrary times, may only happen at discrete instants with step 1 - a, a ∈ (0, 1). By discretizing the other parameters of the problem in a suitable way, one gets that the discrete counterpart of C, say C_a, is an infinite Bernoulli convolution of the form:

where the X_j are i.i.d. random variables taking the value 0 with probability 1 - p and 1 with probability p, p being equal to -log(a)/ . One can show that C_a tends in law to C when a tends to 1. Such infinite Bernoulli convolutions have been extensively studied in the frame of multifractal analysis. We are currently investigating the properties of C_a.
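The law of C_a is easy to sample. The sketch below assumes the standard Bernoulli-convolution form Σ_{j≥0} X_j a^j; the report's exact normalisation and the constant appearing in p are not reproduced here, so the values of a and p are illustrative only.

```python
import random

def bernoulli_convolution(a, p, n_terms, rng):
    """One (truncated) draw of sum_{j>=0} X_j a^j with X_j i.i.d.
    Bernoulli(p); the truncation error is at most a^n_terms / (1 - a)."""
    return sum(a ** j for j in range(n_terms) if rng.random() < p)

rng = random.Random(7)
a, p = 0.6, 0.5                      # illustrative parameters
draws = [bernoulli_convolution(a, p, 60, rng) for _ in range(5000)]
mean_est = sum(draws) / len(draws)   # exact mean of the series: p/(1-a) = 1.25
```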

A new process, called the Multifractional Self-Regulating Process (MSRP), has been defined in , . The idea is to build a process Z whose local regularity α_Z is a function of the value of the process itself, i.e. such that:

α_Z(t) = g(Z(t)) for some prescribed function g.

This model is motivated by the observation of natural phenomena for which the regularity depends on the value of the phenomenon. For example, a mountain, whose altitude is rather high, is less regular than a valley, whose altitude is lower. We also noticed that the heart rate tends to be more irregular when the heart beats more slowly, usually during the night.

The construction of such a process uses the Banach fixed-point theorem and is based on a field of fractional Brownian motions. This provides both the existence of the process and a method of synthesis.

**Definition and main properties**

Let g be a k_g-Lipschitz function defined on R with values in (0, 1). Let α'(ω) and β'(ω) be two random variables such that α'(ω) < β'(ω). The stochastic operator Λ is defined for almost all ω by:

where is a field of fractional Brownian motions and is the function .

Under a technical assumption, the Multifractional Self-Regulating Process with parameter g, denoted Z_g, is the unique fixed point Z* of this operator.

The MSRP is a non-Gaussian continuous process and verifies, for almost all ω and every t ∈ [0, 1]:

α_{Z_g}(t) = g(Z_g(t)).

A modified definition of the MSRP makes it possible to give it a deterministic shape s(t), whose effect is adjusted by a mixing parameter m. It is then denoted Z_g^{(s)}.

The MSRP can be easily extended to higher dimensions.
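The fixed-point synthesis can be sketched as follows. This is a much simplified 1D illustration, not the exact operator of the definition: the fBm field is discretised on a grid of Hurst exponents via Cholesky factorisation, the rescaling interval [a, b] plays the role of [α', β'], and all names and constants are illustrative.

```python
import numpy as np

def fbm_path(ts, H, z):
    """fBm path on the times ts, driven by a fixed Gaussian vector z."""
    s, t = np.meshgrid(ts, ts, indexing="ij")
    cov = 0.5 * (s**(2 * H) + t**(2 * H) - np.abs(t - s)**(2 * H))
    return np.linalg.cholesky(cov + 1e-10 * np.eye(len(ts))) @ z

def msrp_sketch(g, n=200, a=0.3, b=0.7, n_iter=30, seed=3):
    """Iterate Z <- rescale_[a,b]( B^{g(Z(t))}(t) ): at each point, the
    path whose exponent is closest to g(Z(t)) is read, so that at a
    fixed point the local regularity follows the value of the process."""
    rng = np.random.default_rng(seed)
    ts = np.linspace(1.0 / n, 1.0, n)
    z = rng.standard_normal(n)          # one shared noise for all H
    H_grid = np.linspace(0.05, 0.95, 19)
    paths = np.array([fbm_path(ts, H, z) for H in H_grid])
    Z = np.full(n, 0.5 * (a + b))
    for _ in range(n_iter):
        H_t = np.clip(g(Z), 0.05, 0.95)
        idx = np.abs(H_grid[None, :] - H_t[:, None]).argmin(axis=1)
        Y = paths[idx, np.arange(n)]    # B^{g(Z(t))}(t), nearest grid H
        Z = a + (b - a) * (Y - Y.min()) / (Y.max() - Y.min() + 1e-12)
    return ts, Z

ts, Z = msrp_sketch(lambda z: z)        # regularity increases with the value
```

Sharing a single driving noise across all exponents mimics the underlying fBm field, so successive iterates live on the same realisation, as required by the fixed-point argument.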

**Applications**

A typical example of a process in which the amplitude and the regularity are related is the series of RR intervals, *i.e.* the intervals between two successive heart beats. Our dataset comes from the PhysioNet facility, which provides a number of 24-hour RR-interval time series. These were derived from long-term ECG recordings of adults between the ages of 20 and 50 who have no known cardiac abnormalities. These recordings typically begin and end in the early morning (within an hour or two of the subject's awakening). We estimated the Hölder exponent of these signals using a method based on generalized quadratic variations, and represented the RRi in blue and their Hölder exponents in green in Figure . The symmetry between the two signals is obvious: the lower the heart rate (for example during the night), the more irregular it is.

Motivated by these experimental findings, we modeled RRi signals using an MSRP with a linear relation between Z and α_Z: g(Z) = aZ + b. We inferred the parameters a and b using a simple linear regression on all the signals coming from the PhysioNet dataset. A true RRi and a synthetic one, based on an MSRP with these parameters, are shown in Figure .
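The exponent estimation can be sketched with a global version of the generalized quadratic variation estimator, based on second-order increments compared at two scales (the windowed, local version used on the RRi signals follows the same principle; the sanity-check signal below is a toy random walk, not PhysioNet data).

```python
import math, random

def second_order_qv(x, step):
    """Generalized quadratic variation built on second-order increments,
    which are insensitive to linear trends."""
    return sum((x[k + 2 * step] - 2 * x[k + step] + x[k]) ** 2
               for k in range(len(x) - 2 * step))

def holder_estimate(x):
    """For a path of exponent H, V(step) scales like step^(2H), so
    comparing two dyadic scales yields H = log2(V(2)/V(1)) / 2."""
    return 0.5 * math.log(second_order_qv(x, 2) / second_order_qv(x, 1), 2)

# sanity check on a Gaussian random walk, whose exponent is H = 1/2
rng = random.Random(0)
path, w = [], 0.0
for _ in range(20000):
    w += rng.gauss(0.0, 1.0)
    path.append(w)
H_hat = holder_estimate(path)
```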

As a second application, we show a realization of a 2D MSRP with g(Z) = (1 - Z)^2 (therefore, Z and its regularity vary in opposite ways). We found that this particular function g seems to be suitable for representing typical natural landscapes. See Figure .

In collaboration with Michal Rams (IMPAN, Warsaw).

We have re-examined the simplified model of TCP traffic that was studied in , where the Hausdorff multifractal spectrum was obtained. Our aim was to compute this time the large deviation spectrum, which is better adapted to numerical estimation. Our model may be described as follows: let (λ_i) be a non-decreasing sequence of positive numbers. For every i ≥ 1, let (τ_k^{(i)})_{k ≥ 1} be a sequence of independent exponential random variables with parameter λ_i. Define τ_0^{(i)} = 0. Set

The σ-algebras σ(τ_k^{(i)}, k ≥ 1) are assumed to be mutually independent.

We consider an infinite sequence of sources (S_i)_{i ≥ 1}. The “traffic” generated by the source S_i, i ≥ 1, is modeled by the following stochastic process

where  is a sequence of non-negative random variables such that the series converges, and  is a fixed real number larger than one (typically equal to 2 in the case of TCP).

The resulting “global traffic” is the stochastic process
We have proved the following result: fix L > 1, and denote:

L_k = L^k,  M_k = #{ j : λ_j < L_k }.

For k ≥ 2, set N_k = M_k - M_{k-1}. For any given ε_0 > 0, there exists a sequence (a_k) increasing to infinity and such that:

The sequence (λ_i) is said to be *regular* if, for any given ε_0 > 0, the sequence (a_k - a_{k-1})_{k>1} is bounded.

When (λ_i) is not regular, we set:

**Theorem 1** Assume  so that, with probability one, Z is well defined.

Assume (λ_i) is regular. Then, almost surely, the large deviation multifractal spectrum of Z is:

Given any 1 ≤ α_1 ≤ α_2 ≤ 2, there exists a sequence (λ_i) such that , and, almost surely:

This shows that the large deviation spectrum carries more information than the Hausdorff one, and also that its theoretical shape is in agreement with numerical estimates .
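The numerical estimation of a large deviation spectrum proceeds by histogramming coarse-grain exponents; a crude sketch is given below (the estimators we actually use are more refined; the resolution, bin count and test signal are illustrative choices).

```python
import math, random

def large_deviation_spectrum(x, n=8, n_bins=20):
    """Crude large-deviation spectrum: normalise the increment measure
    over 2^n dyadic intervals, compute coarse-grain exponents
    alpha_k = log mu(I_k) / log 2^-n, then f(alpha) ~ log N_n(alpha) / (n log 2)."""
    m = 2 ** n
    size = (len(x) - 1) // m
    mus = []
    for k in range(m):
        seg = x[k * size:(k + 1) * size + 1]
        mus.append(sum(abs(seg[i + 1] - seg[i]) for i in range(len(seg) - 1)))
    total = sum(mus)
    alphas = [-math.log(mu / total) / (n * math.log(2)) for mu in mus if mu > 0]
    lo, hi = min(alphas), max(alphas)
    width = (hi - lo) / n_bins or 1.0
    counts = [0] * n_bins
    for a in alphas:
        counts[min(int((a - lo) / width), n_bins - 1)] += 1
    return [(lo + (j + 0.5) * width, math.log(c) / (n * math.log(2)))
            for j, c in enumerate(counts) if c > 0]

# sanity check on a Gaussian random walk: the spectrum concentrates near alpha = 1
rng = random.Random(1)
w, path = 0.0, [0.0]
for _ in range(2 ** 14):
    w += rng.gauss(0.0, 1.0)
    path.append(w)
spec = large_deviation_spectrum(path)
```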

In collaboration with Kenneth Falconer (Univ. St Andrews, Scotland).

We have continued the study of multistable and other processes with prescribed local form . This year's results include a finer characterization of stationary processes with prescribed local form, and an alternative definition of multistable processes that might prove more efficient in certain situations. We have also started to work on the estimation problem for the varying stability index.

In collaboration with Gabriela Ochoa (Univ. Nottingham, UK).

The Parisian approach belongs to the general class of cooperative co-evolutionary algorithms (CCEAs), which represent a natural extension of standard EAs for tackling complex problems. Co-evolutionary algorithms can generally be defined as a class of EAs in which the fitness of an individual depends on its relationship to other members of the population. Several co-evolutionary approaches have been proposed in the literature; they vary widely, but the most fundamental classification relies on the distinction between cooperation and competition. Most work in this domain has been done on competitive models; however, there is an increasing interest in cooperative models for tackling difficult optimisation problems by means of problem decomposition.

In this work, we propose using the so-called *Royal Road* functions as test functions for cooperative co-evolutionary algorithms (CCEAs). The Royal Road functions were created in the early 1990s with the aim of demonstrating the superiority of GAs over local search methods. Unexpectedly, the opposite was found to be true, but this research led to an understanding of the phenomenon of *hitchhiking*, whereby unfavorable alleles may become established in the population following an early association with an instance of a highly fit schema. Here, we take advantage of the modular and hierarchical structure of the Royal Road functions to adapt them to the co-evolutionary setting. Using a multiple-population approach, we show that a CCEA easily outperforms a standard GA on the Royal Road functions, by naturally overcoming the hitchhiking effect. Moreover, we found that the optimal number of sub-populations for the CCEA is not the same as the number of components into which the function can be linearly separated, and we propose an explanation for this behavior. We argue that this class of functions may serve in foundational studies of cooperative co-evolution .
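For reference, the basic R1 Royal Road function of Mitchell, Forrest and Holland, which underlies our co-evolutionary variants, can be written as follows (a minimal sketch; the block size and string length are those of the classical instance):

```python
def royal_road(bits, block_size=8):
    """R1 Royal Road fitness: each consecutive block of `block_size`
    bits contributes `block_size` if and only if it is all ones, so the
    function is linearly separable into len(bits)/block_size components."""
    return sum(block_size
               for i in range(0, len(bits), block_size)
               if all(bits[i:i + block_size]))

# the optimum of the classical 64-bit instance is 64
best = royal_road([1] * 64)
```

The block structure is exactly what makes the function a natural candidate for decomposition into cooperating sub-populations.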

PhD partnership with Guelma University, Algeria.

This work is a first step toward the design of a cooperative co-evolutionary GP for symbolic regression, whose first output is a selective mutation operator for classical GP. It has been shown on several applications that the use of two fitness measures (local and global) within an evolving population makes it possible to design more efficient optimization schemes (the Parisian approach). We are currently investigating the use of a two-level fitness measurement for the design of operators (i.e. using local measurements to evaluate blocks of genes of an individual). In a first experiment, we built a selective mutation operator according to this principle. An experimental analysis on a symbolic regression problem gives evidence of the efficiency of this operator in comparison with classical subtree mutation.

The objective here is to design a new crossover operator for Genetic Programming. Given a selected couple of parents, we analyse whether a locally optimal choice of the crossover node (i.e. of the subtrees to be exchanged between parents) improves the search capabilities of GP and/or reduces the bloat effect. Classical GP benchmarks, as well as a problem of symbolic regression on fractal functions, are used for testing .

In collaboration with Francisco Fernandez de Vega (Univ. Extremadura, Merida, Spain). This collaboration has been supported by a LAFMI grant.

This work studies speciation from the standpoint of evolutionary robotics (ER). In ER, the sensory-motor mappings that control an autonomous agent are designed using a neuro-evolutionary framework. We have studied an extension of this process, where speciation is incorporated into the evolutionary process in order to obtain a varied set of solutions to the same robotics problem within a single algorithmic run. Although speciation is common in evolutionary computation, it has been less explored in behaviour-based robotics. When employed, speciation usually relies on a distance measure that allows different individuals to be compared, normally computed in objective or phenotypic space. However, the speciation process we have proposed is intended to produce several distinct robot behaviours; hence, speciation is sought in behavioural space. Individual neurocontrollers are therefore described using behaviour signatures, which represent the path traversed by the robot within the training environment and are encoded as character strings. With this representation, behaviour signatures are compared using the normalised Levenshtein distance. Results indicate that speciation in behavioural space does indeed allow ER systems to obtain several navigation strategies for a common experimental setup. This is illustrated by comparing the best individual of each species with those obtained using the Neuro-Evolution of Augmenting Topologies (NEAT) method, which speciates neural networks in topological space .
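The signature comparison reduces to a normalised edit distance, which can be sketched as follows (the signature alphabet in the example is illustrative; only the normalisation by the longer string length is assumed):

```python
def levenshtein(a, b):
    """Classical edit distance (insertions, deletions, substitutions)
    by dynamic programming over a rolling row."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def behaviour_distance(sig1, sig2):
    """Normalised Levenshtein distance in [0, 1] between two behaviour
    signatures (character strings encoding the traversed path)."""
    if not sig1 and not sig2:
        return 0.0
    return levenshtein(sig1, sig2) / max(len(sig1), len(sig2))
```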

This collaboration has been supported by a LAFMI grant.

The basic problem for a mobile vision system is to determine where it is located within the world. We propose a recognition system that is capable of identifying known places, such as rooms and corridors. The system relies on a bag-of-features approach using locally prominent image regions. Real-world locations are modeled using mixture-of-Gaussians representations, thus allowing for a multimodal scene characterisation. Local regions are represented by a set of 108 statistical descriptors computed from different modes of information. Place recognition can then be seen as a search problem, where the system needs to determine which subset of descriptors can capture regularities between image regions of the same location, and can also help discriminate between regions of different places. A genetic algorithm is used for this task, where fitness assignment promotes:

a high classification accuracy,

the selection of a minimal subset of descriptors,

a high separation among place models.

Instead of using a predefined set of descriptors, the system is able to learn which subset of descriptors is most adequate for the specific group of locations that need to be recognised. The system has been tested on two real-world examples. Results confirm the ability of the system to identify previously seen places in a real-world setting, see .
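The three criteria above can be aggregated into a single scalar fitness; the sketch below is a hypothetical aggregation with illustrative weights, not the scheme actually used in the system.

```python
def place_fitness(accuracy, n_selected, n_total, separation,
                  w_acc=0.6, w_size=0.2, w_sep=0.2):
    """Toy scalar fitness for descriptor-subset selection: rewards a
    high classification accuracy, a small descriptor subset and well
    separated place models (all inputs normalised to [0, 1])."""
    return (w_acc * accuracy
            + w_size * (1.0 - n_selected / n_total)
            + w_sep * separation)

# fewer descriptors at equal accuracy and separation scores higher
better = place_fitness(0.9, 10, 108, 0.5)
worse = place_fitness(0.9, 60, 108, 0.5)
```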

3D reconstruction in nuclear medicine imaging using complete Monte Carlo simulation of trajectories usually requires high computing power. We are currently developing a Parisian evolutionary strategy in order to reduce the computing cost of reconstruction without degrading the quality of the results. Our approach derives from the fly algorithm, which proved successful in real-time stereo image sequence processing. Flies are considered here as photon emitters. We developed the marginal fitness technique to calculate the fitness function, an approach usable in Parisian evolution whenever each individual's fitness cannot be calculated independently of the rest of the population, see .

In collaboration with Pierre-Henri Wuillemin (Univ Paris 6).

The role of the APIS team within the INCALIN project is to study the use of evolutionary optimisation methods for the modeling of the complex interactions and multilevel dynamics of agrifood industrial processes. These techniques will have to deal with various types of knowledge, and with fragmented and/or incomplete data. Experimental data are provided by the industrial partners of INCALIN and are collected on two different industrial processes: cheese ripening and bread making.

A first study, in collaboration with the LIP6 laboratory, is focused on the use of Dynamic Bayesian Networks (DBNs) to model these data. The use of artificial evolution for the optimisation of DBN structures is not new; however, current solutions are limited by the problem of efficiently modeling the constrained graph structures of DBNs (the structure learning problem). The solution we are studying is based on an equivalent representation of DBNs in terms of independence models (IMs) of the data. IMs are actually a more general data model that represents data dependencies via a set of independence statements (ISs). An IS = (A, B, C) is made of three mutually exclusive subsets of variables, and means that A is independent of C given B (evaluated by χ² tests).
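The χ² scoring of an independence statement can be sketched in the unconditional two-variable case (the conditional case sums this statistic over the configurations of B; the contingency table below is illustrative):

```python
def chi2_statistic(table):
    """Pearson chi-square statistic and degrees of freedom for a 2-way
    contingency table; a small statistic supports independence."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = sum((table[i][j] - rows[i] * cols[j] / total) ** 2
               / (rows[i] * cols[j] / total)
               for i in range(len(rows)) for j in range(len(cols)))
    dof = (len(rows) - 1) * (len(cols) - 1)
    return stat, dof

# a perfectly independent table has statistic 0
stat, dof = chi2_statistic([[10, 10], [10, 10]])
```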

A convenient genomic representation of an IS allowed us to develop a Parisian evolutionary approach (i.e. the whole population, or a subset of its best individuals, represents a set of ISs, that is, an IM). Preliminary experiments implemented in Matlab yield encouraging results.

Future work will concern theoretical and algorithmic questions about the exploitation of an optimised IM to build a DBN. Olivier Barrière's postdoc will be devoted to this problem.

This study aimed at extending previous work on interactive image denoising to a problem of multifractal image segmentation. In the same spirit as multifractal denoising, multifractal segmentation techniques are adapted to complex images and signals (and particularly to medical images), but they depend on a set of numerical and symbolic parameters (various numerical thresholds, choices of capacities and spectra, etc.). As the tuning of these parameters is complex and highly dependent on psychovisual and subjective factors, the use of an interactive EA to drive this process was a natural choice.

A prototype has been programmed, using the interactive multifractal denoising application (currently included in FracLab) as a basis. Preliminary tests (interactive and non-interactive) yield good results: the versatility of the interactive implementation is a major advantage for handling difficult images such as IR or medical images.

In collaboration with Pierre Collet (Univ Strasbourg), Claire Bourgeois-République (Univ. Bourgogne), Vincent Péan (Innotech), Bruno Frachet (Hôpital Avicenne), HEVEA project (French acronym for "Handicap: Etude et Valorisation de l'Ecologie Auditive").

The interactive evolutionary tuning procedure is now functional and is currently being tested on sets of patients, using a PDA with a graphical interface. Evaluation is based on audio tests. Preliminary tests have shown an improvement in patient audition and comfort with interactive evolution.

This project was nominated for the 2007 “Humies” Awards for human-competitive results produced by genetic and evolutionary computation, at the 2007 GECCO conference (
http://

Current studies also concern the development of a complete prototype, on a PDA, of a device that adapts to the audio environment: the system will automatically switch to the optimal tuning corresponding to the detected audio environment class. This continues the post-doctoral work of Pierrick Legrand in 2006 (see ). A collaboration is currently starting with the INRIA DEMAR team (David Guiraud) for the development of adaptive and miniaturized implants.

The team has contracts with:

DASSAULT AVIATION on terrain modelling based on mBm.

Innotech, HEVEA project, on cochlear implant optimisation; a collaboration with David Guiraud, of the DEMAR team, is currently being established for the design of new auditory implants.

EHPOC project of the Pôle de Compétitivité SYSTEM@TIC PARIS-REGION. The partners involved are ECP, INRIA Select, CEA, Dassault Aviation, EDF, EADS. The goal of the project is the development of a generic methodology to manage uncertainties and its demonstration through industrial cases.

OPUS project of the Agence Nationale de la Recherche. The partners involved are ECP, Université Paris VII, Université de Grenoble, SUPELEC, CEA, EDF, EADS. The goal of the project is the development of a generic software platform to manage uncertainties. The specific contribution of ECP-SUPELEC is the theoretical development of reduced-order models in a multiphysics context.

ANR project REVES, with the Cité des Sciences et de l'Industrie, Univ. Paris V, and two companies, DreamInReal and Laser Technologies, on the use of the fly algorithm for augmented reality.

Our project has collaborations with:

IRCCyN, Institut de Recherche en Communications et Cybernétique de Nantes, since 1996. Areas of collaboration include the study of mBm, 2-microlocal analysis, image analysis and denoising. In addition, the software FracLab has mainly been developed at IRCCyN over the last five years.

LSIIT, Université Louis Pasteur of Strasbourg (P. Collet), on cochlear implant fitting (HEVEA) and the fly algorithm,

Polytech'Tours (Nicolas Monmarché and Mohamed Slimane) on the use of ant colony models in graphic design and data classification.

IRD, UMR Biosol (Pascal Jouquet, Michel Lepage), ESTP life modeling group (Emmanuel Cayla) on termite nest modeling.

Clermont Ferrand University (C. Tricot) on multifractal analysis.

An agreement was signed in 2006 between INRIA, Cetoine, the Angers University, the Lycée de la mode of Cholet and the e-mode technology platform, in order to experiment with new textile applications of the ArtiE-Fract software. An R&D project is currently being built among these partners, to be presented to the “Pôle Enfant”, a competitive research pole of the Pays de la Loire region.

The team is involved in the European SPADE2 project, in the frame of which we have developed a strong collaboration with IMPAN, Warsaw.

APIS has been involved in a European FP7 project proposal (theme 2: Food, Agriculture, Fisheries and Biotechnologies) on agrifood applications (DREAM: “Design and development of realistic food models to allow a multidisciplinary and integrated approach to food quality and nutrition”), with partners of the INCALIN project. This project has been selected for the second stage of the evaluation. A full version of the proposal will be submitted in February 2008.

The COMPLEX team collaborates with:

The Mexican research institute CICESE (Física Aplicada, Prof. Gustavo Olague), under a LAFMI grant.

Ecole Polytechnique-CRM, Montreal (F. Nekka), on pharmacodynamics.

University of St Andrews, Scotland (K. Falconer), on multistable and other processes with prescribed local form.

University of California at Riverside (M. Lapidus) and Acadia University, Canada (F. Mendivil), on multifractal strings.

Bar-Ilan University, on theoretical developments around set-indexed fractional Brownian motion (invitations of Erick Herbin to Israel for two months, in June 2006 and June 2007).

University of Nottingham (Gabriela Ochoa, Edmund Burke, Natalio Krasnogor), on cooperative co-evolution.

Evelyne Lutton, Jacques Lévy Véhel and Fahima Nekka are organisers of the next “Fractals in Engineering” conference, to be held in Montreal.

Pierre Collet, Evelyne Lutton and Marc Schoenauer have been involved in the organisation of the “Evolution Artificielle 2007” conference (Tours, October 2007), and are members of the steering committee of the French association for artificial evolution.

Erick Herbin is involved in the organization of the continuing education program “Engager et élaborer une démarche incertitudes”, under the labels of IMdR (Institut de Maîtrise des Risques), SMAI (Société de Mathématiques Appliquées et Industrielles), SFdS (Société Française de Statistiques) and TERATEC.

Jacques Lévy Véhel is an associate Editor of the journal “FRACTALS”.

Evelyne Lutton has been co-editor, with Stefano Cagnoni and Gustavo Olague, of a book on Genetic and Evolutionary Image Analysis and Signal Processing.

Evelyne Lutton, Stefano Cagnoni and Gustavo Olague are co-editors of a special issue on Evolutionary Computer Vision of the Evolutionary Computation Journal.

J. Lévy Véhel has acted as an expert for the Canadian CRSNG. He is also an expert for the Canadian MITACS network, and a member of the expert group “Signaux et Traitements Multidimensionnels et Multimodaux”.

J. Lévy Véhel has been a referee for IEEE Trans. Image Proc., Fractals, JMNNP, SPA.

Evelyne Lutton has been a referee for IEEE Transactions on Evolutionary Computation, IEEE Signal Processing Letters, JESA, SMC-Part B.

Erick Herbin is a reviewer for Mathematical Reviews (AMS).

Course "Fractals and Wavelets", at ENSTA (Evelyne Lutton, Jacques Lévy Véhel, 21h)

Course "Fractals and Time-frequency analysis" at Ecole Centrale Nantes (Jacques Lévy Véhel, 7h).

Course "Fractals and wavelets in signal and image processing" at ESIEA (Jacques Lévy Véhel, 15h).

Course "Artificial Evolution" at ENSTA (Evelyne Lutton, Pierre Collet, Cyril Fonlupt, 21h).

Probability Course at Ecole Centrale Paris (Erick Herbin, 20h).

Courses on Lévy Processes at Ecole Centrale Paris (Erick Herbin, 6h).

Tutorial sessions (travaux dirigés) on Real and Complex Analysis at Ecole Centrale Paris (Erick Herbin, 10h).

Jacques Lévy Véhel has given an invited lecture at the workshop on “Fractals and Applications”, held in Milan in May 2007, and at the workshop “Fractal Geometry and Dynamics II”, organized at the Stefan Banach International Mathematical Center, Warsaw, in November 2007.

Olivier Barrière defended his PhD thesis (supervisor: J. Lévy Véhel) at the University of Nantes on November 28, 2007: "Synthèse et estimation de mouvements browniens multifractionnaires et autres processus à régularité prescrite. Définition du processus auto-régulé multifractionnaire et applications".

Antoine Echelard defended his PhD thesis (supervisor: J. Lévy Véhel) at the University of Nantes on November 28, 2007: "Analyse 2-microlocale et application au débruitage".

Jacques Lévy Véhel was a referee for the thesis of Arnaud Capri, “Caractérisation des objets dans une image en vue d'une aide à l'interprétation et d'une compression adaptée au contenu : application aux images échographiques” (May 2007, University of Tours), and for that of Wahib Arroum, “Time-changed self-similar processes. An application to high frequency financial data” (November 2007, University of Southampton). He is also a member of the thesis committee of David Da Silva (INRIA, Virtual Plants project-team).

Evelyne Lutton was a referee for the PhD thesis of Virginie Lefort, “Évolution de second ordre et algorithmes évolutionnaires : l'algorithme RBF-Gene” (July 2007, INSA Lyon). She is also a member of the Habilitation committee of Claude Lattaud (December 2007, Orsay University).