BIGS is a team labeled by Inria and supported by CNRS and Université de Lorraine, via the Institut Élie Cartan (UMR 7502 CNRS-Inria-UL). Our research focuses mainly on statistical and stochastic process techniques aimed at a better understanding of biological systems. Special attention is devoted to online data analysis, local regression techniques and the identification of complex biological systems. Our investigations encompass both theoretical aspects and concrete applications of these issues. More specifically, we focus on the following topics:

*Online Factorial Analysis:*
High-dimensional data are often obtained online and cannot be stored integrally in computer memory. One of the recent challenges in data analysis is thus to perform accurate classification or clustering while taking advantage of the possibility of updating the information. This must of course be done in a simple and efficient way that allows real-time analysis. To this end, we use techniques based on sophisticated tools from stochastic approximation.

*Local Regression Techniques:*
The main issue here is the construction of a procedure allowing one to assess, in a quite general framework, whether a given model fits a data set with respect to most of the assumptions made in elaborating the model. It is based on a generalization of the Cramér-von Mises statistic and involves a nonparametric estimate of the conditional distribution of the response variable. A detailed analysis of the procedure, including rates of convergence and asymptotic properties, is being carried out. The strategy is then implemented in a study of fetal biometry.

*Photodynamic therapy:*
Since 1988, control system scientists and biologists at the Centre de Recherche en Automatique de Nancy (CRAN) have worked together to develop photodynamic therapy (PDT), an alternative cancer treatment, by means of a model-based approach.
The overall aim is to use statistical as well as mechanistic models in order to improve the reproducibility of the response, help biologists and chemists design new photosensitizing agents, and provide insight into the complex phenomena associated with oncogenesis, tumor angiogenesis and their interactions with the treatment. This relies heavily on the production of accurate yet simple models involving various types of stochastic processes, such as Markov chains, branching processes and stochastic differential equations. The main questions here generally concern identification or estimation, but simulation issues can be important too.

*Estimation for complex biological systems:*
Numerous biological systems are accurately described by multidimensional noisy differential equations driven by Gaussian processes (beyond the realm of Brownian motion) or by fractional fields, for which asymptotic properties and parameter estimates provide valuable information. We are thus interested in studying this kind of system, with three specific applications in mind: bacteriophage systems, random fluctuations of nanoparticles, and automatic detection of osteoporosis.

*Participants: J.-M. Monnez, P. Vallois.*
Generally speaking, there is an overwhelming amount of literature on the analysis of high-dimensional data. Indeed, this is one of the major challenges in statistics today, motivated by internet and biostatistics applications. Within this global picture, the problem of classification or dimension reduction of online data can be traced back at least to a seminal paper by MacQueen.

Our point of view on the topic relies on the so-called *French data analysis school*, and more specifically on Factorial Analysis tools. In this context, it was soon realized that stochastic approximation, which allows one to approximate eigenvectors in a stepwise manner, was an essential tool (see Lebart's paper). A systematic study of Principal Component and Factorial Analysis has then been led by Monnez in a series of papers, in which many aspects of the convergence of online processes are analyzed by means of stochastic approximation techniques.

*Participants: S. Ferrigno, A. Muller-Gueudin.*
In the context where a response variable

Many assumptions must be made to reach a possible model. Some require careful thought, for example those related to the functional form of the model. Moreover, most existing tests are *directional*, in the sense that they can detect departures from only one or a few aspects of a null model. For example, many tests have been proposed in the literature to assess the validity of an entertained structural part of the model.

With these preliminaries in mind, let us observe that one quantity which embodies all the information about the joint behavior of

The (nonparametric) estimation of this function is thus of primary importance. Modern estimators are usually based on the local polynomial approach, which has been recognized as superior to classical estimates of Nadaraya-Watson type, and as good as recent versions based on splines and other methods. In some recent works, we address the following questions:

Construction of a global test by means of a Cramér-von Mises statistic.

Optimal bandwidth of the kernel used for approximation purposes.

We have also obtained sharp estimates of the conditional distribution function.
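As an illustration of the kind of statistic involved, here is a minimal sketch of the classical (unconditional) Cramér-von Mises statistic. The generalized, conditional version studied by the team is more involved; the standard-normal model CDF and the simulated data below are purely illustrative.

```python
import math
import numpy as np

def cramer_von_mises(sample, model_cdf):
    """Classical Cramér-von Mises statistic for H0: data ~ model_cdf, via the
    computational form W2 = 1/(12n) + sum_i (F(x_(i)) - (2i-1)/(2n))^2."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    u = np.array([model_cdf(v) for v in x])
    i = np.arange(1, n + 1)
    return 1.0 / (12 * n) + float(np.sum((u - (2 * i - 1) / (2 * n)) ** 2))

# Standard normal CDF from the error function (avoids SciPy).
phi = lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

rng = np.random.default_rng(42)
z = rng.standard_normal(500)
w2_good = cramer_von_mises(z, phi)        # model fits: small statistic
w2_bad = cramer_von_mises(z + 1.0, phi)   # shifted data: large statistic
```

Under the null hypothesis the statistic stays small (its 5% critical value is about 0.46), while a mis-specified model makes it grow linearly with the sample size.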

*Participants: R. Azaïs, T. Bastogne, C. Lacaux, A. Muller-Gueudin, S. Tindel, P. Vallois, S. Wantz-Mézières*

In most biological contexts, mathematics turns out to be useful for producing accurate models with a dual objective: the models should be simple and meaningful for the biologist on the one hand, and provide insight into the biological phenomenon at stake on the other. We have focused on this kind of issue in the various contexts summarized below.

*Photodynamic Therapy:*
Photodynamic therapy calls for a wide range of interconnected mathematical models, among which we have recently studied the following:

*Bacteriophage therapy:*
Let us mention a collaboration recently started between BIGS and the Genetics and Microbiology department of the Universitat Autònoma de Barcelona on the modeling of bacteriophage therapies. The main objective here is to describe how a certain family of benign viruses is able to weaken a bacterially induced disease, which naturally leads to the introduction of a noisy predator-prey system of equations. Some similar problems have been treated (in a rather informal way, invoking a linearization procedure) by Carletti. These tools cannot be applied directly to our system; our methods rely instead on concentration and large deviations techniques (on which we already had expertise) in order to combine convergence to equilibrium for the deterministic system with deviations of the stochastic system. Note that A. Muller-Gueudin is also working with A. Debussche and O. Radulescu on a related topic, namely the convergence of a model of cellular biochemical reactions.
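To fix ideas, a noisy predator-prey system of the type alluded to above can be simulated with an Euler-Maruyama scheme. The Lotka-Volterra-type drift, the multiplicative noise and all parameter values below are illustrative assumptions, not the model actually studied.

```python
import numpy as np

def simulate_phage_bacteria(b0=1.0, p0=0.5, r=1.0, a=1.5, c=1.0, m=0.8,
                            sigma=0.05, dt=1e-3, n_steps=10_000, seed=0):
    """Euler-Maruyama scheme for a noisy predator-prey system:
        dB = (r*B - a*B*P) dt + sigma*B dW1   (bacteria, prey)
        dP = (c*B*P - m*P) dt + sigma*P dW2   (phages, predator)
    States are truncated at 0 to keep populations nonnegative."""
    rng = np.random.default_rng(seed)
    b = np.empty(n_steps + 1)
    p = np.empty(n_steps + 1)
    b[0], p[0] = b0, p0
    sqdt = np.sqrt(dt)
    for k in range(n_steps):
        dw1, dw2 = rng.normal(0.0, sqdt, size=2)
        b[k + 1] = max(b[k] + (r * b[k] - a * b[k] * p[k]) * dt
                       + sigma * b[k] * dw1, 0.0)
        p[k + 1] = max(p[k] + (c * b[k] * p[k] - m * p[k]) * dt
                       + sigma * p[k] * dw2, 0.0)
    return b, p

bacteria, phages = simulate_phage_bacteria()
```

For small noise the trajectories oscillate around the deterministic equilibrium, which is the regime where the concentration arguments mentioned above compare the stochastic and deterministic systems.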

*Gaussian signals:*
Nature provides us with many examples of systems such that the observed signal has a given Hölder regularity, which does not correspond to the one we might expect from a system driven by ordinary Brownian motion. This situation is commonly handled by noisy equations driven by Gaussian processes such as fractional Brownian motion or (in higher dimensions of the parameter) fractional fields.

The basic aspects of differential equations driven by fractional Brownian motion (fBm) and other Gaussian processes are now well understood, mainly thanks to the so-called *rough paths* tools, but also via the Russo-Vallois integration techniques. The specific issue of Volterra equations driven by fBm, which is central to the problem of subdiffusion within proteins, has also been addressed.
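For readers unfamiliar with fBm, a sample path can be simulated exactly (on a grid) from its covariance function. The Cholesky approach below is a standard, if naive, method; it is independent of the rough-paths machinery discussed above.

```python
import numpy as np

def fbm_cholesky(n=200, H=0.3, T=1.0, seed=0):
    """Simulate fractional Brownian motion at n grid points of (0, T] by
    Cholesky factorization of its covariance
        R(s, t) = (s^{2H} + t^{2H} - |t - s|^{2H}) / 2.
    Exact in law on the grid, but O(n^3): fine for small n only."""
    t = np.linspace(T / n, T, n)
    s, u = np.meshgrid(t, t, indexing="ij")
    cov = 0.5 * (s ** (2 * H) + u ** (2 * H) - np.abs(s - u) ** (2 * H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))  # tiny jitter for stability
    rng = np.random.default_rng(seed)
    path = L @ rng.standard_normal(n)
    # Prepend the starting point B_H(0) = 0.
    return np.concatenate(([0.0], t)), np.concatenate(([0.0], path))

times, path = fbm_cholesky()
```

Taking the Hurst parameter H below (resp. above) 1/2 produces paths that are rougher (resp. smoother) than ordinary Brownian motion, which is precisely the modeling flexibility exploited here.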

Fractional fields are very often used to model irregular phenomena exhibiting a scale invariance property, fractional Brownian motion being the historical fractional model. Nevertheless, its isotropy is a serious drawback, for instance in hydrology or in medicine. Moreover, fractional Brownian motion cannot be used to model phenomena whose regularity varies with time. Hence, many generalizations of this model (Gaussian or not) have recently been proposed, including Gaussian locally self-similar fields, non-Gaussian models and anisotropic models.

Our team has contributed, and still contributes, to this theoretical study: Hölder continuity, fractal dimensions, existence and uniqueness results for differential equations, and the study of the laws, to quote a few examples. As we shall see below, this line of investigation also has an impact in terms of applications: we discuss below how we plan to apply our results to osteoporosis on the one hand, and to fluctuations within protein molecules on the other.

*Participants: R. Azaïs, T. Bastogne, S. Tindel, P. Vallois, S. Wantz-Mézières*

When one desires to confront theoretical probabilistic models with real data, statistical tools are obviously crucial. We have focused on two of them: parameter identifiability and parameter estimation.

Parameter estimation for a family of probability laws has a very long history in statistics. Moving to the references more closely related to our specific projects, recall first that the mathematical description of photodynamic therapy can be split into three parametric models: the uptake model (pharmacokinetics of the photosensitizing drug into cancer cells), the photoreaction model and the tumor growth model. Several papers have applied system identification techniques to pharmacokinetic modeling problems, but two issues were ignored in these previous works: the presence of timing noise and identification from longitudinal data. We have proposed a bounded-error estimation algorithm based on interval analysis that solves the parameter estimation problem while taking into account the uncertainty on observation time instants. Statistical inference from longitudinal data based on mixed effects models can be performed with the *Monolix* software (http://

A few words should be said about the existing literature on statistical inference for diffusions and related processes, a topic at the heart of three of our projects (photodynamic and bacteriophage therapies, as well as fluctuations within molecules). Monographs covering the basic estimation techniques for diffusion processes are available. The problem of estimating diffusions observed at discrete times, of crucial importance for applications, has been addressed mainly since the mid-90s. Maximum likelihood techniques, which are also classical for parameter estimation, are well represented in the literature.

Some attention has recently been paid to the estimation of the coefficients of fractional or multifractional Brownian motion from a set of observations; several nice surveys are available. On the other hand, the inference problem for diffusions driven by a fractional Brownian motion is still in its infancy: existing references deal with some very particular families of equations, which do not cover the cases of interest to us.

*Participants: J.-M. Monnez*

An R package performing most factorial analysis methods in an online way has been developed by R. Bar and J.-M. Monnez. Starting from a simulated data flow, the main goal of the program is to perform online factorial analyses (Principal Component Analysis, Canonical Correlation Analysis, Canonical Discriminant Analysis, Correspondence Analysis). The data are assumed to be independent and identically distributed observations of a random vector (whose distribution is a priori unknown). By defining stochastic approximation processes, the procedure is adaptive in the sense that the results of the analyses are updated recursively each time a new data point is taken into account.

From a theoretical point of view, the i.i.d. case has recently been extended to the case where the expectation and/or covariance matrix of the random vector varies with time. We plan to include these improvements in our software.

*Participants: J.-M. Monnez*

An R package called SesIndexCreatoR has been written by B. Lalloué and J.-M. Monnez to implement our socio-economic index for health inequalities. Version 1.0 of this package is freely available on the website of the Equit'Area project: http://

*Participants: T. Bastogne*

The software *Angio-Analytics* has been developed by J.-B. Tylcz, E. Djermoune and T. Bastogne. This tool allows the pharmacodynamic characterization of anti-vascular effects in anti-cancer treatments. It uses time series of *in vivo* images provided by intra-vital microscopy; such images are obtained using skinfold chambers placed on mouse skin, as illustrated in Fig. . The automated analysis is split into two steps that were previously performed separately and completely by hand. The first step corresponds to image processing to identify characteristics of the vascular network, as illustrated in Fig. . The second step is the system identification of the pharmacodynamic response and the statistical analysis of the model parameters, as shown in Fig. and Fig. . An article submitted to the journal Biomedical Signal Processing and Control is currently under revision. Moreover, the current version of the software has been registered with the *Agence de Protection des Programmes*.

*Participants: T. Bastogne*

More than eight million people die from cancer worldwide each year. Current treatments such as chemotherapy and radiotherapy are still limited in terms of benefit/risk ratio. Nevertheless, engineered nanoparticles have opened interesting new perspectives in oncology, as emphasized by Brigger et al. since 2002. One of these promising solutions is based on the development of nanoparticles able to enhance the cytotoxic effect of radiotherapy. However, preclinical development in nano-medicine is slow, risky and expensive. Recently, Etheridge et al. (2013) highlighted the fact that many of the revolutionary nano-medicine technologies anticipated in the literature may be 20 or more years from clinical use. To speed up the preclinical development of medical engineered nanomaterials, we have designed an integrated computing platform dedicated to the virtual screening of nanostructured materials activated by X-rays, making it possible to select nano-objects with interesting medical properties more rapidly. This innovation combines stochastic simulation and statistical modeling to estimate the impact of each design parameter describing the nano-object, allowing us to optimize composition factors and suggest one or a few promising architectures for the medical purpose at hand. The main advantage of this *in silico* design approach is the ability to virtually screen many possible formulations and rapidly select the most promising ones. The platform can currently handle the accelerated design of radiation therapy enhancing nanoparticles and nano-sized contrast agents for medical imaging, as well as the comparison between nano-objects and the optimization of existing materials. Other applications related to nano-medicine will be the subject of further developments (e.g., photodynamic therapy).
This contribution received the best innovation award from the Institut Mines-Telecom in 2014, and application results will be presented at the 36th PAMM-EORTC Winter Meeting in January 2015.

*Participants: K. Duarte, S. Ferrigno, J.-M. Monnez, A. Muller-Gueudin, S. Tindel*

Consider a data stream in which each data vector is a realization of a random vector whose expectation varies with time, the law of the centered data vector being stationary, and consider the principal component analysis (PCA) of this centered vector, called partial PCA. In this study we define online estimates of the first principal axes by stochastic approximation processes using, at each step, either a batch of data or all the data up to the current step. This extends a former result obtained by J.-M. Monnez using one data vector at each step. It is applied to partial generalized canonical correlation analysis by defining a stochastic approximation process for the metric involved in this case, using all the data up to the current step. If the expectation of the data vector varies according to a linear model, a stochastic approximation process for the model parameters is used. All these processes can be performed in parallel. A forthcoming preprint by R. Bar and J.-M. Monnez will discuss these aspects.
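As a minimal sketch of the stochastic-approximation idea underlying these online PCA processes, here is the basic one-observation-per-step scheme (an Oja-type update) that the batch variants described above extend. The step-size schedule and the simulated data are illustrative choices.

```python
import numpy as np

def oja_first_axis(stream, dim, step=lambda k: 1.0 / (100.0 + k)):
    """Oja-type stochastic approximation of the first principal axis:
    v <- normalize(v + gamma_k * x (x . v)), one observation per step."""
    v = np.ones(dim) / np.sqrt(dim)
    for k, x in enumerate(stream):
        v = v + step(k) * x * (x @ v)
        v = v / np.linalg.norm(v)
    return v

# Simulated stream whose dominant axis is e1 (standard deviations 3 vs 1).
rng = np.random.default_rng(0)
data = rng.standard_normal((20_000, 2)) * np.array([3.0, 1.0])
v = oja_first_axis(iter(data), dim=2)
```

Each data vector is used once and discarded, so the memory footprint is independent of the stream length; the estimated axis aligns with the direction of largest variance.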

Everyone is subject to environmental exposures from various sources, with negative health impacts (air, water and soil contamination, noise, etc.) or with positive effects (e.g., green space). Studies considering such complex environmental settings in a global manner are rare. We propose to use statistical factor and cluster analyses to create a composite exposure index with a data-driven approach, in order to assess the environmental burden experienced by populations. The study was carried out in the Great Lyon area (France, 1.2M inhabitants) at the census block group (BG) scale. As environmental indicators we used ambient air NO2 annual concentrations, noise levels, and proximity to green spaces, industrial plants, polluted sites and road traffic. Although it cannot be applied directly to risk or health effect assessment, the resulting index can help identify hot spots of cumulative exposure, prioritize urban policies, or compare the environmental burden across study areas in an epidemiological framework.

In supervised learning, the number of values of a response variable to predict can be high; clustering them into a few groups can thus be useful to perform a relevant supervised classification analysis. On the other hand, selecting relevant covariates is a crucial step in building robust and efficient prediction models, especially when too many covariates are available relative to the overall sample size. As a first attempt to solve these problems, we had already devised in a previous study an algorithm that simultaneously clusters the levels of a categorical response variable into a limited number of groups and selects the best covariates forward by alternate minimization of Wilks's Lambda. In the project carried out this year, we first extended the former version of the algorithm to a more general framework in which Wilks's Lambda can be replaced by any model selection criterion. We also turned forward selection into stepwise selection in order to remove covariates when necessary. Finally, an application of our algorithm to real datasets from peanut allergy studies confirmed some previously published results and suggested new discoveries. The possibilities of this algorithm are promising, and we hope it will be useful to many practitioners.

We describe here an application-oriented study led jointly by J.-M. Monnez and a medical team under the supervision of E. Albuisson at CHU Brabois. The objective is to assess the prognostic value of estimates of volemia, or of their variations, beyond clinical examination, in a post-hoc analysis of the Eplerenone Post-Acute Myocardial Infarction (AMI) Heart Failure (HF) Efficacy and Survival Study (EPHESUS). Assessing congestion post-discharge is indeed challenging but of paramount importance to optimize patient management and prevent hospital readmissions. The analysis was performed on a subset of 4957 patients with available data (within a full dataset of 6632 patients). The study endpoint was cardiovascular death and/or hospitalization for HF between month 1 and month 3 after post-AMI HF. Plasma volume variations between baseline and month 1 were estimated by the Strauss formula, which involves hemoglobin and hematocrit ratios. Other potential predictors, including congestion surrogates, hemodynamic and renal variables, and medical history variables, were tested. An instantaneous estimate of plasma volume at month 1, ePVS M1, was defined and also tested. Multivariate analysis was performed using stepwise logistic regression and linear discriminant analysis. In HF complicating MI, congestion assessed by the Strauss formula and an instantaneous derived measurement of plasma volume displayed added predictive value for early cardiovascular events, beyond routine clinical assessment. Trials assessing congestion management guided by this simple tool to monitor plasma volume are warranted.
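The Strauss formula is simple enough to state in code. The sketch below uses the commonly cited convention (baseline vs follow-up hemoglobin and hematocrit); the exact units and the instantaneous ePVS convention used in the study may differ, so both functions should be read as illustrative.

```python
def strauss_delta_pv(hb0, hct0, hb1, hct1):
    """Percent plasma volume change between baseline (0) and follow-up (1),
    Strauss formula. Hemoglobin in g/dL, hematocrit as a fraction."""
    return 100.0 * (hb0 / hb1) * ((1.0 - hct1) / (1.0 - hct0)) - 100.0

def instantaneous_epvs(hb, hct):
    """Instantaneous estimated plasma volume status (illustrative convention):
    (100 - hematocrit in %) / hemoglobin in g/dL."""
    return (100.0 - 100.0 * hct) / hb
```

A drop in both hemoglobin and hematocrit between the two visits yields a positive value, i.e., an estimated plasma volume expansion (congestion).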

This project fits into the global aim of improving local regression techniques. Indeed, we propose to study the local linear estimator of the conditional distribution function. Namely, having an i.i.d. sample

where

This estimator is a particular case of the local polynomial estimators. It is the local polynomial estimator of order

We are interested in showing the advantage of this estimator over the Nadaraya-Watson estimator. We prove asymptotic results for our estimator (exact rate of uniform consistency) and also establish uniform asymptotic certainty bands for the conditional cumulative distribution function.

We obtain the following result under some assumptions on the cumulative distribution

where

As corollaries of this result, we extend our results to other statistical functions, such as the quantiles and the regression function.

We illustrate our results with simulations and an application on foetopathologic data.
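A minimal sketch of the two competing estimators of the conditional distribution function (Nadaraya-Watson versus local linear) may help fix ideas; the Gaussian kernel, the bandwidth and the simulated data are illustrative choices, not those of the cited works.

```python
import numpy as np

def _kern(u):
    return np.exp(-0.5 * u ** 2)  # Gaussian kernel

def nw_cond_cdf(x0, y0, X, Y, h):
    """Nadaraya-Watson estimate of F(y0 | x0): kernel-weighted average
    of the indicators 1{Y_i <= y0}."""
    w = _kern((X - x0) / h)
    return float(np.sum(w * (Y <= y0)) / np.sum(w))

def ll_cond_cdf(x0, y0, X, Y, h):
    """Local linear estimate of F(y0 | x0): weighted least-squares fit of
    1{Y_i <= y0} on (X_i - x0); the fitted intercept is the estimate."""
    w = np.sqrt(_kern((X - x0) / h))
    A = np.column_stack([np.ones_like(X), X - x0])
    ind = (Y <= y0).astype(float)
    beta, *_ = np.linalg.lstsq(A * w[:, None], ind * w, rcond=None)
    return float(np.clip(beta[0], 0.0, 1.0))

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, 2000)
Y = X + 0.1 * rng.standard_normal(2000)
est_nw = nw_cond_cdf(0.0, 0.0, X, Y, h=0.2)  # true F(0 | 0) = 0.5
est_ll = ll_cond_cdf(0.0, 0.0, X, Y, h=0.2)
```

The local linear fit reduces the boundary and design bias of the Nadaraya-Watson average, which is the advantage alluded to above.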

We have also started a study of the regression function in the application to foetopathologic data. We consider the nonparametric model

where

We describe here the beginning of a collaboration between A. Gueudin, R. Azaïs and automatic control researchers in Nancy.

We consider networks, modeled as a graph with nodes and edges representing the agents and their interconnections, respectively. The connectivity of the network, persistence of links and interactions reciprocity influence the convergence speed towards a consensus.

The problem of consensus or synchronization is motivated by various applications such as communication networks, power and transport grids, decentralized computing networks, and social or biological networks.

We then consider networks of interconnected dynamical systems, called agents, partitioned into several clusters. Most agents can only update their state in a continuous way, using only the states of agents in the same cluster. On top of this, a few agents also have the peculiarity of rarely updating their state in a discrete way, by resetting it using states of agents outside their clusters. In social networks, the opinion of each individual evolves by taking into account the opinions of the members of its community; nevertheless, one or several individuals can change their opinion by interacting with individuals outside their community. These inter-cluster interactions can be seen as resets of the opinions. This leads to a network dynamics expressed in terms of reset systems. We assume that the reset instants arrive stochastically, following a Poisson renewal process.
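The dynamics just described can be sketched as follows: continuous inner-cluster averaging discretized by an Euler scheme, plus a single resetting agent whose reset times follow a Poisson process. The graph structure, rates and parameters below are illustrative assumptions.

```python
import numpy as np

def clustered_consensus(x0, clusters, reset_agent, reset_source,
                        rate=0.5, dt=0.01, T=200.0, seed=0):
    """Inner-cluster continuous consensus plus rare stochastic resets:
    - within each cluster c, dx_i = sum_{j in c} (x_j - x_i) dt;
    - agent `reset_agent` resets its state to that of `reset_source`
      (an agent outside its cluster) at the jumps of a Poisson process
      of intensity `rate` (approximated on the Euler grid)."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(int(T / dt)):
        new = x.copy()
        for c in clusters:
            xc = x[c]
            new[c] = xc + dt * len(c) * (xc.mean() - xc)
        x = new
        if rng.random() < rate * dt:  # a reset occurs in [t, t + dt)
            x[reset_agent] = x[reset_source]
    return x

# Two clusters {0,1,2} and {3,4}; agent 0 rarely resets from agent 3.
x = clustered_consensus([0.0, 0.0, 0.0, 10.0, 10.0], [[0, 1, 2], [3, 4]],
                        reset_agent=0, reset_source=3)
```

Although inter-cluster communication is rare, the resets are enough to drive the first cluster toward the opinion of the second, illustrating how sparse stochastic resets can yield global consensus.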

A cancer tumor can be represented, for simplicity, as an aggregate of cancer cells, each cell behaving according to the same discrete model and independently of the others. To measure the evolution of its size, it is therefore natural to use tools from population dynamics, for instance the logistic model. This deterministic framework is well known, but its stochastic counterpart is also worthy of interest. We are currently studying a model in which we suppose that the size

where
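A stochastic logistic dynamic of the kind considered here can be sketched with an Euler-Maruyama scheme. The multiplicative-noise form and the parameter values below are assumptions for illustration, not the model under study.

```python
import numpy as np

def stochastic_logistic(x0=1.0, r=1.0, K=100.0, sigma=0.1,
                        dt=1e-3, n_steps=20_000, seed=0):
    """Euler-Maruyama scheme for dX = r X (1 - X/K) dt + sigma X dW,
    truncated at 0 so the size stays nonnegative."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    sqdt = np.sqrt(dt)
    for k in range(n_steps):
        dw = rng.normal(0.0, sqdt)
        x[k + 1] = max(x[k] + r * x[k] * (1.0 - x[k] / K) * dt
                       + sigma * x[k] * dw, 0.0)
    return x

size = stochastic_logistic()
```

For small noise the trajectory grows from the initial size and then fluctuates around the carrying capacity K, mirroring the deterministic logistic curve.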

Hermine Biermé (Tours) and Céline Lacaux are continuing their collaboration on the study of anisotropic random fields. They have extended their previous works to the framework of conditionally sub-Gaussian random series. For such anisotropic fields, they have obtained a modulus of continuity and a rate of uniform convergence. Their framework allows one to study, e.g., Gaussian fields, stable random fields and multi-stable random fields.

As mentioned in the *Scientific Foundations* Section, the problem of estimating the coefficients of a general differential equation driven by a Gaussian process is still largely unsolved. To be more specific, the most general (

where the applications we have in mind (see the *Application Domains* Section) require the analysis of the following

where

To this aim, here are the steps we have focused on in 2014:

A better understanding of the underlying rough path structure for equation (). This includes two studies on differential systems driven by general Gaussian noises in infinite dimensions: one on the parabolic Anderson model, and one on viscosity solutions in the rough paths setting.

A study of densities for general systems driven by Gaussian noises.

Ergodic aspects, which are another important ingredient of estimation procedures for stochastic differential equations, have also been handled.

In extreme value theory, one of the major topics is the study of the limiting behavior of the partial maxima of a stationary sequence. When this sequence is i.i.d., the unique limiting process is well known and called the extremal process. For a long memory stable sequence, the limiting process is obtained as a simple power time change of the extremal process. Céline Lacaux and Gennady Samorodnitsky have proved that this limiting process can also be interpreted as the restriction of a self-affine random sup measure. In addition, they have established that this random measure arises as a limit of the partial maxima of the same long memory stable sequence, but in a different space. Their results open the way to proposing new self-similar processes with stationary max-increments.

In a recent work, Godin and Ferraro designed a method to compress tree structures and quantify their degree of self-nestedness. This method is based on the detection of isomorphic subtrees in a given tree and on the construction of a DAG, equivalent to the original tree, in which a given subtree class is represented only once (compression is based on the suppression of structural redundancies in the original tree). In the compressed graph, every node representing a particular subtree of the original tree has exactly the same height as its corresponding node in the original tree.
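The detection of isomorphic subtrees can be sketched with a bottom-up canonicalization of subtree signatures; each distinct signature becomes one node of the DAG. The nested-list tree encoding below is a simplification of the tree structures actually handled.

```python
def compress_to_dag(tree):
    """Merge isomorphic (unordered) subtrees of `tree` into single DAG nodes.
    A tree is encoded as a nested list of children, a leaf being [].
    Returns the mapping {canonical signature -> DAG node id} and the
    class id of the root."""
    classes = {}

    def visit(node):
        # Canonical signature: sorted tuple of the children's class ids.
        sig = tuple(sorted(visit(child) for child in node))
        if sig not in classes:
            classes[sig] = len(classes)  # first occurrence: new DAG node
        return classes[sig]

    return classes, visit(tree)

# A perfectly self-nested tree: 7 nodes compress to a 3-node DAG
# (one class per height: leaf, cherry, root).
tree = [[[], []], [[], []]]
classes, root = compress_to_dag(tree)
print(len(classes))  # → 3
```

The more self-nested the tree, the fewer subtree classes it contains, so the size of the compressed DAG is a natural proxy for the degree of self-nestedness discussed below.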

The degree of self-nestedness is defined as the edit distance between the considered tree structure and its nearest embedded self-nested version. Indeed, finding the nearest self-nested tree of a structure without further assumptions is conjectured to be an NP-complete or NP-hard problem. We have thus designed a heuristic method based on interacting simulated annealing algorithms to tackle this difficult question. This procedure is also a keystone of a new topological clustering algorithm for trees that we propose in this work. In addition, we obtain new theoretical results on the combinatorics of self-nested structures. For instance, we have shown that the number

In particular, the cardinality

when

Statistical inference for piecewise-deterministic Markov processes has been extensively investigated in recent years under ergodicity conditions. Our paper is dedicated to a statistical approach for a particular non-ergodic growth-fragmentation model for which the set

We establish that the absorption probability

where

We have shown the convergence in probability of the proposed estimators under some usual asymptotic conditions. In particular, we have,

when

Recent developments in engineered multifunctional nanomaterials have opened new perspectives in oncology. But the assessment of both quality and safety in nanomedicine requires new methods for biological characterization. We have recently proposed a new model-based approach for the pre-characterization of the pharmacokinetics of multifunctional nanomaterials in small-scale in vivo studies. Two multifunctional nanoparticles, with and without active targeting, designed for photodynamic therapy guided by magnetic resonance imaging, are used to exemplify the method. It allows the experimenter to rapidly test and select the most relevant pharmacokinetic (PK) model structure to be used in the subsequent explanatory studies. We also show that the model parameters estimated from the in vivo responses provide relevant preliminary information about tumor uptake, elimination rate and residual storage. For some parameters, the accuracy of the estimates is good enough to compare and draw significant pre-conclusions. A third advantage of this approach is the possibility of optimally refining the in vivo protocol for the subsequent explanatory and confirmatory studies, in compliance with the 3Rs (reduction, refinement, replacement) ethical recommendations. More precisely, we show that the identified model may be used to select the appropriate duration of the magnetic resonance imaging sessions planned for the subsequent studies. The proposed methodology integrates magnetic resonance image processing, continuous-time system identification algorithms and statistical analysis. Except for the choice of the model parameters to be compared and interpreted, most of the processing procedure may be automated, so as to speed up the PK characterization process at an early stage of experimentation.

More specifically, our efforts have been split into the following tasks:

Obstacles and challenges to the clinical use of photodynamic therapy (PDT) are numerous: large inter-individual variability, heterogeneity of therapeutic predictability, lack of in vivo monitoring of reactive oxygen species (ROS) production, etc. All of these factors affect in their own way the therapeutic response and can lead to considerable uncertainty about its efficacy. To deal with these sources of variability, we have designed and developed an innovative technology able to adapt the width of light impulses in real time during photodynamic therapy. The first objective is to accurately control the photobleaching trajectory of the photosensitizer (PS) during the treatment, with the subsequent goal of improving the efficacy and reproducibility of this therapy. In this approach, the physician defines a priori the expected trajectory to be tracked by the photosensitizer photobleaching during the treatment. The photobleaching state of the PS is regularly measured during the treatment session and is used to change the illumination signal in real time. This adaptive scheme of photodynamic therapy has been implemented, tested and validated in in vitro experiments. These tests show that controlling the photobleaching trajectory is possible, confirming the technical feasibility of such an approach for dealing with inter-individual variability in PDT. These results open new perspectives, since the illumination signal can differ from one patient to another according to individual response. This study has shown promising results in an in vitro context, which remain to be confirmed by the ongoing in vivo experiments. In the near future, the proposed solution could lead to an optimized and personalized PDT. A patent has subsequently been filed. Collaboration with CRAN (Nancy).
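The closed-loop principle can be illustrated with a toy model: an assumed first-order photobleaching law together with a simple proportional control of the impulse width. Both the bleaching model and the control law below are hypothetical simplifications, not the controller actually implemented.

```python
import numpy as np

def adaptive_illumination(ref, k_bleach=0.8, kp=20.0, dt=0.1):
    """Toy closed loop: the photosensitizer state s is assumed to bleach as
    ds/dt = -k_bleach * u * s, where u in [0, 1] is the duty cycle (impulse
    width) of the light source. A proportional controller chooses u at each
    sampling instant so that s tracks the prescribed trajectory `ref`."""
    s = float(ref[0])
    states, duties = [s], []
    for target in ref[1:]:
        err = s - target                      # above target: bleach faster
        u = float(np.clip(kp * err, 0.0, 1.0))
        s += dt * (-k_bleach * u * s)         # forward-Euler bleaching model
        states.append(s)
        duties.append(u)
    return np.array(states), np.array(duties)

# Track an exponential photobleaching reference s_ref(t) = exp(-0.3 t).
t = np.arange(0.0, 10.0, 0.1)
ref = np.exp(-0.3 * t)
s, u = adaptive_illumination(ref)
```

Since the controller only ever reacts to the measured photobleaching state, the resulting illumination sequence automatically differs between "patients" with different bleaching rates, which is the personalization argument made above.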

These communications present successful applications of model-based design of nanoparticles. This approach is based on statistical design of experiments and black-box modeling in cell biology. The associated know-how has been transferred to the start-up CYBERnano. Collaboration with CEA LETI and INSERM (Grenoble).

*Start-up project by T. Bastogne:*

Industrial partner: CYBERnano (Contract Research Organization in NanoMedicine).

Status: SAS created in July 2013.

Comments: a research engineer, hired in November 2014, has been working at CYBERnano to develop and implement new algorithms for biological signal processing.

*Project "Handle your heart":* creation of drug prescription support software for the treatment of heart failure, in collaboration with the University Hospital of Nancy; headed by J.-M. Monnez.

*Truffinet* (2014), TRUFFles’ microbial interaction Inference by NETwork analysis, Funding organism: PEPS CNRS-Université de Lorraine, Leader: A. Muller-Gueudin. Collaboration with IECL (Anne Gégout-Petit), CRAN (S. Martin, C. Morarescu), INRA (A. Deveau).

*Optique-PDT* (2012-2014), mOdélisation et oPTimisation de l'Irradiance dans les tissus biologiQUEs hétérogènes traités par Thérapie PhotoDynamique interstitielle (modeling and optimization of irradiance in heterogeneous biological tissues treated by interstitial photodynamic therapy), Funding organism: *PEPS CNRS-INSERM-Inria*, Leader: M. Thomassin (CRAN, U. Lorraine).

*Nano-Xrays* (2011-2014), Nanoparticles-based X ray-induced photodynamic therapy in glioblastoma multiforme, Funding organism: Institut National du Cancer (INCa), Leader: M. Barberi-Heyob (CRAN, U. Lorraine), T. Bastogne.

GDR 3477 Géométrie Stochastique, Leaders: Pierre Calka, David Coupier, Viet Chi Tran, C. Lacaux.

GDR 3475 Analyse Multifractale, Leader: Stéphane Jaffard (C. Lacaux).

*PhotoBrain* (2015-17), AGuIX® theranostic nanoparticles for vascular-targeted interstitial photodynamic therapy of brain tumors, Funding organism: EuroNanoMed II, Leader: M. Barberi-Heyob (CRAN).

(2014-16), A library of Near-InfraRed absorbing photosensitizers: tailoring and assessing photophysical and synergetic photodynamic properties, Funding organism: PHC Bosphore - Campus France, Leader: M. Barberi-Heyob (CRAN).

2014/05/11-2014/05/25: visit of Gennady Samorodnitsky (Cornell, USA) to C. Lacaux.

S. Tindel was on sabbatical at the University of Kansas from August 2013 to June 2014, working on inference for Gaussian systems with D. Nualart and Y. Hu.

Co-organizer of the Journée Fédération Charles Hermite - entreprises, on 2014/01/23, with 150 participants and 44 companies, coming mainly from Lorraine and a few from Luxembourg (P. Vallois).

Organizer of the meeting “Around Hidden Markov Chains”, at Inria Grand-Est, on 2014/07/08 (P. Vallois).

Co-organizer of the workshop “PL

Organizer of the weekly seminar of the Probability and Statistics group of the Institut Élie Cartan de Lorraine since October 2014 (R. Azaïs).

BIGS is a team whose composition included only University staff until October 2014. All members teach numerous courses, ranging from the L1 to the M2 level.

Samy Tindel (192h, Université de Lorraine)

Thierry Bastogne (192h, Université de Lorraine).

Sandie Ferrigno (192h, Université de Lorraine)

Céline Lacaux (192h, Université de Lorraine)

Jean-Marie Monnez (192h, Université de Lorraine)

Aurélie Muller-Gueudin (192h, Université de Lorraine)

Pierre Vallois (192h, University)

Sophie Wantz-Mézières (192h, IUT)

PhD: Geoffrey Nichil, Provisionnement en assurance non vie pour des contrats à maturité longue et à prime unique - Application à la réforme Solvabilité 2 (reserving in non-life insurance for long-maturity, single-premium contracts, with application to the Solvency 2 reform), Université de Lorraine, 2014/12/19. Advisors: P. Vallois, S. Herrmann, M. de Calbiac.

PhD committee: Clémence Chamard-Jovenin, Modélisation du rôle d'ER

PhD: Raouf Fakhfakh, Contribution to the study of Cauchy-Stieltjes kernel families, Université de Sfax, 2014/10/02, referee: P. Vallois.

PhD: Raghid Zeineddine, Sur des nouvelles formules d'Itô en loi (on new Itô formulas in law), Université de Lorraine, 2014/12/01, examiner: P. Vallois.

PhD: Samuel Ronsin, Régularité et Représentations Localisées de Textures à Phases Aléatoires (regularity and localized representations of random-phase textures), MAP 5, Université Paris Descartes, 2014/12/15, examiner: C. Lacaux.

Fête de la Science, Nancy, École des Mines, 2014/10/17 (A. Muller-Gueudin, S. Tindel, P. Vallois).

Forum Emploi Maths 4, December 2014 (C. Lacaux).

Advisor of a group of students, "La main à la Pâte" project, Institut médico-éducatif (IME), Commercy, September-December 2014 (S. Ferrigno).

Advisor of a group of pupils from Lycée Varoquaux, Tomblaine (Maths en Jeans) (P. Vallois).

Publication of the short science popularization article “Les huîtres ont des oreilles” (“Oysters have ears”) in the book “Brèves de maths – Mathématiques de la planète Terre”, Editions Nouveau Monde, 2014 (pages 214–215). Authors: Romain Azaïs, Raphaël Coudret and Gilles Durrieu.

C. Lacaux is a member of the *Conseil National des Universités (section 26)* (2011–2015).