MathNeuro focuses on the applications of multi-scale dynamics to neuroscience. This involves the modelling and analysis of systems with multiple time and space scales, as well as stochastic effects. We study single-cell models, microcircuits and large networks. In terms of neuroscience, we are mainly interested in questions related to synaptic plasticity and neuronal excitability, in particular in the context of pathological states such as epileptic seizures and neurodegenerative diseases such as Alzheimer's disease.

Our work is quite mathematical but we make heavy use of computers for numerical experiments and simulations. We have close ties with several top groups in biological neuroscience. We are pursuing the idea that the "unreasonable effectiveness of mathematics" can be brought, as it has been in physics, to bear on neuroscience.

Modeling such assemblies of neurons and simulating their behavior involves putting together a mixture of the most recent results in neurophysiology with such advanced mathematical methods as dynamical systems theory, bifurcation theory, probability theory, stochastic calculus, theoretical physics and statistics, as well as the use of simulation tools.

We conduct research in the following main areas:

Neural networks dynamics

Mean-field and stochastic approaches

Neural fields

Slow-fast dynamics in neuronal models

Modeling neuronal excitability

Synaptic plasticity

The study of neural networks is certainly motivated by the long-term goal of understanding how the brain works. But, beyond the comprehension of the brain, or even of simpler neural systems in less evolved animals, there is also the desire to exhibit general mechanisms or principles at work in the nervous system. One possible strategy is to propose mathematical models of neural activity, at different space and time scales, depending on the type of phenomena under consideration. However, beyond the mere proposal of new models, which can rapidly result in a plethora, there is also a need to understand some fundamental keys ruling the behaviour of neural networks and, from this, to extract new ideas that can be tested in real experiments. Therefore, a thorough analysis of these models is needed. An efficient approach, developed in our team, consists of analysing neural networks as dynamical systems. This allows us to address several issues. A first, natural issue is to ask about the (generic) dynamics exhibited by the system when control parameters vary. This naturally leads to analysing the bifurcations occurring in the network and determining which phenomenological parameters control these bifurcations. Another issue concerns the interplay between neuron dynamics and synaptic network structure.
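As a toy illustration of this dynamical-systems viewpoint (a generic textbook example, not one of the team's models), consider a single recurrent rate unit x' = -x + tanh(w x): when the self-coupling w crosses 1, the quiescent state loses stability through a pitchfork bifurcation and a nontrivial steady state appears. A minimal numerical check:

```python
import math

def steady_state(w, x=0.5, steps=10000, dt=0.1):
    """Integrate x' = -x + tanh(w*x) by forward Euler from x(0) = 0.5."""
    for _ in range(steps):
        x += dt * (-x + math.tanh(w * x))
    return x

low = steady_state(0.5)    # below the bifurcation: activity dies out
high = steady_state(1.5)   # above it: a nontrivial active state appears
print(low, high)
```

Varying the control parameter w and recording the reached steady state is the simplest instance of the bifurcation diagrams computed for full network models.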

Modeling neural activity at scales integrating the effect of thousands of neurons is of central importance for several reasons. First, most imaging techniques cannot measure the activity of individual neurons (the microscopic scale), but instead measure mesoscopic effects resulting from the activity of several hundreds to several hundreds of thousands of neurons. Second, anatomical data recorded in the cortex reveal the existence of structures, such as the cortical columns, with a diameter of about 50

Our group is developing mathematical and numerical methods allowing on the one hand to produce dynamic mean-field equations from the physiological characteristics of neural structures (neuron types, synapse types and anatomical connectivity between neuron populations), and on the other hand to simulate these equations; see Figure . These methods use tools from advanced probability theory, such as the theory of large deviations and the study of interacting diffusions.
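As a purely illustrative sketch of the kind of dynamic mean-field (rate) equations involved, here is a forward-Euler integration of the classical two-population Wilson-Cowan system for an excitatory (E) and an inhibitory (I) population; the parameter values and the sigmoid are illustrative choices, not taken from any specific study of the team.

```python
import numpy as np

def f(x):
    """Sigmoidal population activation function."""
    return 1.0 / (1.0 + np.exp(-x))

def wilson_cowan(E, I, wEE=16.0, wEI=12.0, wIE=15.0, wII=3.0,
                 hE=1.0, hI=-3.0, tauE=1.0, tauI=1.0):
    """Right-hand side of the two-population (E/I) rate equations."""
    dE = (-E + f(wEE * E - wEI * I + hE)) / tauE
    dI = (-I + f(wIE * E - wII * I + hI)) / tauI
    return dE, dI

# forward-Euler time stepping
E, I, dt = 0.1, 0.1, 0.01
for _ in range(20000):
    dE, dI = wilson_cowan(E, I)
    E, I = E + dt * dE, I + dt * dI

print(E, I)   # population activities remain in (0, 1) by construction of f
```

Depending on the coupling weights, such a system settles to a fixed point or to population-level oscillations, which is precisely the kind of behaviour the mean-field analysis classifies.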

Neural fields are a phenomenological way of describing the activity of populations of neurons by delayed integro-differential equations. This continuous approximation turns out to be very useful to model large brain areas such as those involved in visual perception. The mathematical properties of these equations and their solutions are still imperfectly known, in particular in the presence of delays, different time scales and noise.
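In their simplest deterministic, delay-free form, such equations read u_t = -u + ∫ w(x - y) S(u(y)) dy + I(x). A discretized sketch on a ring, with an illustrative "Mexican hat" kernel (local excitation, surround inhibition) and a localized input, shows the formation of a bump of activity; all numerical values are illustrative.

```python
import numpy as np

# ring domain discretized into n points
n = 200
x = np.linspace(-np.pi, np.pi, n, endpoint=False)
dx = x[1] - x[0]

# "Mexican hat" connectivity: short-range excitation, longer-range inhibition
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, 2 * np.pi - d)                       # distance on the ring
W = (3.0 * np.exp(-d**2 / 0.5) - 1.5 * np.exp(-d**2 / 2.0)) * dx

S = lambda u: 1.0 / (1.0 + np.exp(-10.0 * (u - 0.2)))  # firing-rate function
I = 2.0 * np.exp(-x**2 / 0.2)                          # localized stimulus

u = np.zeros(n)
dt = 0.05
for _ in range(2000):                                  # u_t = -u + W S(u) + I
    u = u + dt * (-u + W @ S(u) + I)

print(x[np.argmax(u)], u.max())   # a bump of activity centered on the stimulus
```

The delayed and stochastic versions studied by the team replace the instantaneous interaction W @ S(u) by a delayed one and add a noise term, which substantially complicates the analysis.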

Our group is developing mathematical and numerical methods for analysing these equations. These methods are based upon techniques from functional analysis, bifurcation theory, equivariant bifurcation analysis, delay equations, and stochastic partial differential equations. We have been able to characterize the solutions of these neural field equations and their bifurcations, and to apply and expand the theory to account for such perceptual phenomena as edge, texture, and motion perception. We have also developed a theory of the delayed neural field equations, in particular in the case of constant delays and propagation delays that must be taken into account when attempting to model large cortical areas. This theory is based on center manifold and normal form ideas.

Neuronal rhythms typically display many different time scales; it is therefore important to incorporate this slow-fast aspect in models. We are interested in this modelling paradigm, where slow-fast point models, using ordinary differential equations (ODEs), are investigated in terms of their bifurcation structure and the patterns of oscillatory solutions they can produce. To gain insight into the dynamics of such systems, we use a mix of theoretical techniques, such as geometric desingularisation and centre manifold reduction, and numerical methods such as pseudo-arclength continuation. We are interested in families of complex oscillations generated by both mathematical and biophysical models of neurons, in particular so-called *mixed-mode oscillations (MMOs)*, which represent an alternation between subthreshold and spiking behaviour, and *bursting oscillations*, which also correspond to experimentally observed behaviour; see Figure . We are working on extending these results to spatially extended neural models.
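For illustration, here is a crude Euler simulation of the Hindmarsh-Rose model, a classical three-dimensional slow-fast burster with standard textbook parameters (a generic example, not one of the models studied by the team): the slow variable z switches the fast subsystem between spiking and quiescent phases, and the spike times reveal the bursting pattern through the alternation of short intra-burst and long inter-burst intervals.

```python
def hr_step(x, y, z, dt=0.005, I=2.0, r=0.006, s=4.0, x0=-1.6):
    """One forward-Euler step of the Hindmarsh-Rose burster."""
    dx = y - x**3 + 3 * x**2 + I - z
    dy = 1 - 5 * x**2 - y
    dz = r * (s * (x - x0) - z)      # z evolves on a much slower time scale
    return x + dt * dx, y + dt * dy, z + dt * dz

x, y, z = -1.0, 0.0, 2.0
spikes, t, prev = [], 0.0, -1.0
for _ in range(400000):              # T = 2000 time units
    x, y, z = hr_step(x, y, z)
    t += 0.005
    if prev <= 1.0 < x:              # upward crossing of x = 1 marks a spike
        spikes.append(t)
    prev = x

print(len(spikes))
```

Plotting x(t) would show square-wave bursting; continuation tools such as pseudo-arclength continuation are then used to track how these solution branches deform as I or r varies.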

Excitability refers to the all-or-none property of neurons: the ability to respond nonlinearly to an input, with a dramatic change from "none" (no response except a small perturbation that returns to equilibrium) to "all" (a large response, with the generation of an action potential or spike, before the neuron returns to equilibrium). The return to equilibrium may also be an oscillatory motion of small amplitude; in this case, one speaks of resonator neurons, as opposed to integrator neurons. The combination of a spike followed by subthreshold oscillations is then often referred to as mixed-mode oscillations (MMOs). Slow-fast ODE models of dimension at least three are well capable of reproducing such complex neural oscillations. Part of our research expertise is to analyse the possible transitions between different complex oscillatory patterns of this sort upon input change; in mathematical terms, this corresponds to understanding the bifurcation structure of the model. Furthermore, the shape of time series with a given oscillatory pattern can be analysed within the mathematical framework of dynamic bifurcations; see the section on slow-fast dynamics in neuronal models. The main example of abnormal neuronal excitability is hyperexcitability, and it is important to understand the biological factors that lead to such an excess of excitability and to identify (both in detailed biophysical models and in reduced phenomenological ones) the mathematical structures leading to these anomalies. Hyperexcitability is one important trigger for pathological brain states related to various diseases such as chronic migraine, epilepsy or even Alzheimer's disease. A central axis of research within our group is to revisit models of such pathological scenarios, using a combination of advanced mathematical tools and in partnership with biological labs.
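The all-or-none property can be seen directly in the FitzHugh-Nagumo caricature model: starting from the resting state, a small voltage perturbation decays back, while a slightly larger one triggers a full spike. A rough sketch with the standard parameter values a = 0.7, b = 0.8 and a small time-scale separation:

```python
def fhn_max_v(v0, steps=20000, dt=0.01, a=0.7, b=0.8, eps=0.08):
    """Integrate FitzHugh-Nagumo from a perturbed rest state; return peak v.

    The resting state is approximately (v, w) = (-1.2, -0.625).
    """
    v, w = v0, -0.625
    vmax = v
    for _ in range(steps):
        dv = v - v**3 / 3 - w
        dw = eps * (v + a - b * w)
        v, w = v + dt * dv, w + dt * dw
        vmax = max(vmax, v)
    return vmax

sub = fhn_max_v(-1.0)   # small perturbation: decays back ("none")
sup = fhn_max_v(-0.2)   # larger perturbation: full spike ("all")
print(sub, sup)
```

The threshold separating the two outcomes lies near the middle branch of the cubic v-nullcline, which is exactly the geometric object that canard and dynamic-bifurcation analyses make precise.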

Neural networks show amazing abilities to evolve and adapt, and to store and process information. These capabilities are mainly conditioned by plasticity mechanisms, especially synaptic plasticity, which induce a mutual coupling between network structure and neuron dynamics. Synaptic plasticity occurs at many levels of organization and time scales in the nervous system . It is of course involved in memory and learning mechanisms, but it also alters the excitability of brain areas and regulates behavioral states (e.g., the transition between sleep and wakeful activity). Therefore, understanding the effects of synaptic plasticity on neuron dynamics is a crucial challenge.

Our group is developing mathematical and numerical methods to analyse this mutual interaction. In particular, we have shown that plasticity mechanisms, Hebbian-like or STDP, have strong effects on neuron dynamics, such as a reduction of dynamical complexity and changes in spike statistics, and that synaptic and propagation delays play an important role in these effects.
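For reference, the standard pair-based form of the STDP rule mentioned above can be written down in a few lines; the amplitudes and time constants below are illustrative values, not those of any particular study.

```python
import math

def stdp(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP: weight change as a function of t_post - t_pre (ms)."""
    if dt_ms > 0:      # pre fires before post: potentiation
        return a_plus * math.exp(-dt_ms / tau_plus)
    elif dt_ms < 0:    # post fires before pre: depression
        return -a_minus * math.exp(dt_ms / tau_minus)
    return 0.0

# causal pairings strengthen the synapse, anti-causal ones weaken it,
# and the effect fades as the pairing interval grows
print(stdp(10.0), stdp(-10.0), stdp(60.0))
```

Coupling such a rule to the spiking dynamics of the network is what creates the mutual structure-dynamics interaction analysed by the team.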

Prediction is the ability of the brain to quickly activate a target concept in response to a related stimulus (prime). Experiments point to the existence of an overlap between the populations of neurons coding for different stimuli, and other experiments show that prime-target relations arise in the process of long-term memory formation. The classical modelling paradigm is that long-term memories correspond to stable steady states of a Hopfield network with Hebbian connectivity. Experiments show that short-term synaptic depression plays an important role in the processing of memories. This leads naturally to a computational model of priming, called latching dynamics: a stable state (prime) can become unstable and the system may converge to another transiently stable steady state (target). Hopfield network models of latching dynamics have been studied by means of numerical simulation, but the conditions for the existence of these dynamics had not been elucidated. In this work we use a combination of analytic and numerical approaches to confirm that latching dynamics can exist in the context of a symmetric Hebbian learning rule, but that it lacks robustness and imposes a number of biologically unrealistic restrictions on the model. In particular, our work shows that the symmetry of the Hebbian rule is not an obstruction to the existence of latching dynamics, although fine tuning of the model parameters is needed.
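The classical ingredient of this paradigm, memories as stable states of a Hopfield network with symmetric Hebbian connectivity, can be sketched as follows; the depression-driven latching itself requires the delicate parameter tuning discussed above, so only the retrieval step is shown, and the network size and corruption level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 2
xi = rng.choice([-1, 1], size=(p, n))     # random binary memory patterns
W = (xi.T @ xi) / n                       # symmetric Hebbian connectivity
np.fill_diagonal(W, 0.0)

# start from a corrupted version of memory 0 (20% of units flipped)
s = xi[0].copy()
flip = rng.choice(n, size=40, replace=False)
s[flip] *= -1

for _ in range(20):                       # synchronous sign-updates
    s = np.where(W @ s >= 0, 1, -1)

overlap = (s @ xi[0]) / n                 # overlap close to 1: memory retrieved
print(overlap)
```

Latching models augment each unit with a slow depression variable that erodes the stability of the retrieved state, so that the network can hop to an overlapping pattern; it is the robustness of that hopping which the analysis above addresses.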

A natural follow-up of the work which has led to the article has been initiated through the postdoc project of Elif Köksal Ersöz. The objective is to extend the previous results in several ways. First, to gain more robustness in the heteroclinic chains sustained by the network model. Second, to be able to simulate much larger networks and exhibit heteroclinic dynamics in them. Third, to link with experimental data. The postdoc of Elif Köksal Ersöz, which ended in late December 2018, was funded by the “tail” of the ERC Advanced Grant NerVi held by Olivier Faugeras.

We study pseudo-simple heteroclinic cycles for a

This work has been published in Physica D and is available as .

In view of highly decentralized and diversified power generation concepts, in particular with renewable energies, the analysis and control of the stability and the synchronization of power networks is an important topic that requires different levels of modeling detail for different tasks. A frequently used qualitative approach relies on simplified nonlinear network models like the Kuramoto model with inertia. The usual formulation in the form of a system of coupled ordinary differential equations is not always adequate. We present a new energy-based formulation of the Kuramoto model with inertia as a polynomial port-Hamiltonian system of differential-algebraic equations, with a quadratic Hamiltonian function including a generalized order parameter. This leads to a representation of the system that is robust with respect to disturbances: it encodes the underlying physics, such as the dissipation inequality or the deviation from synchronicity, directly in the structure of the equations, it explicitly displays all possible constraints, and it allows for robust simulation methods. The model is immersed into a hierarchy of models that will be helpful for applying adaptive simulations in future works. We illustrate the advantages of the modified modeling approach with analytical and numerical results.
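For orientation, here is the standard ODE form of the Kuramoto model with inertia (the formulation the paper moves away from), m θ̈_i + γ θ̇_i = ω_i + (K/N) Σ_j sin(θ_j - θ_i), integrated with illustrative parameters and monitored through the usual order parameter r; the generalized order parameter of the port-Hamiltonian formulation plays an analogous role.

```python
import numpy as np

rng = np.random.default_rng(0)
N, m, gamma, K = 50, 1.0, 1.0, 5.0
omega = rng.normal(0.0, 0.5, N)          # natural frequencies (power injections)
omega -= omega.mean()                    # work in the co-rotating frame
theta = rng.uniform(0, 2 * np.pi, N)
v = np.zeros(N)                          # angular velocities

dt = 0.01
for _ in range(20000):
    coupling = (K / N) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    a = (omega - gamma * v + coupling) / m
    theta, v = theta + dt * v, v + dt * a

r = np.abs(np.exp(1j * theta).mean())    # Kuramoto order parameter
print(r)                                 # r near 1: synchronized grid
```

For coupling well above the synchronization threshold the rotators phase-lock; the hysteresis induced by inertia appears when K is swept up and down across that threshold.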

This work has been published in Chaos and is available as .

We investigate the dynamics of a population of identical biomolecules, modelled as electric dipoles with random orientations and positions in space, oscillating at their intrinsic frequencies. The biomolecules, beyond being coupled among themselves via the dipolar interaction, are also driven by a common external energy supply. A collective mode emerges as the average distance among the molecules decreases, as evidenced by the emergence of a clear peak in the power spectrum of the total dipole moment. This is due to a coherent vibration of most of the molecules at a frequency significantly larger than their own frequencies, corresponding to a partial cluster synchronization of the biomolecules. These results can be verified experimentally via spectroscopic investigations of the strength of the intermolecular electrodynamic interactions, thus making it possible to test the possible biological relevance of the observed macroscopic mode.

This work has been published in Scientific Reports and is available as .

Information transmission in the human brain is a fundamentally dynamic network process. In partial epilepsy, this process is perturbed and highly synchronous seizures originate in a local network, the so-called epileptogenic zone (EZ), before recruiting other close or distant brain regions. We studied patient-specific brain network models of 15 drug-resistant epilepsy patients with implanted stereotactic electroencephalography (SEEG) electrodes. Each personalized brain model was derived from structural data of magnetic resonance imaging (MRI) and diffusion tensor weighted imaging (DTI), comprising 88 nodes equipped with region-specific neural mass models capable of demonstrating a range of epileptiform discharges. Each patient's virtual brain was further personalized through the integration of the clinically hypothesized EZ. Subsequent simulations and connectivity modulations were performed and uncovered a finite repertoire of seizure propagation patterns. Across patients, we found that (i) patient-specific network connectivity is predictive for the subsequent seizure propagation pattern; (ii) seizure propagation is characterized by a systematic sequence of brain states; (iii) propagation can be controlled by an optimal intervention on the connectivity matrix; (iv) the degree of invasiveness can be significantly reduced via the seizure control proposed here, as compared to traditional resective surgery. To stop seizures, neurosurgeons typically resect the EZ completely. We showed that stability analysis of the network dynamics using graph theoretical metrics reliably estimates the spatiotemporal properties of seizure propagation. This suggests novel, less invasive paradigms of surgical interventions to treat and manage partial epilepsy.
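The role of connectivity in such stability analyses can be conveyed with a toy linear network model (a random connectome standing in for the DTI-derived one, and linear nodes standing in for the neural mass models of the study): for node dynamics x' = -x + g W x, the loss of stability, and hence the propensity of activity to spread, is governed by the leading eigenvalue of the connectivity matrix W.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 88                                                # number of brain regions
# sparse random connectome with nonnegative weights (illustrative surrogate)
W = rng.random((n, n)) * (rng.random((n, n)) < 0.1)
np.fill_diagonal(W, 0.0)

# linearized node dynamics x' = -x + g W x is stable iff g * Re(lambda_max) < 1
lam_max = np.max(np.linalg.eigvals(W).real)
g_crit = 1.0 / lam_max                                # critical global coupling
print(g_crit)
```

Removing or reweighting a few well-chosen links lowers lam_max and raises the stability margin, which is the graph-theoretical intuition behind the proposed minimally invasive interventions.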

This work has been submitted for publication and is available as .

The aim of this paper is to investigate complex dynamic networks which can model high-voltage power grids with renewable, fluctuating energy sources. For this purpose we use the Kuramoto model with inertia to model the network of power plants and consumers. In particular, we analyse the synchronization transition of networks of N phase oscillators with inertia (rotators) whose natural frequencies are bimodally distributed, corresponding to the distribution of generator and consumer power. First, we start from globally coupled networks whose links are successively diluted, resulting in a random Erdős-Rényi network. We focus on the changes in the hysteretic loop while varying inertial mass and dilution. Second, we implement Gaussian white noise describing the randomly fluctuating input power, and investigate its role in shaping the dynamics. Finally, we briefly discuss power grid networks under the impact of both topological disorder and external noise sources.

This work has been published in Europhysics Letters and is available as .

In this work, we propose a new model of a biological neural network, combining a two-dimensional integrate-and-fire neuron model with a deterministic model of electrical synapses and a stochastic model of chemical synapses. We describe the dynamics of a population of interacting neurons as a piecewise deterministic Markov process. We prove the weak convergence of the associated empirical process, as the population size tends to infinity, towards a McKean-Vlasov type process, and we describe the associated PDE. We are also interested in the simulation of these dynamics, in particular in comparing “detailed” simulations of a finite population of neurons with a simulation of the system with an infinite population. Benjamin Aymard has the appropriate toolkit to attack these questions numerically. The mean-field equations studied by Benjamin are of transport type, for which numerical methods are technical; however, they are within Benjamin's domain of expertise. His postdoc is funded by the Flagship Human Brain Project.
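The flavour of such a piecewise deterministic Markov process can be conveyed by a crude simulation sketch (a caricature with one-dimensional neurons and illustrative parameters, not the model of the paper): membrane potentials drift deterministically between events, each neuron fires randomly with an intensity f(V) depending on its potential, and a firing event resets the neuron and kicks the rest of the population.

```python
import numpy as np

rng = np.random.default_rng(2)
N, J, b = 100, 1.0, 1.0
V = rng.uniform(0.0, 1.0, N)                  # membrane potentials
rate = lambda v: np.maximum(v, 0.0)           # state-dependent firing intensity

dt, T, nspikes = 0.001, 50.0, 0
for _ in range(int(T / dt)):
    V += dt * (b - V)                         # deterministic drift between jumps
    spiking = rng.random(N) < rate(V) * dt    # each neuron fires w.p. f(V) dt
    k = spiking.sum()
    nspikes += k
    V[spiking] = 0.0                          # reset the neurons that fired
    V += J * k / N                            # mean-field kick to the population

emp_rate = nspikes / (N * T)                  # empirical firing rate per neuron
print(emp_rate)
```

As N grows, such empirical quantities concentrate around the solution of the limiting McKean-Vlasov dynamics, which is exactly the comparison between "detailed" and infinite-population simulations mentioned above.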

The latest results are as follows. A first manuscript, on the numerical simulation of the mean-field model, is in preparation: we have found a new numerical scheme which is positive, semi-implicit and adaptive in time (of second order). A second manuscript, on the existence of a stationary distribution for the mean-field limit, is also in preparation.

We study the long-time behavior of the solution to a McKean-Vlasov stochastic differential equation (SDE) driven by a Poisson process. In neuroscience, this SDE models the asymptotic dynamics of the membrane potential of a spiking neuron in a large network. We prove that, for a small enough interaction parameter, any solution converges to the unique (in this case) invariant measure. To this aim, we first obtain global bounds on the jump rate and derive a Volterra-type integral equation satisfied by this rate. We then temporarily replace the interaction part of the equation by a deterministic external quantity (which we call the external current). For a constant current, we obtain convergence to the invariant measure. Using a perturbation method, we extend this result to more general external currents. Finally, we prove the result for the nonlinear McKean-Vlasov equation.

This work has been submitted for publication and is available as .

In this work, we study the exponential stability of the stationary distribution of a McKean-Vlasov equation of nonlinear hyperbolic type, which was recently derived in earlier works. We complement the convergence result proved there with tools from dynamical systems theory. Our proof relies on two principal arguments, in addition to a Picard-like iteration method. First, the linearized semigroup is positive, which allows us to pinpoint precisely the spectrum of the infinitesimal generator. Second, we use a time-rescaling argument to transform the original quasilinear equation into another one for which the nonlinear flow is differentiable. Interestingly, this convergence result can be interpreted as the existence of a locally exponentially attracting center manifold for a hyperbolic equation.

This work has been submitted for publication and is available as .

Consider a large number n of neurons, each being connected to approximately N other ones, chosen at random. When a neuron spikes, which occurs randomly at some rate depending on its electric potential, its potential is set to a minimum value vmin, and this initiates, after a small delay, two fronts on the (linear) dendrites of all the neurons to which it is connected. Fronts move at constant speed. When two fronts (on the dendrite of the same neuron) collide, they annihilate. When a front hits the soma of a neuron, its potential is increased by a small value

This work has been submitted for publication and is available as .

This project focuses on mean-field descriptions, or thermodynamic limits, of large populations of neurons. We study a system of stochastic differential equations (SDEs) describing the evolution of the membrane potential of each neuron over time, when the synaptic weights are random variables (not assumed to be independent). This setup is well suited to Émilie, who worked during her PhD and first postdoc on mathematical statistical physics and stochastic processes. Her postdoc is funded by the Flagship Human Brain Project. A manuscript is in preparation.

We propose a neural field model of color perception in context, for the visual area V1 of the cortex. This model reconciles into a common framework two opposing perceptual phenomena: simultaneous contrast and chromatic assimilation. Previous works showed that they act simultaneously and can produce larger shifts in color matching when acting in synergy with a spatial pattern. At some point in an image, the color is perceived as more similar to that of adjacent locations, while being more dissimilar from that of remote neighbors. The influence of neighbors hence reverses its nature above some characteristic scale. Our model fully exploits the balance between attraction and repulsion in color space, combined at small or large scales in physical space. For that purpose we rely on the opponent color theory introduced by Hering and suppose a hypercolumnar structure coding for colors. At each neural mass, the pointwise influence of neighbors is spatially integrated to obtain the final effect, which we call a color sensation. Alongside this neural field model, we describe the search for a color match in asymmetric matching experiments as a mathematical projector. We validate the model by fitting its parameters to data from the literature as well as to our own data. All the results show that we are able to explain the nonlinear behavior of the observed shifts along one or two dimensions in color space, which cannot be done using a simple linear model.

This work has been submitted for publication and is available as .

We examine the origin of complex bursting oscillations in a phenomenological ordinary differential equation model with three time scales. We show that bursting solutions in this model arise from a Hopf bifurcation followed by a sequence of spike-adding transitions, in a manner reminiscent of spike-adding transitions previously observed in systems with two time scales. However, the details of the process can be much more complex in this three-time-scale context than in two-time-scale systems. In particular, we find that spike-adding can involve canard explosions occurring on two different time scales and is associated with passage near a folded-saddle singularity. We show that the character of the bursting and the form of spike-adding transitions that occur depend on the geometry of certain singular limit systems, specifically the relative positions of the critical and superslow manifolds. We also show that, unlike the case of spike-adding in two-time-scale systems, the onset of a new spike in our model is not typically associated with a local maximum in the period of the bursting oscillation.

This work has been published in SIAM Journal on Applied Dynamical Systems and is available as .

A minimal system for parabolic bursting, whose associated slow flow is integrable, is presented and studied from the viewpoints of the bifurcation theory of slow-fast systems, the qualitative analysis of its phase portrait, and numerical simulation. We focus the analysis on the spike-adding phenomenon. After a reduction to a periodically forced one-dimensional system, we uncover a link with the dips and slices first discussed by J. E. Littlewood in his famous articles on the periodically forced van der Pol system.

This work has been submitted for publication and is available as .

In this work we have gathered recent results on piecewise-linear (PWL) slow-fast dynamical systems in the canard regime. By focusing on minimal systems in

This work has been published as a chapter in the book “Nonlinear Systems, Vol. 1: Mathematical Theory and Computational Methods” published by Springer as part of the Understanding Complex Systems book series, and it is available as .

Neurons can anticipate incoming signals by exploiting a physiological mechanism that is not well understood. This article offers a novel explanation of how a receiver neuron can predict the sender's dynamics in a unidirectionally coupled configuration, in which both sender and receiver follow the evolution of a multi-scale excitable system. We present a novel theoretical viewpoint, based on a mathematical object called a canard, to explain anticipation in excitable systems. We provide a numerical approach which allows us to determine the transient effects of canards. To demonstrate the general validity of canard-mediated anticipation in the context of excitable systems, we illustrate our framework in two examples: a multi-scale radio-wave circuit (the van der Pol model) that inspired a caricature neuronal model (the FitzHugh-Nagumo model), and a biophysical neuronal model (a 2-dimensional reduction of the Hodgkin-Huxley model), where canards act as messengers of the sender's prediction. We also propose an experimental paradigm that would enable experimental neuroscientists to validate our predictions. We conclude with an outlook on possible fascinating research avenues to further unfold the mechanisms underpinning anticipation. We envisage that our approach can be employed for a wider class of excitable systems, with appropriate theoretical extensions. Anticipation appears as a counter-intuitive observation in a wide range of dynamical systems, ranging from biology to engineering applications. It can occur in unidirectionally coupled systems when the receiver is subject to a self-delayed feedback in addition to the signal coming from the sender. This particular interaction permits the receiver to predict the future trajectory of the sender. Anticipation can occur transiently, in which case it is simply called anticipation, or in the long-term dynamics, in which case it is referred to as anticipated synchronization.
In this study, we focus on both aspects of anticipatory dynamics in the context of excitable systems and explain them via a counter-intuitive phenomenon, namely canards. Canard trajectories structure the excitability and synchronization properties of multiple-timescale systems exhibiting excitable dynamics. By developing a theoretical framework enhanced by numerical continuation, we show that the underlying canard structure in excitable systems is responsible for delaying sub-threshold solutions, but anticipating the spiking ones. We also propose an experimental setup that would enable experimentalists to observe anticipated behavior in neural systems, in particular in type-II neurons.
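The generic delayed-feedback scheme behind anticipated synchronization can be sketched with two FitzHugh-Nagumo oscillators in a sender-receiver configuration, the receiver being driven by K (x(t) - y(t - τ)); all parameter values here (oscillatory regime I = 0.5, coupling K = 2, delay τ = 0.5) are illustrative choices made for numerical convenience, not those of the paper. When anticipated synchronization sets in, the lag maximizing the sender-receiver cross-correlation is positive.

```python
import numpy as np

a, b, eps, I = 0.7, 0.8, 0.08, 0.5       # FHN in its oscillatory regime
dt, tau, K = 0.01, 0.5, 2.0
dsteps = int(tau / dt)                   # feedback delay in integration steps

def f(v, w):
    return v - v**3 / 3 - w + I, eps * (v + a - b * w)

# sender (x) runs freely; receiver (y) gets K * (x(t) - y(t - tau))
x, wx, y, wy = 0.0, 0.0, 0.1, 0.0
ybuf = [y] * dsteps                      # ring buffer holding y(t - tau)
xs, ys = [], []
for step in range(40000):
    dx, dwx = f(x, wx)
    dy, dwy = f(y, wy)
    dy += K * (x - ybuf[step % dsteps])  # delayed self-feedback coupling
    ybuf[step % dsteps] = y
    x, wx = x + dt * dx, wx + dt * dwx
    y, wy = y + dt * dy, wy + dt * dwy
    if step >= 20000:                    # record after the transient
        xs.append(x)
        ys.append(y)

xs, ys = np.array(xs), np.array(ys)
# lag s maximizing corr(x(t + s), y(t)); s > 0 means the receiver anticipates
shifts = range(-200, 201)
best = max(shifts, key=lambda s: np.dot(xs[200 + s:len(xs) - 200 + s] - xs.mean(),
                                        ys[200:len(ys) - 200] - ys.mean()))
print(best * dt)   # positive lag: the receiver leads the sender
```

The anticipatory manifold y(t) = x(t + τ) is invariant because the coupling term vanishes on it; the canard analysis of the paper explains how this anticipation behaves near the excitability threshold.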

This work has been accepted for publication in Chaos and is available as .

In this work we have revisited a rate model that accounts for the spontaneous activity in the developing spinal cord of the chicken embryo. The dynamics is that of a classical square-wave burster, with alternating silent and active phases. Tabak et al. have proposed two different three-dimensional (3D) models, with variables representing the average population activity, a fast activity-dependent synaptic depression, and a slow activity-dependent depression of two forms. In previous works, various 3D combinations of these four variables have been studied further to reproduce rough experimental observations of spontaneous rhythmic activity. In this work, we have first shown the spike-adding mechanism via canards in one of these 3D models, in which the fourth variable is treated as a control parameter. We have then discussed how a canard-mediated slow passage in the 4D model explains the subthreshold oscillatory behaviour that cannot be reproduced by any of the 3D models, giving rise to mixed-mode bursting oscillations (MMBOs). Finally, we have related the canard-mediated slow passage to the durations of the burst and silent phases, which have been linked to the blockade of glutamatergic or GABAergic/glycinergic synapses over a wide range of developmental stages.

This work is in progress and is available as .

We analyzed a generic relaxation oscillator under moderately strong forcing at a frequency much greater than the natural intrinsic frequency of the oscillator. Additionally, the forcing is of constant sign and thus has a nonzero average, matching neuroscience applications. We found that, first, the transition to high-frequency synchronous oscillations occurs mostly through periodic solutions, with virtually no chaotic regimes present. Second, the amplitude of the high-frequency oscillations is large, suggesting an important role for these oscillations in applications. Third, the 1:1 synchronized solution may lose stability and, contrary to other cases, this occurs at smaller, but not at higher, frequency differences between the intrinsic and forcing oscillations. We analytically built a map that explains these properties. Thus, we found a way to substantially “overclock” the oscillator with only a moderately strong external force. Interestingly, in application to neuroscience, both excitatory and inhibitory inputs can force the high-frequency oscillations.

This work has been published in Physical Review E and is available as .

Title: The Human Brain Project

Program: FP7

Duration: October 2013 - March 2016 (first part), April 2016 - March 2018 (second part), April 2018 - March 2020 (third part)

Coordinator: EPFL

Partners:

see the webpage of the project.

Olivier Faugeras is leading the task T4.1.3, entitled “Meanfield and population models”, of the Workpackage W4.1 “Bridging Scales”.

Inria contact: Olivier Faugeras (first part), then Romain Veltz (second and third parts)

Understanding the human brain is one of the greatest challenges facing 21st century science. If we can rise to the challenge, we can gain profound insights into what makes us human, develop new treatments for brain diseases and build revolutionary new computing technologies. Today, for the first time, modern ICT has brought these goals within sight. The goal of the Human Brain Project, part of the FET Flagship Programme, is to translate this vision into reality, using ICT as a catalyst for a global collaborative effort to understand the human brain and its diseases and ultimately to emulate its computational capabilities. The Human Brain Project will last ten years and will consist of a ramp-up phase (from month 1 to month 36) and subsequent operational phases.

This Grant Agreement covers the ramp-up phase. During this phase the strategic goals of the project will be to design, develop and deploy the first versions of six ICT platforms dedicated to Neuroinformatics, Brain Simulation, High Performance Computing, Medical Informatics, Neuromorphic Computing and Neurorobotics, and create a user community of research groups from within and outside the HBP, set up a European Institute for Theoretical Neuroscience, complete a set of pilot projects providing a first demonstration of the scientific value of the platforms and the Institute, develop the scientific and technological capabilities required by future versions of the platforms, implement a policy of Responsible Innovation, and a programme of transdisciplinary education, and develop a framework for collaboration that links the partners under strong scientific leadership and professional project management, providing a coherent European approach and ensuring effective alignment of regional, national and European research and programmes. The project work plan is organized in the form of thirteen subprojects, each dedicated to a specific area of activity.

A significant part of the budget will be used for competitive calls to complement the collective skills of the Consortium with additional expertise.

Invitation of Andrey Shilnikov, Georgia State University (USA), January 2018

Invitation of Jean-Pierre Françoise, Sorbonne Université (Paris), April 2018

Invitation of Vivien Kirk, University of Auckland (New Zealand), May 2018

Invitation of Peter De Maesschalck, University of Hasselt (Belgium), June 2018

Visit of Mathieu Desroches to Jean-Pierre Françoise (LJLL, Sorbonne Université, Paris) in October 2018

One-month research stay of Mathieu Desroches at BCAM (Bilbao, Spain) on an invited professor scholarship to work with Serafim Rodrigues, June 2018

Olivier Faugeras and Romain Veltz were on the Advisory Board of the 4th International Conference on Mathematical Neuroscience, held in Antibes Juan les Pins (France), June 11 - 13, 2018.

Mathieu Desroches was on the Program Committee of the 4th International Conference on Mathematical Neuroscience, held in Antibes Juan les Pins (France), June 11 - 13, 2018.

Mathieu Desroches was on the Scientific Committee of the 2nd International workshop in Neurodynamics (Ndy'18), held in Castro-Urdiales (Spain), September 26-29, 2018.

Olivier Faugeras is co-editor-in-chief of the open-access Journal of Mathematical Neuroscience. In 2018 the journal obtained an Impact Factor of 2.4.

Fabien Campillo acts as a reviewer for Journal of Mathematical Biology.

Mathieu Desroches acts as a reviewer for Physica D, SIAM Journal on Applied Dynamical Systems (SIADS), PLoS Computational Biology, Journal of Mathematical Biology, Journal of Neurophysiology, Journal of Mathematical Neuroscience, Nonlinear Dynamics.

Olivier Faugeras acts as a reviewer for the Journal of Mathematical Neuroscience, the Journal of Computational Neuroscience, the SIAM Journal on Applied Dynamical Systems (SIADS).

Martin Krupa acts as a reviewer for Nonlinearity, Proceedings of the National Academy of Sciences of the USA (PNAS), the SIAM Journal on Applied Dynamical Systems (SIADS).

Romain Veltz acts as a reviewer for Neural Computation, eLIFE, SIADS, PNAS, Journal of the Royal Society Interface.

Pascal Chossat, “Stability of Simple and Pseudo Simple Heteroclinic Cycles in

Pascal Chossat, “Computational aspects of equivariant bifurcation theory”, Colloque *Symmetry and Computation*, CIRM (France), April 2018.

Pascal Chossat, “Geometry in neurosciences: the example of the visual cortex” (plenary talk), *Systèmes dynamiques et systèmes complexes - Une conférence pour célébrer les 60 ans de Jean-Marc Gambaudo*, Nice (France), June 2018.

Pascal Chossat, “Perception of images by the visual cortex: geometry in neuroscience” (plenary talk), *VI Iberoamerican meeting Geometry, Mechanics and Control*, Guanajuato (Mexico), August 2018.

Pascal Chossat, “Phase transitions in the hyperbolic plane: application to texture detection by the primary visual cortex”, Colloque *Géométrie et représentation de la couleur*, UPMC, November 2018.

Mathieu Desroches, “Canards and spike-adding in neural bursters”, MURPHYS-HSFS-2018 meeting, CRM, Barcelona (Spain), May 2018.

Mathieu Desroches, “Three-Timescale Dynamics: Canards and Spike Adding”, New Trends in Mathematical Biology, CRM, Barcelona (Spain), June 2018.

Olivier Faugeras, “Predicting neuronal correlations in large size networks based on mean-field analysis”, Workshop *Mean-field Approaches to the Dynamics of Neuronal Networks*, EITN, Paris (France), January 2018.

Olivier Faugeras, “Neural networks do not become asynchronous in the large size limit when synaptic weights are correlated: there is no propagation of chaos”, Workshop *InSpire – New Insights on Complex Neural Dynamics*, Cergy (France), June 2018.

Olivier Faugeras, “Neural networks do not become asynchronous in the large size limit: there is no propagation of chaos”, ICMNS 2018, Antibes (France), June 2018.

Elif Köksal Ersöz, “Anticipation via canards”, MURPHYS-HSFS-2018 meeting, CRM, Barcelona (Spain), May 2018.

Elif Köksal Ersöz, “Anticipation via canards in excitable systems”, XXIII National Conference on Statistical Physics and Complex Systems, Parma (Italy), June 2018.

Elif Köksal Ersöz, “Canard mediated mixed-mode bursting oscillations in a rate model”, European Conference on Mathematical and Theoretical Biology (ECMTB 2018), Lisbon (Portugal), July 2018.

Elif Köksal Ersöz, “Canard mediated mixed-mode bursting oscillations in a rate model”, 2nd International workshop in Neurodynamics (Ndy'18), Castro-Urdiales (Spain), September 2018.

Martin Krupa, “Heteroclinic chains in a model of associative memory”, Systèmes dynamiques et systèmes complexes - Une conférence pour célébrer les 60 ans de Jean-Marc Gambaudo, Nice (France), June 2018.

Romain Veltz, “On a toy network of neurons interacting through nonlinear dendritic compartments”, ICMNS 2018, invited talk, June 2018.

Romain Veltz, “On a toy network of neurons interacting through nonlinear dendritic compartments”, Workshop *Mean-field Approaches to the Dynamics of Neuronal Networks*, EITN, 2018.

Romain Veltz, “On a toy network of neurons interacting through nonlinear dendritic compartments”, Systèmes dynamiques et systèmes complexes - Une conférence pour célébrer les 60 ans de Jean-Marc Gambaudo, Nice (France), June 2018.

Romain Veltz, “Models of neurons interacting through nonlinear dendritic compartments”, Séminaire de Probabilités et Statistique, LJAD, Université de Nice Sophia Antipolis, October 2018.

Fabien Campillo was a member of the local committee in charge of the scientific selection of visiting scientists (Comité NICE).

Fabien Campillo was a member of the HCERES visiting committee for the evaluation of the INRA Research Unit MAIAGE.

Fabien Campillo was a member of the visiting committee of the LIRIMA International Laboratory.

Mathieu Desroches was on the Advisory Board of the Complex Systems Academy of the UCA.

Olivier Faugeras gave a presentation to the CCNE (Comité Consultatif National d'Ethique), reporting on the work of the “éthique et IA” group, which he chaired at the *Académie des Sciences de Paris*, in preparation for the revision of the French bioethics law.

Olivier Faugeras was the President of the study group “Intelligence artificielle” of the *Académie des Sciences de Paris*. In this capacity, he led the hearings of experts in the field; in 2018 these were Jean Ponce, Stéphane Mallat and Francis Bach. The study group also produced a report for the 2019 G7 meeting.

**Teaching**

Master 2 MVA/UPMC: Romain Veltz, Mathematical Methods for Neurosciences, 20 hours, Paris, France.

Master 1 BIM/UPMC: Mathieu Desroches, Modèles Mathématiques et Computationnels en Neuroscience (Lectures and example classes), 20 hours, Paris, France.

Master 1 BIM/UPMC: Elif Köksal Ersöz, Modèles Mathématiques et Computationnels en Neuroscience (Labs), 10 hours, Paris, France.

Mini-course (Master's level), Basque Center for Applied Mathematics: Mathieu Desroches (together with Serafim Rodrigues), Introduction to mathematical neuroscience: neuronal models and their bifurcations, 10 hours, Bilbao, Spain.

PhD: Axel Dolcemascolo, “All optical neuromimetic devices”, started in January 2016, co-supervised by Romain Veltz and Stéphane Barland (INLN); successfully defended on 14 December 2018.

PhD in progress: Louisiane Lemaire, “Multi-scale mathematical modeling of cortical spreading depression”, started in October 2018, co-supervised by Mathieu Desroches and Martin Krupa.

PhD in progress: Yuri Rodrigues, “Towards a model of the postsynaptic excitatory synapse”, started in March 2018, co-supervised by Romain Veltz and Hélène Marie (IPMC, Sophia Antipolis).

PhD in progress: Halgurd Taher, “Next generation neural-mass models”, started in November 2018, co-supervised by Simona Olmi and Mathieu Desroches.

PhD in progress: Pascal Helson, “Study of plasticity laws with stochastic processes”, started in September 2016, co-supervised by Romain Veltz and Etienne Tanré (Inria TOSCA).

PhD in progress: Quentin Cormier, “Biological spiking neural networks”, started in September 2017, co-supervised by Romain Veltz and Etienne Tanré (Inria TOSCA).

PhD in progress: Samuel Nyobe, “Inférence dans les modèles de Markov cachés : Application en foresterie”, started in October 2017, co-supervised by Fabien Campillo, Serge Moto (University of Yaoundé, Cameroon) and Vivien Rossi (CIRAD).

Mathieu Desroches was president of the jury for the PhD of Dora Karvouniari (Inria Biovision team), entitled “Mathematical modeling of retinal waves”, Inria Sophia Antipolis, 15 March 2018.

Pascal Chossat was a jury member for the HDR of Philippe Beltrame, Université d'Avignon, 30 November 2018.

Romain Veltz was an examiner for A. Dolcemascolo's PhD thesis defence, entitled “Semiconductor lasers to model and control cells and excitable networks”, which took place on 14 December 2018.

National events: Romain Veltz participated in the "Fête de la Science" outreach event.

Romain Veltz gave a talk at the "Café des sciences" on "Vers un modèle de synapse excitatrice" (towards a model of the excitatory synapse).