Keywords
 A6. Modeling, simulation and control
 A6.1. Methods in mathematical modeling
 A6.1.1. Continuous Modeling (PDE, ODE)
 A6.1.2. Stochastic Modeling
 A6.1.4. Multiscale modeling
 A6.2. Scientific computing, Numerical Analysis & Optimization
 A6.2.1. Numerical analysis of PDE and ODE
 A6.2.2. Numerical probability
 A6.2.3. Probabilistic methods
 A6.3. Computation-data interaction
 A6.3.4. Model reduction
 B1. Life sciences
 B1.2. Neuroscience and cognitive science
 B1.2.1. Understanding and simulation of the brain and the nervous system
 B1.2.2. Cognitive science
1 Team members, visitors, external collaborators
Research Scientists
 Mathieu Desroches [Team leader, Inria, Researcher, HDR]
 Fabien Campillo [Inria, Senior Researcher, HDR]
 Pascal Chossat [CNRS, Emeritus, HDR]
 Olivier Faugeras [Inria, Emeritus, HDR]
 Maciej Krupa [Université Côte d'Azur, Senior Researcher, HDR]
 Simona Olmi [Inria, Starting Research Position, Jan 2021]
 Romain Veltz [Inria, Researcher, until Apr 2021]
Post-Doctoral Fellow
 Mattia Sensi [Inria, From 12/2021]
PhD Students
 Louisiane Lemaire [Inria]
 Yuri Rodrigues [Univ Côte d'Azur, until Jun 2021]
 Halgurd Taher [Inria]
Technical Staff
 Emre Baspinar [Inria, Engineer, until Mar 2021]
Interns and Apprentices
 Efstathios Pavlidis [Univ Côte d'Azur, from Mar 2021]
Administrative Assistant
 Marie-Cecile Lafont [Inria]
External Collaborators
 Daniele Avitabile [VU Amsterdam, Netherlands, until Nov 2021, HDR]
 Emre Baspinar [CNRS, from Apr 2021]
2 Overall objectives
MathNeuro focuses on the applications of multiscale dynamics to neuroscience. This involves the modelling and analysis of systems with multiple time and space scales, as well as stochastic effects. We look at single-cell models, microcircuits and large networks. In terms of neuroscience, we are mainly interested in questions related to synaptic plasticity and neuronal excitability, in particular in the context of pathological states such as epileptic seizures and neurodegenerative diseases such as Alzheimer's disease.
Our work is quite mathematical, but we make heavy use of computers for numerical experiments and simulations. We have close ties with several top groups in biological neuroscience. We are pursuing the idea that the "unreasonable effectiveness of mathematics" can be brought to bear on neuroscience, as it has been in physics.
Modeling such assemblies of neurons and simulating their behavior involves putting together a mixture of the most recent results in neurophysiology with such advanced mathematical methods as dynamical systems theory, bifurcation theory, probability theory, stochastic calculus, theoretical physics and statistics, as well as the use of simulation tools.
We conduct research in the following main areas:
 Neural networks dynamics
 Mean-field and stochastic approaches
 Neural fields
 Slow-fast dynamics in neuronal models
 Modeling neuronal excitability
 Synaptic plasticity
 Memory processes
 Visual neuroscience
3 Research program
3.1 Neural networks dynamics
The study of neural networks is certainly motivated by the long-term goal of understanding how the brain works. But, beyond the comprehension of the brain, or even of simpler neural systems in less evolved animals, there is also the desire to exhibit general mechanisms or principles at work in the nervous system. One possible strategy is to propose mathematical models of neural activity, at different space and time scales, depending on the type of phenomena under consideration. However, beyond the mere proposal of new models, which can rapidly result in a plethora, there is also a need to understand some fundamental keys ruling the behaviour of neural networks and, from this, to extract new ideas that can be tested in real experiments. Therefore, a thorough analysis of these models is needed. An efficient approach, developed in our team, consists of analysing neural networks as dynamical systems. This allows us to address several issues. A first, natural issue is to ask about the (generic) dynamics exhibited by the system when control parameters vary. This naturally leads to analysing the bifurcations [52, 53] occurring in the network and which phenomenological parameters control these bifurcations. Another issue concerns the interplay between the neuron dynamics and the synaptic network structure.
3.2 Meanfield and stochastic approaches
Modeling neural activity at scales integrating the effect of thousands of neurons is of central importance for several reasons. First, most imaging techniques are not able to measure individual neuron activity (microscopic scale), but instead measure mesoscopic effects resulting from the activity of several hundreds to several hundreds of thousands of neurons. Second, anatomical data recorded in the cortex reveal the existence of structures, such as the cortical columns, with a diameter of about 50 $\mu$m to 1 mm, containing of the order of one hundred to one hundred thousand neurons belonging to a few different species. The description of this collective dynamics requires models which are different from individual neuron models. In particular, when the number of neurons is large enough, averaging effects appear, and the collective dynamics is well described by an effective mean-field, summarizing the effect of the interactions of a neuron with the other neurons and depending on a few effective control parameters. This vision, inherited from statistical physics, requires that the space scale be large enough to include a large number of microscopic components (here, neurons) and small enough so that the region considered is homogeneous.
Our group is developing mathematical and numerical methods allowing, on the one hand, to derive dynamic mean-field equations from the physiological characteristics of the neural structure (neuron types, synapse types and anatomical connectivity between neuron populations), and, on the other hand, to simulate these equations; see Figure 1. These methods use tools from advanced probability theory such as the theory of Large Deviations [44] and the study of interacting diffusions [3].
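To fix ideas on what mean-field equations of this kind look like in their simplest form, the following sketch integrates a classic two-population (excitatory/inhibitory) Wilson-Cowan rate model with forward Euler. The coupling weights and inputs are standard illustrative values, not parameters taken from our work:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate_wilson_cowan(w_ee=16.0, w_ei=12.0, w_ie=15.0, w_ii=3.0,
                          P=1.25, Q=0.0, tau_e=1.0, tau_i=2.0,
                          dt=0.01, T=100.0):
    """Forward-Euler integration of a two-population mean-field model:
       tau_e * E' = -E + S(w_ee*E - w_ei*I + P)
       tau_i * I' = -I + S(w_ie*E - w_ii*I + Q)."""
    n = int(T / dt)
    E = np.zeros(n)
    I = np.zeros(n)
    for k in range(n - 1):
        E[k+1] = E[k] + dt / tau_e * (-E[k] + sigmoid(w_ee*E[k] - w_ei*I[k] + P))
        I[k+1] = I[k] + dt / tau_i * (-I[k] + sigmoid(w_ie*E[k] - w_ii*I[k] + Q))
    return E, I
```

The two state variables are population-averaged activities, bounded in [0, 1] by the sigmoidal gain function; deriving such equations rigorously from spiking networks, rather than postulating them, is precisely the subject of the probabilistic methods cited above.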
3.3 Neural fields
Neural fields are a phenomenological way of describing the activity of populations of neurons by delayed integro-differential equations. This continuous approximation turns out to be very useful to model large brain areas such as those involved in visual perception. The mathematical properties of these equations and their solutions are still imperfectly known, in particular in the presence of delays, different time scales and noise.
Our group is developing mathematical and numerical methods for analysing these equations. These methods are based upon techniques from functional analysis, bifurcation theory [20, 55], equivariant bifurcation analysis, delay equations, and stochastic partial differential equations. We have been able to characterize the solutions of these neural field equations and their bifurcations, and to apply and expand the theory to account for such perceptual phenomena as edge, texture [38], and motion perception. We have also developed a theory of the delayed neural field equations, in particular in the case of constant delays and propagation delays that must be taken into account when attempting to model large cortical areas [21, 54]. This theory is based on center manifold and normal form ideas [19].
3.4 Slow-fast dynamics in neuronal models
Neuronal rhythms typically display many different timescales; it is therefore important to incorporate this slow-fast aspect in models. We are interested in this modeling paradigm, where slow-fast point models, using Ordinary Differential Equations (ODEs), are investigated in terms of their bifurcation structure and the patterns of oscillatory solutions that they can produce. To gain insight into the dynamics of such systems, we use a mix of theoretical techniques, such as geometric desingularisation and centre manifold reduction [48], and numerical methods such as pseudo-arclength continuation [42]. We are interested in families of complex oscillations generated by both mathematical and biophysical models of neurons, in particular so-called mixed-mode oscillations (MMOs) [12, 40, 47], which represent an alternation between subthreshold and spiking behaviour, and bursting oscillations [41, 46], also corresponding to experimentally observed behaviour [39]; see Figure 2. We are working on extending these results to spatio-temporal neural models [2].
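A minimal example of such a slow-fast ODE model is the FitzHugh-Nagumo system in its singularly perturbed form, with timescale separation parameter `eps`. Simulated with forward Euler (parameter values below are illustrative, not tied to any specific study of ours), it produces the relaxation oscillations whose fine structure (canards, MMOs) the techniques above are designed to analyse:

```python
import numpy as np

def fitzhugh_nagumo(eps=0.01, a=0.3, dt=1e-3, T=200.0, v0=-1.0, w0=0.0):
    """Slow-fast FitzHugh-Nagumo model:
       eps * v' = v - v**3/3 - w   (fast membrane variable)
             w' = v + a            (slow recovery variable).
       Returns the (v, w) time series, integrated with forward Euler."""
    n = int(T / dt)
    v = np.empty(n)
    w = np.empty(n)
    v[0], w[0] = v0, w0
    for k in range(n - 1):
        v[k+1] = v[k] + dt / eps * (v[k] - v[k]**3 / 3.0 - w[k])
        w[k+1] = w[k] + dt * (v[k] + a)
    return v, w
```

With `a < 1` the equilibrium sits on the repelling middle branch of the cubic nullcline, so the trajectory settles onto a relaxation cycle alternating slow drifts along the attracting branches with fast jumps between them.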
3.5 Modeling neuronal excitability
Excitability refers to the all-or-none property of neurons [43, 45], that is, the ability to respond nonlinearly to an input, with a dramatic change of response from "none" (no response except a small perturbation that returns to equilibrium) to "all" (a large response with the generation of an action potential or spike before the neuron returns to equilibrium). The return to equilibrium may also be an oscillatory motion of small amplitude; in this case, one speaks of resonator neurons, as opposed to integrator neurons. The combination of a spike followed by subthreshold oscillations is then often referred to as mixed-mode oscillations (MMOs) [40]. Slow-fast ODE models of dimension at least three are well capable of reproducing such complex neural oscillations. Part of our research expertise is to analyse the possible transitions between different complex oscillatory patterns of this sort upon input change; in mathematical terms, this corresponds to understanding the bifurcation structure of the model. Furthermore, the shape of time series of this sort with a given oscillatory pattern can be analysed within the mathematical framework of dynamic bifurcations; see the section on slow-fast dynamics in neuronal models. The main example of abnormal neuronal excitability is hyperexcitability, and it is important to understand the biological factors which lead to such excess of excitability and to identify (both in detailed biophysical models and reduced phenomenological ones) the mathematical structures leading to these anomalies. Hyperexcitability is one important trigger for pathological brain states related to various diseases such as chronic migraine [50], epilepsy [51] or even Alzheimer's disease [49]. A central axis of research within our group is to revisit models of such pathological scenarios, using a combination of advanced mathematical tools and in partnership with biological labs.
3.6 Synaptic Plasticity
Neural networks show amazing abilities to evolve and adapt, and to store and process information. These capabilities are mainly conditioned by plasticity mechanisms, and especially synaptic plasticity, inducing a mutual coupling between network structure and neuron dynamics. Synaptic plasticity occurs at many levels of organization and time scales in the nervous system [37]. It is of course involved in memory and learning mechanisms, but it also alters the excitability of brain areas and regulates behavioral states (e.g., the transition between sleep and wakeful activity). Therefore, understanding the effects of synaptic plasticity on neuron dynamics is a crucial challenge.
Our group is developing mathematical and numerical methods to analyse this mutual interaction. In particular, we have shown that plasticity mechanisms [10, 17], Hebbian-like or STDP, have strong effects on neuron dynamics, such as a reduction of dynamical complexity and changes in spike statistics, and that they interact with synaptic and propagation delays [21].
3.7 Memory processes
The processes by which memories are formed and stored in the brain are multiple and not yet fully understood. What is hypothesised so far is that memory formation is related to the activation of certain groups of neurons in the brain. One important mechanism to store various memories is then to associate certain groups of memory items with one another, which corresponds to the joint activation of certain neurons within different subgroups of a given population. In this framework, plasticity is key to encode the storage of chains of memory items. Yet, there is no general mathematical framework to model the mechanism(s) behind these associative memory processes. We are aiming at developing such a framework using our expertise in multiscale modelling, by combining the concepts of heteroclinic dynamics, slow-fast dynamics and stochastic dynamics.
The general objective that we wish to pursue in this project is to investigate non-equilibrium phenomena pertinent to the storage and retrieval of sequences of learned items. In previous work by team members [1, 9, 15], it was shown that, with a suitable formulation, heteroclinic dynamics combined with slow-fast analysis in neural field systems can play an organizing role in such processes, making the model accessible to a thorough mathematical analysis. Multiple choices in cognitive processes require a certain flexibility in the neural network, which has recently been investigated in the submitted paper [31].
Our goal is to contribute to identifying general processes under which cognitive functions can be organized in the brain.
4 Application domains
The project underlying MathNeuro revolves around pillars of neuronal behaviour (excitability, plasticity, memory) in connection with the initiation and propagation of pathological brain states in diseases such as cortical spreading depression (linked with certain forms of migraine with aura), epileptic seizures and Alzheimer's disease. Our work on memory processes can also potentially be applied to studying mental disorders such as schizophrenia or obsessive-compulsive disorder.
5 Highlights of the year
The first three PhD students of the EPI MathNeuro, Louisiane Lemaire, Yuri Rodrigues and Halgurd Taher, all successfully defended their PhD theses in 2021. They all found postdoctoral positions, at Humboldt University (Berlin, Germany), the University of Sussex (Brighton, UK) and Charité Medical University (Berlin, Germany), respectively.
Simona Olmi, who held a Starting Research Position (SRP) in MathNeuro from 2018 to 2021, has obtained a permanent position as a researcher in Computational Neuroscience at the National Research Council (CNR) in Florence, Italy.
6 New results
6.1 Neural Networks as dynamical systems
6.1.1 Patient-specific network connectivity combined with a next generation neural mass model to test clinical hypothesis of seizure propagation
Participants: Moritz Gerster [Institute for Theoretical Physics, Germany], Halgurd Taher [Inria MathNeuro], Antonín Škoch [National Institute of Mental Health, Czech Republic], Jaroslav Hlinka [National Institute of Mental Health, Czech Republic], Maxime Guye [Centre de Résonance Magnétique Biologique et Médicale, France], Fabrice Bartolomei [Hôpital de la Timone, France], Viktor Jirsa [Institut de Neuroscience des Systèmes, France], Anna Zakharova [Institute for Theoretical Physics, Germany], Simona Olmi.
Dynamics underlying epileptic seizures span multiple scales in space and time; therefore, understanding seizure mechanisms requires identifying the relations between seizure components within and across these scales, together with the analysis of their dynamical repertoire. In this view, mathematical models have been developed, ranging from single neurons to neural populations. In this study, we consider a neural mass model able to exactly reproduce the dynamics of heterogeneous spiking neural networks. We combine mathematical modeling with structural information from non-invasive brain imaging, thus building large-scale brain network models to explore emergent dynamics and test clinical hypotheses. We provide a comprehensive study on the effect of external drives on neuronal networks exhibiting multistability, in order to investigate the role played by the neuroanatomical connectivity matrices in shaping the emergent dynamics. In particular, we systematically investigate the conditions under which the network displays a transition from a low activity regime to a high activity state, which we identify with a seizure-like event. This approach allows us to study the biophysical parameters and variables leading to multiple recruitment events at the network level. We further exploit topological network measures in order to explain the differences and the analogies among the subjects and their brain regions in showing recruitment events at different parameter values. We demonstrate, using the example of diffusion-weighted magnetic resonance imaging (dMRI) connectomes of 20 healthy subjects and 15 epileptic patients, that individual variations in structural connectivity, when linked with mathematical dynamic models, have the capacity to explain changes in the spatiotemporal organization of brain dynamics, as observed in network-based brain disorders.
In particular, for epileptic patients, by integrating the clinical hypotheses on the epileptogenic zone (EZ), i.e., the local network where highly synchronous seizures originate, we have identified the sequence of recruitment events and discussed their links with the topological properties of the specific connectomes. The predictions made on the basis of the implemented set of exact mean-field equations turn out to be in line with the clinical presurgical evaluation of recruited secondary networks. This work has been published in Frontiers in Systems Neuroscience and is available as [27].
6.2 Mean-field theory and stochastic processes
6.2.1 Cross-scale excitability in networks of quadratic integrate-and-fire neurons
Participants: Daniele Avitabile [VU Amsterdam, Netherlands, Inria MathNeuro], Mathieu Desroches [University of Pittsburgh, USA], G Bard Ermentrout [University of Pittsburgh, USA].
From the action potentials of neurons and cardiac cells to the amplification of calcium signals in oocytes, excitability is a hallmark of many biological signalling processes. In recent years, excitability in single cells has been related to multiple-timescale dynamics through canards, special solutions which determine the effective thresholds of the all-or-none responses. However, the emergence of excitability in large populations remains an open problem. Here, we show that the mechanisms of excitability in an infinite heterogeneous population of coupled quadratic integrate-and-fire (QIF) cells maintain echoes of the mechanism for the individual components. We exploit the Ott-Antonsen ansatz to derive low-dimensional dynamics for the coupled network and use it to describe the structure of canards via slow periodic forcing. We demonstrate that the thresholds for onset and offset of population firing can be found in the same way as those of the single cell. We combine theoretical and numerical analysis to develop a novel and comprehensive framework for excitability in large populations. This work has been submitted for publication and is available as [29].
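For reference, the exact mean-field limit of a heterogeneous QIF network obtained via the Ott-Antonsen/Lorentzian ansatz (the Montbrió-Pazó-Roxin equations) reduces to two ODEs for the population firing rate $r$ and mean voltage $v$. The sketch below integrates them with forward Euler; the parameter values are illustrative (they put the system in its well-known bistable regime) and do not reproduce the slow forcing protocol of the paper:

```python
import numpy as np

def qif_mean_field(eta=-5.0, J=15.0, delta=1.0, dt=1e-3, T=40.0,
                   r0=0.0, v0=-2.0, I=0.0):
    """Montbrio-Pazo-Roxin mean-field of a heterogeneous QIF network:
       r' = delta/pi + 2*r*v
       v' = v**2 + eta + J*r + I - (pi*r)**2
       where eta, delta are the centre and width of the Lorentzian
       distribution of excitabilities and J is the coupling strength."""
    n = int(T / dt)
    r = np.empty(n)
    v = np.empty(n)
    r[0], v[0] = r0, v0
    for k in range(n - 1):
        r[k+1] = r[k] + dt * (delta / np.pi + 2.0 * r[k] * v[k])
        v[k+1] = v[k] + dt * (v[k]**2 + eta + J*r[k] + I - (np.pi * r[k])**2)
    return r, v
```

From the chosen initial condition the system relaxes to the low-activity equilibrium; slowly modulating the input `I` is the kind of forcing through which population-level canards and firing thresholds can then be probed.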
6.2.2 Coherence resonance in neuronal populations: meanfield versus network model
Participants: Emre Baspinar [Inria MathNeuro], Leonard Schülen [TU Berlin, Germany], Simona Olmi [Inria MathNeuro], Anna Zakharova [TU Berlin, Germany].
The counterintuitive phenomenon of coherence resonance describes a non-monotonic behavior of the regularity of noise-induced oscillations in the excitable regime, leading to an optimal response, in terms of the regularity of the excited oscillations, for an intermediate noise intensity. We study this phenomenon in populations of FitzHugh-Nagumo (FHN) neurons with different coupling architectures. For networks of FHN systems in the excitable regime, coherence resonance has previously been analyzed numerically. Here we focus on an analytical approach, studying the mean-field limits of the locally and globally coupled populations. The mean-field limit refers to the averaged behavior of a complex network as the number of elements goes to infinity. We derive a mean-field limit approximating the locally coupled FHN network with low noise intensities. Further, we apply the mean-field approach to the globally coupled FHN network. We compare the results of the mean-field and network frameworks for coherence resonance and find a good agreement in the globally coupled case, where the correspondence between the two approaches is sufficiently good to capture the emergence of anti-coherence resonance. Finally, we study the effects of the coupling strength and noise intensity on coherence resonance for both the network and the mean-field model. This work has been published in Physical Review E and is available as [24].
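The single-unit mechanism behind coherence resonance can be illustrated with one excitable FHN unit driven by noise: deterministically it is quiescent, but noise in the slow variable triggers spikes. The sketch below (Euler-Maruyama integration, illustrative parameters unrelated to the paper's networks) simply counts noise-induced spikes at a fixed noise intensity; sweeping `D` and measuring inter-spike-interval regularity would reveal the resonance:

```python
import numpy as np

def noisy_fhn_spike_count(D=0.05, eps=0.05, a=1.05, dt=1e-3, T=300.0, seed=0):
    """Excitable FitzHugh-Nagumo unit with noise in the slow variable:
       eps * u' = u - u**3/3 - v
             v' = u + a + sqrt(2*D) * xi(t).
       For a > 1 the unit is excitable (stable rest state); spikes are
       counted as upward crossings of u = 0."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    u = -a
    v = -a + a**3 / 3.0                      # deterministic resting state
    increments = rng.normal(0.0, np.sqrt(2.0 * D * dt), n)
    count = 0
    for k in range(n):
        u_new = u + dt / eps * (u - u**3 / 3.0 - v)
        v += dt * (u + a) + increments[k]
        if u < 0.0 <= u_new:                 # upward threshold crossing
            count += 1
        u = u_new
    return count
```
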
6.2.3 The one-step fixed-lag particle smoother as a strategy to improve the prediction step of particle filtering
Participants: Samuel Nyobe [University of Yaoundé, Cameroon], Fabien Campillo [Inria MathNeuro], Serge Moto [University of Yaoundé, Cameroon], Vivien Rossi [CIRAD].
Sequential Monte Carlo methods have been a major breakthrough in the field of numerical signal processing for stochastic dynamical state-space systems with partial and noisy observations. However, these methods still present certain weaknesses. One of the most fundamental is the degeneracy of the filter due to the impoverishment of the particles: the prediction step allows the particles to explore the state space, and can lead to the impoverishment of the particles if this exploration is poorly conducted or when it conflicts with the following observation, which will be used in the evaluation of the likelihood of each particle. In this article, in order to improve this last step within the framework of the classic bootstrap particle filter, we propose a simple approximation of the one-step fixed-lag smoother. At each time iteration, we propose to perform additional simulations during the prediction step in order to improve the likelihood of the selected particles. This work has been submitted for publication and is available as [32].
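To fix ideas, here is a minimal bootstrap particle filter for a scalar linear-Gaussian state-space model (a toy model chosen purely for illustration; the one-step fixed-lag smoothing correction proposed in the paper is not implemented in this sketch):

```python
import numpy as np

def bootstrap_particle_filter(obs, n_particles=500, a=0.9,
                              sigma_x=1.0, sigma_y=0.5, seed=0):
    """Classic bootstrap filter for the toy state-space model
       X_t = a * X_{t-1} + N(0, sigma_x^2),  Y_t = X_t + N(0, sigma_y^2).
       Returns the sequence of filtering means E[X_t | Y_1..Y_t]."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_particles)
    means = np.empty(len(obs))
    for t, y in enumerate(obs):
        # prediction step: propagate each particle through the dynamics
        x = a * x + rng.normal(0.0, sigma_x, n_particles)
        # correction step: weight particles by the observation likelihood
        logw = -0.5 * ((y - x) / sigma_y) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means[t] = np.sum(w * x)
        # multinomial resampling to fight weight degeneracy
        x = x[rng.choice(n_particles, n_particles, p=w)]
    return means
```

It is the prediction step above (blind propagation through the dynamics, ignoring the upcoming observation) that the paper's fixed-lag strategy refines with additional simulations.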
6.2.4 A stochastic model of hippocampal synaptic plasticity with geometrical readout of enzyme dynamics
Participants: Yuri Rodrigues [Université Côte d'Azur, Inria MathNeuro], Cezar Tigaret [Cardiff University, UK], Hélène Marie [Institut de Pharmacologie Moléculaire et Cellulaire, France], Cian O'Donnell [University of Bristol, UK], Romain Veltz.
Discovering the rules of synaptic plasticity is an important step for understanding brain learning. Existing plasticity models are either (1) top-down and interpretable, but not flexible enough to account for experimental data, or (2) bottom-up and biologically realistic, but too intricate to interpret and hard to fit to data. We fill the gap between these approaches by uncovering a new plasticity rule based on a geometrical readout mechanism that flexibly maps synaptic enzyme dynamics to plasticity outcomes. We apply this readout to a multi-timescale model of hippocampal synaptic plasticity induction that includes electrical dynamics, calcium, CaMKII and calcineurin, and an accurate representation of intrinsic noise sources. Using a single set of model parameters, we demonstrate the robustness of this plasticity rule by reproducing nine published ex vivo experiments covering various spike-timing and frequency-dependent plasticity induction protocols, animal ages, and experimental conditions. The model also predicts that in vivo-like spike timing irregularity strongly shapes plasticity outcome. This geometrical readout modelling approach can be readily applied to other excitatory or inhibitory synapses to discover their synaptic plasticity rules. This work has been submitted for publication and is available as [33].
6.2.5 Slow-fast dynamics in the mean-field limit of neural networks
Participants: Daniele Avitabile [VU Amsterdam, Inria MathNeuro], Emre Baspinar, Fabien Campillo, Mathieu Desroches, Olivier Faugeras.
In the context of the Human Brain Project (HBP, see Section 5.1.1.1 below), we recruited Emre Baspinar in December 2018 for a two-year postdoc, after which he stayed for a few more months on an engineer contract in order to complete some papers. Within MathNeuro, Emre has worked on analysing slow-fast dynamical behaviours in the mean-field limit of neural networks. In a first project, he analysed the slow-fast structure in the mean-field limit of a network of FitzHugh-Nagumo neuron models; this mean-field limit was previously established in [3], but its slow-fast aspect had not been analysed. In particular, he proved a persistence result of Fenichel type for slow manifolds in this mean-field limit, thus extending previous work by Berglund et al. [35, 36]. A manuscript is in preparation.
In a second project, he studied a network of Wilson-Cowan systems whose mean-field limit is an ODE, and investigated elliptic bursting dynamics in both the network and the limit: its slow-fast dissection, its singular limits and the role of canards. In passing, he obtained a new characterisation of elliptic bursting via the construction of periodic limit sets using both the slow and the fast singular limits, and unravelled a new singular-limit scenario giving rise to elliptic bursting via a new type of torus canard orbits. This work has been published in Chaos and is available as [22] (see below).
6.2.6 Theoretical and Computational methods for Stochastic Differential Equations describing networks of Hopfield neurons
Participants: Olivier Faugeras [Inria MathNeuro], James MacLaurin [NJIT, USA], Etienne Tanré [Inria].
We investigate large networks of Hopfield neurons under various assumptions about the underlying network graphs (completely connected, sparsely connected, small-world), the synaptic coefficients (Gaussian or non-Gaussian distributed, independent, correlated), and the neuronal populations present in the network (excitatory, inhibitory, balanced). These assumptions generate a large variety of different behaviours that we are trying to analyse mathematically and numerically. The mathematics includes the description of the thermodynamic (mean-field) limit of these networks, when it exists, the type of solutions of the limit equations, their bifurcations with respect to changes of the network parameters, and the fluctuations of the solutions to the network equations around the thermodynamic limit, to understand finite-size effects. Along with these theoretical efforts, we are developing simulation tools in the Julia language, with an eye on parallel implementations on GPUs, to develop an intuition for the behaviours of these networks and guide the mathematical analysis.
6.3 Neural fields theory
6.3.1 A cortical-inspired sub-Riemannian model for Poggendorff-type visual illusions
Participants: Emre Baspinar [Inria MathNeuro], Luca Calatroni [Inria Morpheme], Valentina Franceschi [University of Padova, Italy], Dario Prandi [CNRS, Université Paris-Saclay, CentraleSupélec].
We consider Wilson-Cowan-type models for the mathematical description of orientation-dependent Poggendorff-like illusions. Our modelling improves on two previously proposed cortical-inspired approaches by embedding the sub-Riemannian heat kernel into the neuronal interaction term, in agreement with the intrinsically anisotropic functional architecture of V1, based on both local and lateral connections. For the numerical realisation of both models, we consider standard gradient descent algorithms combined with Fourier-based approaches for the efficient computation of the sub-Laplacian evolution. Our numerical results show that the use of the sub-Riemannian kernel allows us to reproduce numerically visual misperceptions and inpainting-type biases more strongly than the previous approaches. This work has been published in the Journal of Imaging and is available as [23].
6.4 Slow-fast dynamics in Neuroscience
6.4.1 Canonical models for torus canards in elliptic bursters
Participants: Emre Baspinar [Inria MathNeuro], Daniele Avitabile [VU Amsterdam, Inria MathNeuro], Mathieu Desroches.
We revisit elliptic bursting dynamics from the viewpoint of torus canard solutions. We show that, at the transition to and from elliptic bursting, classical or mixed-type torus canards can appear, the difference between the two being the fast-subsystem bifurcation that they approach: saddle-node of cycles for the former and subcritical Hopf for the latter. We first showcase such dynamics in a Wilson-Cowan-type elliptic bursting model, then we consider minimal models for elliptic bursters in view of finding transitions to and from bursting solutions via both kinds of torus canards. We first consider the canonical model proposed by Izhikevich (ref. [22] in the manuscript) and adapted to elliptic bursting by Ju, Neiman and Shilnikov (ref. [24] in the manuscript), and we show that it does not produce mixed-type torus canards due to a non-generic transition at one end of the bursting regime. We therefore introduce a perturbative term in the slow equation, which extends this canonical form to a new one that we call the Leidenator and which supports the right transitions to and from elliptic bursting via classical and mixed-type torus canards, respectively. Throughout the study, we use singular flows ($\epsilon = 0$) to predict the full system's dynamics ($\epsilon > 0$ small enough). We consider three singular flows, slow, fast and average slow, so as to appropriately construct singular orbits corresponding to all relevant dynamics pertaining to elliptic bursting and torus canards. This work has been published in Chaos and is available as [22].
6.4.2 Spike-adding and reset-induced canard cycles in adaptive integrate-and-fire models
Participants: Mathieu Desroches [Inria MathNeuro], Piotr Kowalczyk [University of Wrocław, Poland], Serafim Rodrigues [Basque Center for Applied Mathematics, Spain].
We study a class of planar integrate-and-fire models called adaptive integrate-and-fire (AIF) models, which possess an adaptation variable on top of the membrane potential, and whose subthreshold dynamics is piecewise linear. These AIF models therefore have two reset conditions, which enable bursting dynamics to emerge for suitable parameter values. Such models can be thought of as hybrid dynamical systems. We consider a particular slow dynamics within AIF models and prove the existence of bursting cycles with N resets, for any integer N. Furthermore, we study the transition between N- and (N+1)-reset cycles upon vanishingly small parameter variations and prove (for N = 2) that such transitions are organised by canard cycles. Finally, using numerical continuation, we compute branches of bursting cycles, including canard-explosive branches, in these AIF models, by suitably recasting the periodic problem as a two-point boundary-value problem. This work has been published in Nonlinear Dynamics and is available as [26].
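To illustrate the hybrid (flow-plus-reset) structure of adaptive integrate-and-fire models, here is a generic planar sketch with a single voltage reset that also kicks the adaptation variable. It does not reproduce the paper's specific piecewise-linear vector field or its second reset condition, so it only conveys the general mechanism; all parameter values are illustrative:

```python
import numpy as np

def adaptive_if_spikes(I=2.0, eps=0.05, d=0.2, v_th=1.0, v_r=0.0,
                       dt=1e-3, T=200.0):
    """Generic planar adaptive integrate-and-fire model (hybrid system):
       v' = -v + I - w,   w' = -eps * w     (subthreshold flow),
       with reset v <- v_r, w <- w + d whenever v reaches v_th.
       Returns the spike (reset) times."""
    n = int(T / dt)
    v, w = v_r, 0.0
    spikes = []
    for k in range(n):
        v += dt * (-v + I - w)
        w += dt * (-eps * w)
        if v >= v_th:
            v = v_r          # voltage reset
            w += d           # adaptation kick at each spike
            spikes.append(k * dt)
    return np.array(spikes)
```

With slow adaptation (small `eps`), each reset raises `w` and stretches the next inter-spike interval, a simple caricature of the slow dynamics that, in the AIF models of the paper, organises N-reset bursting cycles and their canard-mediated transitions.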
6.4.3 Bursting in a next generation neural mass model with synaptic dynamics: a slow-fast approach
Participants: Halgurd Taher [Inria MathNeuro], Daniele Avitabile [VU Amsterdam, Netherlands, Inria MathNeuro], Mathieu Desroches.
We report a detailed analysis on the emergence of bursting in a recently developed neural mass model that takes short-term synaptic plasticity into account. The model used here is particularly important, as it represents an exact mean-field limit of synaptically coupled quadratic integrate-and-fire neurons, a canonical model for type I excitability. In the absence of synaptic dynamics, a periodic external current with a slow frequency $\epsilon$ can lead to burst-like dynamics. The firing patterns can be understood using techniques of singular perturbation theory, specifically slow-fast dissection. In the model with synaptic dynamics, the separation of timescales leads to a variety of slow-fast phenomena, and their role for bursting becomes considerably more intricate. Canards are one of the main slow-fast elements on the route to bursting. They describe trajectories evolving near otherwise repelling locally invariant sets of the system and are found in the transition region from subthreshold dynamics to bursting. For values of the timescale separation near the singular limit $\epsilon = 0$, we report peculiar jump-on canards, which block a continuous transition to bursting. In the biologically more plausible regime of $\epsilon$, this transition becomes continuous and bursts emerge via consecutive spike-adding transitions. The onset of bursting is of complex nature and involves mixed-type-like torus canards, which form the very first spikes of the burst and revolve near fast-subsystem repelling limit cycles. We provide numerical evidence for the same mechanisms to be responsible for the emergence of bursting in the quadratic integrate-and-fire network with plastic synapses. The main conclusions apply for the network, owing to the exactness of the mean-field limit. This work has been submitted for publication and is available as [34].
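The basic scenario without synaptic dynamics (a slow periodic current driving the exact QIF mean-field back and forth through a bistable region, producing burst-like alternation between high- and low-activity episodes) can be sketched as follows. The equations are the Montbrió-Pazó-Roxin mean-field without the short-term plasticity variable of the paper, and all parameter values are illustrative:

```python
import numpy as np

def qif_slow_forcing(eta=-5.0, J=15.0, delta=1.0, A=3.0, omega=0.05,
                     dt=1e-3, T=300.0):
    """QIF mean-field (Montbrio-Pazo-Roxin equations)
       r' = delta/pi + 2*r*v
       v' = v**2 + eta + J*r + I(t) - (pi*r)**2
       driven by a slow current I(t) = A*sin(omega*t). Slow passage back
       and forth through a bistable region yields burst-like episodes of
       high firing rate r separated by near-quiescent phases."""
    n = int(T / dt)
    r, v = 0.05, -2.0
    rates = np.empty(n)
    for k in range(n):
        I = A * np.sin(omega * k * dt)
        r_new = r + dt * (delta / np.pi + 2.0 * r * v)
        v_new = v + dt * (v**2 + eta + J*r + I - (np.pi * r)**2)
        r, v = r_new, v_new
        rates[k] = r
    return rates
```

The hysteresis loop between the two branches of equilibria is the skeleton onto which the slow-fast dissection, spike-adding and torus-canard analysis of the full model (with the extra synaptic variable) is built.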
6.5 Mathematical modeling of neuronal excitability
6.5.1 Initiation of migraine-related cortical spreading depolarization by hyperactivity of GABAergic neurons and NaV1.1 channels
Participants: Oana Chever [IPMC, Sophia Antipolis and Université Côte d'Azur, France], Sarah Zerimeh [IPMC, Sophia Antipolis and Université Côte d'Azur, France], Paolo Scalmani [Foundation IRCCS Neurological Institute Carlo Besta, Milan, Italy], Louisiane Lemaire [Inria MathNeuro], Lara Pizzamiglio [IPMC, Sophia Antipolis and Université Côte d'Azur, France], Alexandre Loucif [IPMC, Sophia Antipolis and Université Côte d'Azur, France], Marion Ayrault [IPMC, Sophia Antipolis and Université Côte d'Azur, France], Martin Krupa [LJAD, Inria MathNeuro], Mathieu Desroches [Inria MathNeuro], Fabrice Duprat [IPMC, Sophia Antipolis and Université Côte d'Azur, France], Isabelle Léna [IPMC, Sophia Antipolis and Université Côte d'Azur, France], Sandrine Cestèle [IPMC, Sophia Antipolis and Université Côte d'Azur, France], Massimo Mantegazza [IPMC, Sophia Antipolis, Université Côte d'Azur and INSERM, France].
Spreading depolarizations (SDs) are involved in migraine, epilepsy, stroke, traumatic brain injury, and subarachnoid hemorrhage. However, their cellular origin and the specific differential mechanisms are not clear. Increased glutamatergic activity is thought to be the key factor for generating cortical spreading depression (CSD), a pathological mechanism of migraine. Here, we show that acute pharmacological activation of $Na{V}_{1.1}$ (the main Na${}^{+}$ channel of interneurons) or optogenetically induced hyperactivity of GABAergic interneurons is sufficient to ignite CSD in the neocortex through spiking-generated extracellular K${}^{+}$ build-up. Neither GABAergic nor glutamatergic synaptic transmission was required for CSD initiation. CSD was not generated in other brain areas, suggesting that this is a neocortex-specific mechanism of CSD initiation. Gain-of-function mutations of $Na{V}_{1.1}$ (SCN1A) cause familial hemiplegic migraine type 3 (FHM3), a subtype of migraine with aura, of which CSD is the neurophysiological correlate. Our results provide the mechanism linking $Na{V}_{1.1}$ gain of function to CSD generation in FHM3. Thus, we reveal the key role of hyperactivity of GABAergic interneurons in a mechanism of CSD initiation, which is relevant as a pathological mechanism of $Na{V}_{1.1}$ FHM3 mutations, and possibly also for other types of migraine and diseases in which SDs are involved. This work has been published in the Journal of Clinical Investigation and is available as 25.
The extension of this work is the topic of the PhD of Louisiane Lemaire, started in October 2018 and successfully defended in December 2021. A first part of Louisiane's PhD consisted in improving and extending the model published in 11 in a number of ways: replacing the GABAergic neuron model used in 11, namely the Wang-Buzsáki model, by a more recent fast-spiking cortical interneuron model due to Golomb and collaborators; implementing the effect of the Hm1a toxin used by M. Mantegazza to mimic the genetic mutation of sodium channels responsible for the hyperactivity of the GABAergic neurons; and taking into account ionic concentration dynamics (relaxing the hypothesis of constant reversal potentials) for the GABAergic neuron as well, whereas in 11 this was done only for the pyramidal neuron. Furthermore, another mutation of this sodium channel leads to hyperactivity of the pyramidal neurons in a way that is akin to epileptiform activity. The model by Louisiane Lemaire has been extended in order to account for this pathological scenario as well. This required a great deal of modelling and calibration, and the simulation results are closer to the actual experiments by Mantegazza than in our previous study. An article has been published (28, see below).
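As a toy illustration of the mechanism at play (not the published microcircuit model, which tracks several ionic species in both neurons), one can caricature the extracellular potassium bookkeeping as a single linear balance between spiking-driven influx and glial/diffusive clearance; all parameter values below are hypothetical.

```python
import numpy as np

# Caricature of extracellular [K+] dynamics: each spike adds a fixed
# amount of K+ to the extracellular space; buffering/diffusion relax
# [K+]o back toward a bath value. Parameters are illustrative only.

def k_out_trajectory(rate_hz, dK_per_spike=0.02, tau_clear=5.0,
                     k_bath=3.5, T=60.0, dt=1e-3):
    """Extracellular [K+] (mM) driven by a fixed firing rate (Hz)."""
    n = int(T / dt)
    k_o = k_bath
    trace = np.empty(n)
    for i in range(n):
        influx = dK_per_spike * rate_hz            # mM/s from spiking
        clearance = (k_o - k_bath) / tau_clear     # mM/s buffering
        k_o += dt * (influx - clearance)
        trace[i] = k_o
    return trace

# Interneuron hyperactivity (high rate) pushes [K+]o past a putative
# CSD-ignition level (~12 mM is often quoted), normal activity does not.
normal = k_out_trajectory(rate_hz=10.0)
hyper = k_out_trajectory(rate_hz=150.0)
```

The design point is simply that the steady state scales linearly with the firing rate (here k_bath + dK_per_spike * rate * tau_clear), so sufficiently strong spiking crosses any fixed ignition threshold.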
6.5.2 Modeling NaV1.1/SCN1A sodium channel mutations in a microcircuit with realistic ion concentration dynamics suggests differential GABAergic mechanisms leading to hyperexcitability in epilepsy and hemiplegic migraine
Participants: Louisiane Lemaire [Inria MathNeuro], Martin Krupa [LJAD, Inria MathNeuro], Mathieu Desroches [Inria MathNeuro], Lara Pizzamiglio [IPMC, Sophia Antipolis, Université Côte d'Azur and INSERM, France], Paolo Scalmani [Foundation IRCCS Neurological Institute Carlo Besta, Milan, Italy], Massimo Mantegazza [IPMC, Sophia Antipolis, Université Côte d'Azur and INSERM, France].
Loss-of-function mutations of SCN1A, the gene coding for the voltage-gated sodium channel $Na{V}_{1.1}$, cause different types of epilepsy, whereas gain-of-function mutations cause sporadic and familial hemiplegic migraine type 3 (FHM3). However, it is not yet clear how these opposite effects can induce the paroxysmal pathological activities, involving network hyperexcitability, that are specific to epilepsy (seizures) or migraine (cortical spreading depolarization, CSD). To better understand the differential mechanisms leading to the initiation of these pathological activities, we used a two-neuron conductance-based model of interconnected GABAergic and pyramidal glutamatergic neurons, in which we incorporated ionic concentration dynamics in both neurons. We modeled FHM3 mutations by increasing the persistent sodium current in the interneuron and epileptogenic mutations by decreasing the sodium conductance in the interneuron. We could therefore study both FHM3 and epileptogenic mutations within the same framework, modifying only two parameters. In our model, the key effect of gain-of-function FHM3 mutations is the modification of ion fluxes at each action potential (in particular the larger activation of voltage-gated potassium channels induced by the $Na{V}_{1.1}$ gain of function), and the resulting CSD-triggering extracellular potassium accumulation, which is not caused only by modifications of the firing frequency. Loss-of-function epileptogenic mutations, on the other hand, increase the GABAergic neuron's susceptibility to depolarization block, without major modifications of the firing frequency before it. Our modeling results connect qualitatively to experimental data: potassium accumulation in the case of FHM3 mutations, and facilitated depolarization block of the GABAergic neuron in the case of epileptogenic mutations. Both effects can lead to pyramidal-neuron hyperexcitability, inducing, in the migraine condition, depolarization block of both the GABAergic and the pyramidal neuron.
Overall, our findings suggest different mechanisms of network hyperexcitability for migraine and epileptogenic $Na{V}_{1.1}$ mutations, implying that modifications of the firing frequency may not be the only relevant pathological mechanism. This work has been published in PLoS Computational Biology and is available as 28.
6.5.3 Mathematical modeling of the neurotransmitter cycle, in connection with neuronal excitability and plasticity in both healthy and pathological states
Participants: Afia B Ali [School of Pharmacy, University College London, UK], Mathieu Desroches, Serafim Rodrigues, Mattia Sensi.
The project is part of a long-standing collaboration between Mathieu Desroches (MathNeuro Project-Team, Inria) and Serafim Rodrigues (MCEN research group, Basque Center for Applied Mathematics, Bilbao, Spain) on neurotransmission and its possible disruptions 17 (see also 10). Mattia Sensi was recruited as a postdoc in MathNeuro at the end of 2021 to work on this project. The work will be organised around a modeling project on multi-timescale aspects of synaptic transmission, which can be synchronous, asynchronous or spontaneous. Building on our already existing modeling platform, we will focus on two main extensions: the endocytic part of the neurotransmitter cycle, and the integration of glial activity into the model. This project is also related to the NeuroTransSF associated team between MathNeuro and MCEN, and the extended model will be used (within a time horizon that goes beyond the postdoc of Mattia Sensi) in the context of experiments performed by Afia Ali on early effects of Alzheimer's Disease.
6.6 Mathematical modeling of memory processes
6.6.1 Dynamic branching in a neural network model for probabilistic prediction of sequences
Participants: Elif Köksal Ersöz [INSERM, Rennes], Pascal Chossat [CNRS and Inria MathNeuro], Martin Krupa [LJAD and Inria MathNeuro], Frédéric Lavigne [BCL, Université Côte d'Azur].
An important function of the brain is to adapt behavior by selecting between different predictions of sequences of stimuli likely to occur in the environment. The present research studied the branching behavior of a computational network model of populations of excitatory and inhibitory neurons, both analytically and through simulations. Results show how synaptic efficacy, retroactive inhibition and short-term synaptic depression determine the dynamics of choices between different predictions of sequences having different probabilities. Further results show that changes in the probability of the different predictions depend on variations of neuronal gain. Such variations allow the network to optimize the probability of its predictions to changing probabilities of the sequences without changing synaptic efficacy. This work has been submitted for publication and is available as 31.
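The short-term synaptic depression entering such network models can be illustrated with the classical rate-based Tsodyks-Markram resource variable (a generic sketch, not the exact equations of the paper): a fraction x of synaptic resources recovers with time constant tau_d and is consumed by presynaptic activity, so sustained firing weakens the synapse carrying the currently active item and lets a competitor take over.

```python
# Generic rate-based Tsodyks-Markram depression sketch (illustrative
# parameters, not the paper's): dx/dt = (1 - x)/tau_d - U * r * x,
# where x is the fraction of available resources, U the release
# fraction, and r the presynaptic firing rate in Hz.

def depression_resource(rate_hz, U=0.2, tau_d=0.5, T=10.0, dt=1e-3):
    """Steady-state resource fraction under a constant firing rate."""
    x = 1.0
    for _ in range(int(T / dt)):
        x += dt * ((1.0 - x) / tau_d - U * rate_hz * x)
    return x

x_ss = depression_resource(rate_hz=20.0)
```

The analytic steady state is x* = 1 / (1 + U * r * tau_d), so higher presynaptic rates leave fewer resources available, which is the mechanism that terminates the currently winning prediction.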
6.7 Modelling the visual system
6.7.1 Spatial and color hallucinations in a mathematical model of primary visual cortex
Participants: Olivier Faugeras [Inria MathNeuro], Anna Song [Imperial College London, UK], Romain Veltz [Inria MathNeuro].
We study a simplified model of the representation of colors in the primate primary cortical visual area V1. The model is described by an initial value problem related to a Hammerstein equation. The solutions to this problem represent the variation of the activity of populations of neurons in V1 as a function of space and color. The two space variables describe the spatial extent of the cortex while the two color variables describe the hue and the saturation represented at every location in the cortex. We prove the well-posedness of the initial value problem. We focus on its stationary (i.e., time-independent) solutions that are periodic in space. We show that the model equation is equivariant with respect to the direct product G of the group of the Euclidean transformations of the planar lattice determined by the spatial periodicity and the group of color transformations, isomorphic to $O\left(2\right)$, and study the equivariant bifurcations of its stationary solutions when some parameters in the model vary. Their variations may be caused by the consumption of drugs and the bifurcated solutions may represent visual hallucinations in space and color. Some of the bifurcated solutions can be determined by applying the Equivariant Branching Lemma (EBL), namely by determining the axial subgroups of G. These define bifurcated solutions which are invariant under the action of the corresponding axial subgroup. We compute these solutions analytically and illustrate them as color images. Using advanced methods of numerical bifurcation analysis, we then explore the persistence and stability of these solutions when varying some parameters in the model. We conjecture that we can rely on the EBL to predict the existence of patterns that survive in large parameter domains, but not to predict their stability.
Along the way, we discover the existence of spatially localized stable patterns through the phenomenon of "snaking". This work has been accepted for publication in Comptes Rendus Mathématique and is available as 30.
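Schematically, the class of models studied is of Amari/Wilson-Cowan type; our paraphrase (not the paper's exact equation) of such a Hammerstein-type evolution for the activity $a$ over space $x\in\Omega$ and color $c\in\mathcal{C}$ reads

```latex
\partial_t a(x,c,t) \;=\; -\mu\, a(x,c,t)
  \;+\; \int_{\Omega\times\mathcal{C}} w(x,c;\,x',c')\,
        S\!\big(a(x',c',t)\big)\,\mathrm{d}x'\,\mathrm{d}c' ,
```

where $S$ is a sigmoidal firing-rate function and $w$ a connectivity kernel. Stationary solutions solve the corresponding Hammerstein integral equation, and the equivariance discussed above is the statement that the right-hand side commutes with the action of $G$ on $(x,c)$.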
7 Partnerships and cooperations
Participants: Fabien Campillo [Inria MathNeuro], Mathieu Desroches [Inria MathNeuro], Guillaume Girier [cotutelle PhD Student, BCAM, Spain and Inria MathNeuro], Serafim Rodrigues [Ikerbasque & BCAM, Spain].
7.1 International initiatives
7.1.1 Inria associated team not involved in an IIL or an international program
NeuroTransSF

Title:
NeuroTransmitter cycle: A SlowFast modeling approach

Duration:
from 2019 to 2022

Coordinator:
Serafim Rodrigues (srodrigues@bcamath.org)

Partners:
Basque Center for Applied Mathematics (BCAM, Bilbao, Spain)

Inria contact:
Mathieu Desroches

Summary:
This associated team project proposes to deepen the links between two young research groups around strong neuroscience themes. The project starts from a joint work in which we successfully modeled synaptic transmission delays for both excitatory and inhibitory synapses, matching experimental data, and aims to extend it in two distinct directions. On the one hand, by modeling endocytosis so as to obtain a complete mathematical formulation of the presynaptic neurotransmitter cycle, which will then be integrated within diverse neuron models (in particular interneurons), hence allowing a refined analysis of their excitability and short-term plasticity properties. On the other hand, by modeling the postsynaptic neurotransmitter cycle in connection with long-term plasticity and memory. We will incorporate these new models of synapse in different types of neuronal networks and we will then study their excitability, plasticity and synchronization properties in comparison with classical models. This project benefits from strong experimental collaborations (UCL, Alicante) and is coupled to the study of brain pathologies linked with synaptic dysfunctions, in particular certain early signs of Alzheimer's Disease. Our initiative also contains a training aspect, with two PhD students involved as well as a series of mini-courses which we will propose to the partner institute on this research topic; we will also organize a wrap-up workshop in Sophia Antipolis at the end of the project. Finally, the project is embedded within a strategic tightening of our links with Spain, with the objective of pushing towards the creation of a Southern-European network for Mathematical, Computational and Experimental Neuroscience, which will serve as a stepping stone to extend our influence beyond Europe. The web page of the associated team contains the latest developments and new collaborations that have emerged from this project.
7.2 International research visitors
7.2.1 Visits of international scientists
Other international visits to the team
Ernest Montbrió

Status:
Associate Professor

Institution of origin:
Universitat Pompeu Fabra, Barcelona

Country:
Spain

Dates:
8-10 December 2021

Context of the visit:
He visited the team to participate in the thesis jury of Halgurd Taher (he was one of the reviewers of the thesis) as well as to initiate a research collaboration with Mathieu Desroches.

Mobility program/type of mobility:
research stay
7.2.2 Visits to international teams
Research stays abroad
Mathieu Desroches

Visited institution:
BCAM, Bilbao

Country:
Spain

Dates:
16-25 October 2021

Context of the visit:
Collaboration with S. Rodrigues on the associated team NeuroTransSF project.

Mobility program/type of mobility:
research stay
7.3 European initiatives
7.3.1 FP7 & H2020 projects
Human Brain Project (HBP)

Title:
Human Brain Project Specific Grant Agreement 3

Duration:
3 years (March 2020 to March 2023)

Coordinator:
EPFL

Partners:
See the web page of the project. Olivier Faugeras is leading the task T4.1.3 entitled "Mean-field and population models" of the Work Package W4.1 "Bridging Scales".

Summary:
Understanding the human brain is one of the greatest challenges facing 21st-century science. If we can rise to the challenge, we can gain profound insights into what makes us human, develop new treatments for brain diseases and build revolutionary new computing technologies. Today, for the first time, modern ICT has brought these goals within sight. The goal of the Human Brain Project, part of the FET Flagship Programme, is to translate this vision into reality, using ICT as a catalyst for a global collaborative effort to understand the human brain and its diseases, and ultimately to emulate its computational capabilities. The Human Brain Project will last ten years and will consist of a ramp-up phase (from month 1 to month 36) and subsequent operational phases. This Grant Agreement covers the ramp-up phase. During this phase the strategic goals of the project will be to: design, develop and deploy the first versions of six ICT platforms dedicated to Neuroinformatics, Brain Simulation, High Performance Computing, Medical Informatics, Neuromorphic Computing and Neurorobotics, and create a user community of research groups from within and outside the HBP; set up a European Institute for Theoretical Neuroscience; complete a set of pilot projects providing a first demonstration of the scientific value of the platforms and the Institute; develop the scientific and technological capabilities required by future versions of the platforms; implement a policy of Responsible Innovation and a programme of transdisciplinary education; and develop a framework for collaboration that links the partners under strong scientific leadership and professional project management, providing a coherent European approach and ensuring effective alignment of regional, national and European research programmes. The project work plan is organized in the form of thirteen subprojects, each dedicated to a specific area of activity.
A significant part of the budget will be used for competitive calls to complement the collective skills of the Consortium with additional expertise.
8 Dissemination
8.1 Promoting scientific activities
8.1.1 Scientific events: organisation
General chair, scientific chair

Fabien Campillo
is a founding member of the African Scholarly Society on Digital Sciences (ASDS).
Member of the organizing committees

Mathieu Desroches
was a member of the organizing committee of the international conference "Dynamics Days Europe" 2021, held in Nice on 23-27 August 2021.

Louisiane Lemaire
was a member of the local organizing committee of the international conference "Dynamics Days Europe" 2021, held in Nice on 23-27 August 2021.

Halgurd Taher
was a member of the local organizing committee of the international conference "Dynamics Days Europe" 2021, held in Nice on 23-27 August 2021.
8.1.2 Journal
Member of the editorial boards

Mathieu Desroches
is a Review Editor of the journal Frontiers in Physiology (Impact Factor: 3.4).

Olivier Faugeras
is founder and editor-in-chief of the journal Mathematical Neuroscience and Applications, published under the diamond open-access model (no charges for authors or readers) and supported by Episciences.
Reviewer - reviewing activities

Fabien Campillo
acted as a reviewer for the Journal of Mathematical Biology.

Mathieu Desroches
acted as a reviewer for the journals SIAM Journal on Applied Dynamical Systems (SIADS), Journal of Mathematical Biology, PLoS Computational Biology, Journal of Nonlinear Science, Nonlinear Dynamics, Physica D, Applied Mathematical Modelling, and the Journal of the European Mathematical Society.

Martin Krupa
acted as a reviewer for Proceedings of the National Academy of Sciences of the USA and SIAM Journal on Applied Dynamical Systems (SIADS).
8.1.3 Invited talks

Mathieu Desroches
gave a webinar talk entitled "Canards in neural networks and their mean-field limits" at the Lowlands Dynamics Seminar of VU Amsterdam, 25 November 2021.

Louisiane Lemaire
gave a webinar talk entitled "Mathematical Model of the Mutations of a Sodium Channel (NaV1.1) Capturing Both Migraine and Epilepsy Scenarios" at the SIAM Conference on Applications of Dynamical Systems (DS21), 27 May 2021.

Louisiane Lemaire
gave a webinar talk entitled "Mathematical Model of the Mutations of a Sodium Channel (NaV1.1) Capturing Both Migraine and Epilepsy Scenarios" at

the virtual Society for Mathematical Biology annual meeting (SMB21), 14 June 2021;

the 2021 International Conference on Mathematical Neuroscience (Digital Edition), 28 June 2021;

the International Conference on Spreading Depolarization, 21 September 2021.



Halgurd Taher
gave a webinar talk entitled "Emergence of Bursting and SlowFast Phenomena in a Next Generation Neural Mass Model with ShortTerm Synaptic Plasticity" at the SIAM Conference on Applications of Dynamical Systems (DS21), 27 May 2021.

Halgurd Taher
gave a webinar talk entitled "Bursting in a next generation neural mass model with synaptic dynamics: a slowfast approach" at the virtual Society for Mathematical Biology annual meeting (SMB21), 15 June 2021.

Halgurd Taher
gave a webinar talk entitled "Exact neural mass model for synapticbased working memory" at

the 2021 International Conference on Mathematical Neuroscience (Digital Edition), 28 June 2021;

the 2021 CNS conference, 7 July 2021;

the Dynamics Days Europe 2021 Conference, 27 August 2021;

the Bernstein Conference, 22 September 2021.


8.1.4 Leadership within the scientific community

Fabien Campillo
was member of the local committee in charge of the scientific selection of visiting scientists (Comité NICE).

Mathieu Desroches
was on the Scientific Committee of the Complex Systems Academy of the UCA JEDI Idex.
8.2 Teaching  Supervision  Juries
8.2.1 Teaching

Master:
Emre Baspinar, Dynamical Systems in the context of neuron models (Lectures, example classes and computer labs), 6 hours (Jan. 2021), M1 (Mod4NeuCog), Université Côte d'Azur, Sophia Antipolis, France.

Master:
Mathieu Desroches, Modèles Mathématiques et Computationnels en Neuroscience (Lectures, example classes and computer labs), 12 hours (Feb. 2021), M1 (BIM), Sorbonne Université, Paris, France.

Master:
Mathieu Desroches, Dynamical Systems in the context of neuron models (Lectures, example classes and computer labs), 6 hours (Jan. 2021) and 9 hours (Nov.-Dec. 2021), M1 (Mod4NeuCog), Université Côte d'Azur, Sophia Antipolis, France.

Master:
Louisiane Lemaire, Modèles Mathématiques et Computationnels en Neuroscience (example classes and computer labs), 6 hours (Feb 2021), M1 (BIM), Sorbonne Université, Paris, France.
8.2.2 Supervision

PhD completed:
Louisiane Lemaire, "Multiscale mathematical modeling of cortical spreading depression", co-supervised by Mathieu Desroches and Martin Krupa, successfully defended on 13 December 2021.

PhD completed:
Yuri Rodrigues, "Unifying experimental heterogeneity in a geometrical synaptic plasticity model", co-supervised by Romain Veltz and Hélène Marie (IPMC, Sophia Antipolis), successfully defended on 30 June 2021.

PhD completed:
Halgurd Taher, "Next generation neural mass models: working memory, all-brain modelling and multi-timescale phenomena", co-supervised by Mathieu Desroches and Simona Olmi, successfully defended on 9 December 2021.

Master 1 internship:
Efstathios Pavlidis, "Analysis of Mixed Mode Oscillations using multiple timescale dynamics: two case studies", co-supervised by Emre Baspinar, Mathieu Desroches and Martin Krupa, April-May 2020.

Master 2 internship:
Efstathios Pavlidis, "Multiple-timescale dynamics and mixed states in a model of Bipolar Disorder", supervised by Mathieu Desroches, September 2021 to February 2022.
8.2.3 Juries

Mathieu Desroches
was member of the jury and reviewer of the PhD of Mattia Sensi entitled "A Geometric Singular Perturbation approach to epidemic compartmental models", supervised by Andrea Pugliese, University of Trento (Italy), 18 January 2021.

Mathieu Desroches
was member of the jury of the PhD of Lou Zonca entitled "Modeling and analytical computations of burst and interburst dynamics in neuronal networks, applications to neuronglia interactions and oscillatory brain rhythms", supervised by David Holcman, ENS Paris, 16 July 2021.
9 Scientific production
9.1 Major publications
1. Latching dynamics in neural networks with synaptic depression. PLoS ONE 12(8), August 2017, e0183710.
2. Spatio-temporal canards in neural field equations. Physical Review E 95(4), April 2017, 042205.
3. Mean-field description and propagation of chaos in networks of Hodgkin-Huxley neurons. The Journal of Mathematical Neuroscience 2(1), 2012, 10. URL: http://www.mathematicalneuroscience.com/content/2/1/10
4. A sub-Riemannian model of the visual cortex with frequency and phase. The Journal of Mathematical Neuroscience 10(1), December 2020.
5. Links between deterministic and stochastic approaches for invasion in growth-fragmentation-death models. Journal of Mathematical Biology 73(6-7), 2016, 1781-1821. URL: https://hal.archives-ouvertes.fr/hal-01205467
6. Weak convergence of a mass-structured individual-based model. Applied Mathematics & Optimization 72(1), 2015, 37-73. URL: https://hal.inria.fr/hal-01090727
7. Analysis and approximation of a stochastic growth model with extinction. Methodology and Computing in Applied Probability 18(2), 2016, 499-515. URL: https://hal.archives-ouvertes.fr/hal-01817824
8. Effect of population size in a predator-prey model. Ecological Modelling 246, 2012, 1-10. URL: https://hal.inria.fr/hal-00723793
9. Heteroclinic cycles in Hopfield networks. Journal of Nonlinear Science, January 2016.
10. Short-term synaptic plasticity in the deterministic Tsodyks-Markram model leads to unpredictable network dynamics. Proceedings of the National Academy of Sciences of the United States of America 110(41), 2013, 16610-16615.
11. Modeling cortical spreading depression induced by the hyperactivity of interneurons. Journal of Computational Neuroscience, October 2019.
12. Canards, folded nodes and mixed-mode oscillations in piecewise-linear slow-fast systems. SIAM Review 58(4), November 2016, 653-691.
13. Mixed-Mode Bursting Oscillations: Dynamics created by a slow passage through spike-adding canard explosion in a square-wave burster. Chaos 23(4), October 2013, 046106.
14. Hopf bifurcation in a nonlocal nonlinear transport equation stemming from stochastic neural dynamics. Chaos, February 2017.
15. Neuronal mechanisms for sequential activation of memory items: Dynamics and reliability. PLoS ONE 15(4), 2020, 1-28.
16. Canard-induced complex oscillations in an excitatory network. November 2018, working paper or preprint.
17. Time-coded neurotransmitter release at excitatory and inhibitory synapses. Proceedings of the National Academy of Sciences of the United States of America 113(8), February 2016, E1108-E1115.
18. A new twist for the simulation of hybrid systems using the true jump method. December 2015, working paper or preprint.
19. A Center Manifold Result for Delayed Neural Fields Equations. SIAM Journal on Mathematical Analysis 45(3), 2013, 1527-1562.
20. A center manifold result for delayed neural fields equations. SIAM Journal on Applied Mathematics (under revision), Research Report RR-8020, July 2012.
21. Interplay Between Synaptic Delays and Propagation Delays in Neural Field Equations. SIAM Journal on Applied Dynamical Systems 12(3), 2013, 1566-1612.
9.2 Publications of the year
International journals
22. Canonical models for torus canards in elliptic bursters. Chaos: An Interdisciplinary Journal of Nonlinear Science 31(6), June 2021, 063129.
23. A cortical-inspired sub-Riemannian model for Poggendorff-type visual illusions. Journal of Imaging, February 2021.
24. Coherence resonance in neuronal populations: Mean-field versus network model. Physical Review E 103(3), March 2021.
25. Initiation of migraine-related cortical spreading depolarization by hyperactivity of GABAergic neurons and NaV1.1 channels. Journal of Clinical Investigation 131(21), September 2021, e142203.
26. Spike-adding and reset-induced canard cycles in adaptive integrate-and-fire models. Nonlinear Dynamics, May 2021.
27. Patient-specific network connectivity combined with a next generation neural mass model to test clinical hypothesis of seizure propagation. Frontiers in Systems Neuroscience 15, September 2021.
28. Modeling NaV1.1/SCN1A sodium channel mutations in a microcircuit with realistic ion concentration dynamics suggests differential GABAergic mechanisms leading to hyperexcitability in epilepsy and hemiplegic migraine. PLoS Computational Biology, July 2021.
Reports & preprints
29. Cross-scale excitability in networks of quadratic integrate-and-fire neurons. August 2021.
30. Spatial and color hallucinations in a mathematical model of primary visual cortex. October 2021.
31. Dynamic branching in a neural network model for probabilistic prediction of sequences. January 2022.
32. The one-step fixed-lag particle smoother as a strategy to improve the prediction step of particle filtering. December 2021.
33. A stochastic model of hippocampal synaptic plasticity with geometrical readout of enzyme dynamics. June 2021.
34. Bursting in a next generation neural mass model with synaptic dynamics: a slow-fast approach. January 2022.