
Large deviations for randomly connected neural networks: I. Spatially extended systems

Published online by Cambridge University Press:  16 November 2018

Tanguy Cabana
Affiliation:
Collège de France
Jonathan D. Touboul*
Affiliation:
Collège de France and Brandeis University
* Postal address: Department of Mathematics and Volen National Center for Complex Systems, Brandeis University, 415 South Street, Waltham, MA 02453, USA. Email address: [email protected]

Abstract

In a series of two papers, we investigate the large deviations and asymptotic behavior of stochastic models of brain neural networks with random interaction coefficients. In this first paper, we take into account the spatial structure of the brain: we first consider interaction delays that depend on the distance between cells, and then Gaussian random interaction amplitudes whose mean and variance depend on the positions of the neurons and scale as the inverse of the network size. We show that the empirical measure satisfies a large deviations principle with a good rate function reaching its minimum at a unique, spatially extended probability measure. This result implies averaged convergence of the empirical measure and propagation of chaos. The limit is characterized through a complex non-Markovian implicit equation in which the network interaction term is replaced by a nonlocal Gaussian process whose mean and covariance depend on the statistics of the solution over the whole neural field.
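As a reading aid, the following is a schematic form of the kind of spatially extended network the abstract describes; it is a sketch assembled from the abstract alone, and the symbols $f$, $S$, $\lambda$, $\tau$, $\bar J$, and $\sigma$ are illustrative rather than the paper's exact notation. Neurons $i = 1, \dots, N$ at positions $r_i$ in the neural field are coupled through Gaussian random weights and distance-dependent delays:

$$\mathrm{d}X^i_t = \Big( f(r_i, t, X^i_t) + \sum_{j=1}^{N} J_{ij}\, S\big(X^j_{t - \tau(r_i, r_j)}\big) \Big)\,\mathrm{d}t + \lambda(r_i)\,\mathrm{d}W^i_t, \qquad J_{ij} \sim \mathcal{N}\!\left(\tfrac{\bar J(r_i, r_j)}{N}, \tfrac{\sigma^2(r_i, r_j)}{N}\right),$$

where $\tau(r_i, r_j)$ is the interaction delay depending on the distance between cells, and both the mean and the variance of the weights scale as the inverse of the network size, as stated above. The large deviations principle concerns the empirical measure $\hat\mu_N = \frac{1}{N}\sum_{i=1}^N \delta_{X^i}$ as $N \to \infty$; in the limit, the interaction sum is replaced by the nonlocal Gaussian process mentioned in the abstract.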

Type
Original Article
Copyright
Copyright © Applied Probability Trust 2018 

