Fig. 8. How connectivity influences network dynamics. A, The Lyapunov exponent λ is plotted for each of T time steps for different values of the branching parameter σ. The branching parameter governs the sum of transmission probabilities from each unit of the network. When σ is close to the critical value of 1, dynamics is neutral, and λ hovers around 0. As σ is increased, dynamics becomes chaotic (λ > 0); as σ is decreased, dynamics becomes attractive (λ < 0). B, The distribution of transmission probabilities emanating from each unit also influences dynamics. Three different types of units are shown, representing the three different types of exponential distributions that were examined. Thick arrows represent high transmission probabilities. The top unit shows transmission probabilities when the exponent B is low and the distribution is homogeneous. In this case, each unit acts to disperse trajectories, causing chaotic dynamics. The middle unit corresponds to intermediate values of B, where one or two transmission probabilities are strong. Here, each unit acts to focus trajectories, but with some dispersion, causing neutral dynamics. The lower unit illustrates the highly skewed distribution caused by large values of B. Here one connection dominates and all the rest are essentially zero. Units with high-B distributions act to focus trajectories, leading to attractive dynamics. Figure 8A is modified from Haldeman and Beggs, 2005, copyright American Physical Society, and is reproduced with permission.
Inspired by their approach, we here use an exponential distribution whose sharpness can be tuned through an exponent B, and we explore how B affects λ. In these simulations, we use a network with 64 units that has 8 connections per unit. Qualitatively similar results obtain for networks with 64 connections per unit, suggesting that these findings are quite general.
For small B (0 < B < 1.0), distributions are nearly flat and each connection has roughly the same probability of transmission. In this case, activity coming into a unit will be spread widely and randomly to other connected units. This tends to disperse trajectories and leads to chaotic dynamics where λ > 0. For intermediate values of B (1.2 < B < 1.8), one or two connections have transmission probabilities that are much larger than all the rest. Here activity coming into a unit will tend to be transmitted to only one or two other units. This leads to propagation in which there is a balance of spreading and focus. While there is some variability in the paths that trajectories take, there is one path that is traveled most of the time. On average, dynamics tend to be neutral and λ ≈ 0. For large values of B (B > 1.8), one of the transmission probabilities is very near 1, while all of the others are near zero. So while a unit may receive convergent activation from two other units in the previous time step, it will almost always activate only one unit in the next time step. Under these conditions, units serve to bring different trajectories together, thus reducing distances over time and causing attractive dynamics with λ < 0. Together, these simulations show that the distribution of connection strengths can also set the dynamics of a network (Fig. 8B).
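The branching model just described can be sketched in a few lines. The code below is a minimal illustration, not the authors' actual simulation: the exponential profile (probabilities proportional to exp(−B·j), normalized to sum to σ), the random seed, and the use of the Hamming distance between two noise-matched trajectories as a crude stand-in for the Lyapunov exponent are all assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

N, K = 64, 8          # units, outgoing connections per unit (as in the text)
sigma, B = 1.0, 1.2   # branching parameter, distribution exponent (illustrative)

# Each unit sends to K random targets; transmission probabilities follow an
# exponential profile sharpened by B and are normalized to sum to sigma.
targets = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
w = np.exp(-B * np.arange(K))
probs = sigma * w / w.sum()

def step(state, noise):
    """Advance the binary state one step; noise is a shared (N, K) uniform draw."""
    nxt = np.zeros(N, dtype=bool)
    for i in np.flatnonzero(state):
        fired = noise[i] < probs          # which of unit i's connections transmit
        nxt[targets[i][fired]] = True
    return nxt

# Two trajectories that differ in a single unit, driven by identical noise
a = np.zeros(N, dtype=bool); a[:4] = True
b = a.copy(); b[4] = True
dists = []
for _ in range(50):
    noise = rng.random((N, K))
    a, b = step(a, noise), step(b, noise)
    d = np.count_nonzero(a != b)
    if d:
        dists.append(d)

# Crude analogue of the Lyapunov exponent: mean log-growth of the Hamming distance
if len(dists) > 1:
    lam = np.mean(np.diff(np.log(dists)))
    print(f"estimated exponent ~ {lam:.3f}")
```

Raising B sharpens the probability profile toward a single dominant connection, and lowering σ below 1 makes activity die out, so the sign of the estimated exponent tracks the regimes described in the text.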
How do changes in the number of connections affect dynamics? Although his work is not directly in the field of neural networks, Stuart Kauffman has pursued this question in models of gene regulatory networks. Since his studies are very likely to be relevant to our topic, we briefly mention them here. Kauffman and colleagues (Kauffman S, 1969; Kauffman S et al., 2003; Kauffman SA and S Johnsen, 1991) examine networks where each binary unit can be either on (1) or off (0), and where each unit performs some Boolean function (e.g., AND, OR) on its inputs. Units are connected randomly, and the number of connections into each unit is determined by an order parameter K. Kauffman shows in these random Boolean networks that when K > 3, trajectories are very sensitive to small perturbations and dynamics is chaotic. When K = 2, however, trajectories are stable with respect to perturbations and the networks appear to operate at a critical point (Bornholdt S and T Rohlf, 2000). For K < 2, nearly all trajectories quickly fall into attractors. Kauffman and others (Gutowitz H and C Langton, 1995) have suggested that K governs a phase transition in these networks, as it controls their dynamics. In some ways, high-K networks may be similar to the neural network model described above when the distribution exponent B is small and all transmission probabilities are nearly equal. For intermediate values of B, one or two transmission probabilities are strong, and this may correspond to the critical case where K = 2 in Kauffman's networks. These possible connections are intriguing and deserve further exploration.
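Random Boolean networks of this kind are simple to simulate. The sketch below is not Kauffman's original code; the network size, trial count, and the perturbation-spread measure are choices made here for illustration. It flips a single unit and tracks how far the perturbation has spread after a number of steps, for several values of K.

```python
import numpy as np

def make_rbn(N, K, rng):
    """Random Boolean network: each unit gets K random inputs and a random truth table."""
    inputs = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
    tables = rng.integers(0, 2, size=(N, 2 ** K), dtype=np.uint8)
    def step(state):
        idx = np.zeros(N, dtype=int)
        for j in range(K):                    # pack each unit's K input bits into an index
            idx = (idx << 1) | state[inputs[:, j]]
        return tables[np.arange(N), idx]
    return step

rng = np.random.default_rng(1)
N, T, trials = 200, 30, 20
for K in (1, 2, 3, 4):
    spread = []
    for _ in range(trials):
        step = make_rbn(N, K, rng)
        a = rng.integers(0, 2, size=N, dtype=np.uint8)
        b = a.copy()
        b[0] ^= 1                             # perturb a single unit
        for _ in range(T):
            a, b = step(a), step(b)
        spread.append(np.count_nonzero(a != b))
    print(f"K={K}: mean Hamming distance after {T} steps = {np.mean(spread):.1f}")
```

In runs of this kind the perturbation typically stays small for K below 2 and spreads through a sizable fraction of the network for K of 3 or more, mirroring the ordered-to-chaotic transition described in the text.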
But why should dynamics matter? The dynamical regime of a network can strongly influence the types of computations it is able to perform (Vogels TP et al., 2005). Many models and experiments suggest that local networks support attractive dynamics (Amit Y and M Mascaro, 2001; Brunel N, 2000; Hopfield JJ, 1982; Jin DZ, 2002; Seung HS, 1998; Wills TJ et al., 2005). As mentioned earlier, strongly attractive dynamics is naturally good for setting up attractor states in which long-term memories can be stably stored. Such dynamics is also desirable for pattern completion, since a fragment of a stored pattern can be used as a cue to get the network into a state where it is near a basin of attraction and likely to evolve into the stored memory configuration. Moreover, attractive dynamics supports computations that favor categorization, since it causes different stimuli to be grouped into the same response. For example, if a Wolfhound, a Chihuahua, and a Beagle were all represented by positions in state space, attractive dynamics could cause trajectories from these points to all flow together, making it easy to set up the category of "dog." But the stability conferred by attractive dynamics also makes it difficult to steer trajectories away from strong attractors. Networks dominated by attractive dynamics would seem to lack flexibility.
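Pattern completion by attractive dynamics is easy to demonstrate with a small Hopfield-style network. The sketch below is a generic textbook construction, not a model taken from this chapter; the network size, number of stored patterns, and corruption level are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 100, 3
patterns = rng.choice([-1, 1], size=(P, N))   # three random +/-1 memories

# Hebbian outer-product weights with zero self-connections (Hopfield 1982)
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(cue, steps=10):
    """Iterate the network; attractive dynamics pulls the state toward a stored pattern."""
    s = cue.astype(float).copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

# Cue: the first stored pattern with 30% of its units corrupted
cue = patterns[0].copy()
cue[rng.choice(N, size=30, replace=False)] *= -1

out = recall(cue)
print("overlap with stored pattern:", (out @ patterns[0]) / N)
```

Starting from a cue whose overlap with the memory is only 0.4, the trajectory flows into the basin of attraction and the final overlap is close to 1, which is the pattern-completion behavior described above.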
In contrast, chaotic dynamics supports computations that favor discrimination since subtle differences in stimuli can produce widely different responses. Here too, there are a number of models and experiments that suggest that chaotic dynamics are prevalent in the brain (Aitken PG et al., 1995; Babloyantz A and A Destexhe, 1986; Breakspear M et al., 2003; Freeman WJ, 1994; Schiff SJ et al., 1994; van Vreeswijk C and H Sompolinsky, 1996). This dynamics could be useful in sensory systems where there is a great need to notice details of the incoming information stream. For example, whether a rabbit stays and eats or rapidly flees may be determined by only a few blades of grass in the visual field that seem to be moving in an unusual way. There have also been proposals that chaotic processing units could be used to perform logical or arithmetic computations since such units are naturally nonlinear (Sinha S and WL Ditto, 1999). However, networks with trajectories that rapidly diverge are unstable unless they are controlled.
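Sensitive dependence on initial conditions, and the positive Lyapunov exponent that quantifies it, can be illustrated with the simplest chaotic system rather than a full network model. The logistic map below is a standard example chosen here for brevity; it is not a system discussed in the chapter itself.

```python
import math

r = 4.0                          # logistic map in its fully chaotic regime
x, y = 0.3, 0.3 + 1e-10          # two nearly identical initial conditions
sep = []
for _ in range(60):
    x, y = r * x * (1 - x), r * y * (1 - y)
    sep.append(abs(x - y))

# While the separation is tiny it grows roughly as exp(lambda * t); the mean
# log-growth over those early steps approximates the Lyapunov exponent
# (known to be ln 2 ~ 0.69 for r = 4)
early = sep[:20]
lam = sum(math.log(b / a) for a, b in zip(early, early[1:])) / (len(early) - 1)
print(f"max separation {max(sep):.2f}, estimated exponent {lam:.2f}")
```

An initial difference of one part in ten billion grows to order one within a few dozen iterations, which is exactly the discriminative amplification of small input differences described above, and also why such systems are unstable unless controlled.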
With neutral dynamics, differences in inputs produce commensurate differences in responses. Not surprisingly, there are models and experiments that suggest this type of dynamics is used too (Beggs JM and D Plenz, 2003; Bertschinger N and T Natschlager, 2004; Haldeman C and JM Beggs, 2005; Latham PE and S Nirenberg, 2004; Maass W et al., 2002). This dynamics supports computations that favor efficient information transmission, since a one-to-one mapping between stimuli and responses is maintained. It may also be optimal for information storage (Beggs JM and D Plenz, 2004; Haldeman C and JM Beggs, 2005). Several researchers have pointed out that neutral dynamics, "at the edge of chaos," may also be best for performing the widest variety of computations because it combines some of the variety of chaos with some of the stability of attractive systems (Bertschinger N and T Natschlager, 2004; Beggs JM, 2007). It is argued that useful computations require both nonlinear transformations and stable representations of information. Perhaps neocortex, which is essential for higher-level computations, has largely neutral dynamics (Maass W et al., 2002; Natschlager T and W Maass, 2005).
To advance research in this area it will be necessary to form a tighter link between models and experiments. Many of the ideas about how connectivity influences dynamics described above have not yet been tested in living neural networks. Since nature often defies our expectations, it is essential that we develop better ways of interrogating networks of neurons. With advances in technology in the next ten years (Frechette ES et al., 2005), it may be possible to stimulate and record from thousands of neurons for periods of weeks at a time. The huge data sets that are likely to be produced will hopefully allow us to map the state space of living neural networks more closely.
It will also be important to investigate how different network topologies (e.g., random, small-world, scale-free) explicitly influence dynamics. The simulations described above treated all nodes in the network equivalently, but this is certainly a simplification. What happens when some nodes have different branching parameters and transmission probabilities than others? What if some nodes have more connections than others? These issues are only now beginning to be explored (Fox JJ and CC Hill, 2001), as the network topology of the brain at the local network level (Netoff TI et al., 2004; Song S et al., 2005) and at the large scale level (Achard S et al., 2006; Eguiluz VM et al., 2005; Sporns O et al., 2005; Sporns O and JD Zwi, 2004; Stam CJ et al., 2006) is still not well known. The connectivity patterns, and therefore the dynamics, at these different levels may not necessarily be the same (Breakspear M and CJ Stam, 2005; Jirsa VK, 2004).
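As one concrete way to explore such questions, the topologies named above can be generated and compared directly. The sketch below builds Watts-Strogatz-style small-world graphs (a standard construction, not one used in this chapter; the sizes and rewiring probabilities are illustrative) and measures how the average path length drops as a ring lattice is rewired toward a random graph.

```python
import numpy as np
from collections import deque

def watts_strogatz(n, k, p, rng):
    """Ring lattice with k neighbors per node; each lattice edge rewired with probability p."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k // 2 + 1):
            adj[i].add((i + d) % n)
            adj[(i + d) % n].add(i)
    for i in range(n):
        for j in list(adj[i]):
            if j > i and rng.random() < p:
                new = int(rng.integers(n))
                while new == i or new in adj[i]:
                    new = int(rng.integers(n))
                adj[i].discard(j); adj[j].discard(i)
                adj[i].add(new); adj[new].add(i)
    return adj

def avg_path_length(adj):
    """Mean shortest-path length over reachable pairs, via breadth-first search."""
    total = count = 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        count += len(dist) - 1
    return total / count

rng = np.random.default_rng(3)
for p in (0.0, 0.1, 1.0):
    L = avg_path_length(watts_strogatz(200, 6, p, rng))
    print(f"rewiring p={p}: average path length {L:.2f}")
```

Replacing the uniform branching model's random targets with edges drawn from graphs like these would be one direct way to ask how topology shapes the dynamical regime.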
Another area that deserves much attention is the relationship between dynamics and connectivity: How does brain activity, both acutely and chronically, alter the connectivity of neural networks? While activity-dependent synaptic plasticity has been extensively studied, most of this work has centered on how stimulation at one or a few synapses influences synaptic efficacy. There is a need to expand the focus to explore how activity at the local network level may influence synaptic plasticity. In vivo, transmission at a single synapse is embedded in the context of rich background activity that is very influential (Leger JF et al., 2005). From this perspective, functional connectivity is very dynamic and may be different from the underlying structural connectivity (Sporns O et al., 2000). Since it has been shown that large-scale network connectivity can change from wakefulness to sleep (Massimini M et al., 2005), it seems likely that it would also change during transitions to other brain states, like seizures. Similar changes at the local network level should also be investigated. While it may be difficult to disentangle the contributions of connectivity and dynamics in these situations, their complexity suggests that they will be interesting and fruitful areas for further research.
In the previous sections we have shown how early models of memory storage in local recurrent networks led many to search for attractors in neurophysiological data. While numerous examples of reproducible activity patterns in living neural networks have been found, very few experimental studies have addressed the dynamics of these networks quantitatively. By measuring the Lyapunov exponent in simple network models, it has become clear that network connectivity can profoundly influence dynamics. Experimental work in the future will hopefully begin to quantitatively address the dynamics of local cortical networks, perhaps even revealing how trajectories in cortical columns perform computations that form the building blocks of cognition.
Connectivity and Dynamics in Local Cortical Networks 111
This work was supported by the National Science Foundation and Indiana University.
Abarbanel HD, Rabinovich MI (2001) Neurodynamics: nonlinear dynamics and neurobiology. Curr Opin Neurobiol 11: 423-430
Abarbanel HDI (1996) Analysis of observed chaotic data. New York: Springer
Abeles M, Bergman H, Margalit E, Vaadia E (1993) Spatiotemporal firing patterns in the frontal cortex of behaving monkeys. J Neurophysiol 70: 1629-1638
Achard S, Salvador R, Whitcher B, Suckling J, Bullmore E (2006) A resilient, low-frequency, small-world human brain functional network with highly connected association cortical hubs. J Neurosci 26: 63-72
Aitken PG, Sauer T, Schiff SJ (1995) Looking for chaos in brain slices. J Neurosci Methods 59: 41-48
Amit DJ (1989) Modeling brain function: the world of attractor neural networks. Cambridge; New York: Cambridge University Press
Amit Y, Mascaro M (2001) Attractor networks for shape recognition. Neural Comput 13: 1415-1442
Anderson JA, Silverstein JW, Ritz SA, Jones RS (1977) Distinctive features, categorical perception, and probability learning: some applications of a neural model. Psychol Rev 84: 413-451
Arieli A, Sterkin A, Grinvald A, Aertsen A (1996) Dynamics of ongoing activity: explanation of the large variability in evoked cortical responses. Science 273: 1868-1871
Babloyantz A, Destexhe A (1986) Low-dimensional chaos in an instance of epilepsy. Proc Natl Acad Sci U S A 83: 3513-3517
Bak P (1996) How nature works: the science of self-organized criticality. New York: Copernicus
Baker GL, Gollub JP (1996) Chaotic dynamics: an introduction. Cambridge; New York: Cambridge University Press
Baker SN, Lemon RN (2000) Precise spatiotemporal repeating patterns in monkey primary and supplementary motor areas occur at chance levels. J Neurophysiol 84: 1770-1780
Beggs JM, Plenz D (2003) Neuronal avalanches in neocortical circuits. J Neurosci 23: 11167-11177
Beggs JM, Plenz D (2004) Neuronal avalanches are diverse and precise activity patterns that are stable for many hours in cortical slice cultures. J Neurosci 24: 5216-5229
Beggs JM (2007) The criticality hypothesis: how local cortical networks might optimize information processing. Philos Trans R Soc A (submitted)
Ben-Shaul Y, Drori R, Asher I, Stark E, Nadasdy Z, Abeles M (2004) Neuronal activity in motor cortical areas reflects the sequential context of movement. J Neurophysiol 91: 1748-1762
Bertschinger N, Natschlager T (2004) Real-time computation at the edge of chaos in recurrent neural networks. Neural Comput 16: 1413-1436
Bornholdt S, Rohlf T (2000) Topological evolution of dynamical networks: global criticality from local dynamics. Phys Rev Lett 84: 6114-6117
Breakspear M, Stam CJ (2005) Dynamics of a neural system with a multiscale architecture. Philos Trans R Soc Lond B Biol Sci 360: 1051-1074
Breakspear M, Terry JR, Friston KJ (2003) Modulation of excitatory synaptic coupling facilitates synchronization and complex dynamics in a biophysical model of neuronal dynamics. Network 14: 703-732
Brown EN, Frank LM, Tang D, Quirk MC, Wilson MA (1998) A statistical paradigm for neural spike train decoding applied to position prediction from ensemble firing patterns of rat hippocampal place cells. J Neurosci 18: 7411-7425
Brunel N (2000) Dynamics of networks of randomly connected excitatory and inhibitory spiking neurons. J Physiol Paris 94: 445-463
Canli T, Desmond JE, Zhao Z, Gabrieli JD (2002) Sex differences in the neural basis of emotional memories. Proc Natl Acad Sci U S A 99: 10789-10794
Chi Z, Margoliash D (2001) Temporal precision and temporal drift in brain and behavior of zebra finch song. Neuron 32: 899-910
Cohen MA, Grossberg S (1983) Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. IEEE Trans Syst Man Cybern SMC-13: 815-826
Cossart R, Aronov D, Yuste R (2003) Attractor dynamics of network UP states in the neocortex. Nature 423: 283-288
Crutchfield JP, Farmer JD, Packard NH, Shaw RS (1986) Chaos. Sci Am 255: 46-57
Dave AS, Margoliash D (2000) Song replay during sleep and computational rules for sensorimotor vocal learning. Science 290: 812-816
Deregnaucourt S, Mitra PP, Feher O, Pytte C, Tchernichovski O (2005) How sleep affects the developmental learning of bird song. Nature 433: 710-716
Derrida B, Pomeau Y (1986) Random networks of automata: a simple annealed approximation. Europhys Lett 1: 45-49
Derrida B, Weisbuch G (1986) Evolution of overlaps between configurations in random Boolean networks. J Phys-Paris 47: 1297-1303
Ding M, Yang W, In VV, Ditto WL, Spano ML, Gluckman B (1996) Controlling chaos in high dimensions: theory and experiment. Phys Rev E 53: 4334-4344
Ditto WL, Showalter K (1997) Introduction: control and synchronization of chaos. Chaos 7: 509-511
Eguiluz VM, Chialvo DR, Cecchi GA, Baliki M, Apkarian AV (2005) Scale-free brain functional networks. Phys Rev Lett 94: 018102
Fletcher P, Buchel C, Josephs O, Friston K, Dolan R (1999) Learning-related neuronal responses in prefrontal cortex studied with functional neuroimaging. Cereb Cortex 9: 168-178
Foss J, Longtin A, Mensour B, Milton JG (1996) Multistability and delayed recurrent loops. Phys Rev Lett 76: 708-711
Foss J, Milton J (2000) Multistability in recurrent neural loops arising from delay. J Neurophysiol 84: 975-985
Fox JJ, Hill CC (2001) From topology to dynamics in biochemical networks. Chaos 11: 809-815
Frechette ES, Sher A, Grivich MI, Petrusca D, Litke AM, Chichilnisky EJ (2005) Fidelity of the ensemble code for visual motion in primate retina. J Neurophysiol 94: 119-135
Freeman WJ (1994) Neural networks and chaos. J Theor Biol 171: 13-18
Gutenberg B, Richter CF (1941) Seismicity of the earth. [New York]: The Society
Gutowitz H, Langton C (1995) Mean field theory of the edge of chaos. Advances in Artificial Life 929: 52-64
Hahnloser RH, Kozhevnikov AA, Fee MS (2002) An ultra-sparse code underlies the generation of neural sequences in a songbird. Nature 419: 65-70
Haldeman C, Beggs JM (2005) Critical branching captures activity in living neural networks and maximizes the number of metastable states. Phys Rev Lett 94: 058101
Harris TE (1989) The theory of branching processes. New York: Dover Publications
Hebb DO (1949) The organization of behavior: a neuropsychological theory. New York: Wiley
Hopfield JJ (1982) Neural networks and physical systems with emergent collective computational abilities. Proc Natl Acad Sci U S A 79: 2554-2558
Hopfield JJ (1984) Neurons with graded response have collective computational properties like those of two-state neurons. Proc Natl Acad Sci U S A 81: 3088-3092
Hopfield JJ, Tank DW (1986) Computing with neural circuits: a model. Science 233: 625-633
Huberman BA, Crutchfield JP, Packard NH (1980) Noise phenomena in Josephson junctions. Appl Phys Lett 37: 750-752
Jantzen KJ, Steinberg FL, Kelso JA (2005) Functional MRI reveals the existence of modality and coordination-dependent timing networks. Neuroimage 25: 1031-1042
Jensen HJ (1998) Self-organized criticality: emergent complex behavior in physical and biological systems. Cambridge, UK; New York: Cambridge University Press
Jin DZ (2002) Fast convergence of spike sequences to periodic patterns in recurrent networks. Phys Rev Lett 89: 208102
Jirsa VK (2004) Connectivity and dynamics of neural information processing. Neuroinformatics 2: 183-204
Kantz H, Schreiber T (2004) Nonlinear time series analysis. Cambridge, UK; New York: Cambridge University Press
Kauffman S (1969) Homeostasis and differentiation in random genetic control networks. Nature 224: 177
Kauffman S, Peterson C, Samuelsson B, Troein C (2003) Random Boolean network models and the yeast transcriptional network. Proc Natl Acad Sci U S A 100: 14796-14799
Kauffman SA, Johnsen S (1991) Coevolution to the edge of chaos: coupled fitness landscapes, poised states, and coevolutionary avalanches. J Theor Biol 149: 467-505
Kelso SR, Ganong AH, Brown TH (1986) Hebbian synapses in hippocampus. Proc Natl Acad Sci U S A 83: 5326-5330
Kenet T, Bibitchkov D, Tsodyks M, Grinvald A, Arieli A (2003) Spontaneously emerging cortical representations of visual attributes. Nature 425: 954-956
Kilgore MH, Turner JS, McCormick WD, Swinney H (1981) Periodic and chaotic oscillations in the Belousov-Zhabotinskii reaction. Bull Am Phys Soc 26: 362
Kirkwood A, Bear MF (1994) Hebbian synapses in visual cortex. J Neurosci 14: 1634-1645
Latham PE, Nirenberg S (2004) Computing and stability in cortical networks. Neural Comput 16: 1385-1412
Lee AK, Wilson MA (2002) Memory of sequential experience in the hippocampus during slow wave sleep. Neuron 36: 1183-1194
Lee AK, Wilson MA (2004) A combinatorial method for analyzing sequential firing patterns involving an arbitrary number of neurons based on relative time order. J Neurophysiol 92: 2555-2573
Leger JF, Stern EA, Aertsen A, Heck D (2005) Synaptic integration in rat frontal cortex shaped by network activity. J Neurophysiol 93: 281-293
Lindsey BG, Morris KF, Shannon R, Gerstein GL (1997) Repeated patterns of distributed synchrony in neuronal assemblies. J Neurophysiol 78: 1714-1719
Louie K, Wilson MA (2001) Temporally structured replay of awake hippocampal ensemble activity during rapid eye movement sleep. Neuron 29: 145-156
Maass W, Natschlager T, Markram H (2002) Real-time computing without stable states: a new framework for neural computation based on perturbations. Neural Comput 14: 2531-2560
Malamud BD, Morein G, Turcotte DL (1998) Forest fires: an example of self-organized critical behavior. Science 281: 1840-1842
Mao BQ, Hamzei-Sichani F, Aronov D, Froemke RC, Yuste R (2001) Dynamics of spontaneous activity in neocortical slices. Neuron 32: 883-898
Massimini M, Ferrarelli F, Huber R, Esser SK, Singh H, Tononi G (2005) Breakdown of cortical effective connectivity during sleep. Science 309: 2228-2232
Mechelli A, Price CJ, Friston KJ, Ishai A (2004) Where bottom-up meets top-down: neuronal interactions during perception and imagery. Cereb Cortex 14: 1256-1265
Nadasdy Z, Hirase H, Czurko A, Csicsvari J, Buzsaki G (1999) Replay and time compression of recurring spike sequences in the hippocampus. J Neurosci 19: 9497-9507
Natschlager T, Maass W (2005) Dynamics of information and emergent computation in generic neural microcircuit models. Neural Netw 18: 1301-1308
Netoff TI, Clewley R, Arno S, Keck T, White JA (2004) Epilepsy in small-world networks. J Neurosci 24: 8075-8083
Nicolis G, Prigogine I (1989) Exploring complexity: an introduction. New York: W.H. Freeman
Oram MW, Wiener MC, Lestienne R, Richmond BJ (1999) Stochastic nature of precisely timed spike patterns in visual system neuronal responses. J Neurophysiol 81: 3021-3033
Paczuski M, Maslov S, Bak P (1996) Avalanche dynamics in evolution, growth, and depinning models. Phys Rev E 53: 414-443
Rabinovich M, Volkovskii A, Lecanda P, Huerta R, Abarbanel HDI, Laurent G (2001) Dynamical encoding by networks of competing neuron groups: winnerless competition. Phys Rev Lett 87: 068102
Ramon y Cajal S (1909) Histologie du systeme nerveux de l'homme & des vertebres. Paris: Maloine
Rolls ET (1990) Theoretical and neurophysiological analysis of the functions of the primate hippocampus in memory. Cold Spring Harb Symp Quant Biol 55: 995-1006
Rowe J, Friston K, Frackowiak R, Passingham R (2002) Attention to action: specific modulation of corticocortical interactions in humans. Neuroimage 17: 988-998
Schiff SJ, Jerger K, Duong DH, Chang T, Spano ML, Ditto WL (1994) Controlling chaos in the brain. Nature 370: 615-620
Segev R, Baruchi I, Hulata E, Ben-Jacob E (2004) Hidden neuronal correlations in cultured networks. Phys Rev Lett 92: 118102
Seung HS (1998) Continuous attractors and oculomotor control. Neural Netw 11: 1253-1258
Sinha S, Ditto WL (1999) Computing with distributed chaos. Phys Rev E 60: 363-377
Skaggs WE, McNaughton BL (1992) Computational approaches to hippocampal function. Curr Opin Neurobiol 2: 209-211
Skaggs WE, McNaughton BL, Wilson MA, Barnes CA (1996) Theta phase precession in hippocampal neuronal populations and the compression of temporal sequences. Hippocampus 6: 149-172
Song S, Sjostrom PJ, Reigl M, Nelson S, Chklovskii DB (2005) Highly nonrandom features of synaptic connectivity in local cortical circuits. PLoS Biol 3: e68
Sporns O, Tononi G, Edelman GM (2000) Connectivity and complexity: the relationship between neuroanatomy and brain dynamics. Neural Netw 13: 909-922
Sporns O, Tononi G, Kotter R (2005) The human connectome: a structural description of the human brain. PLoS Comput Biol 1: e42
Sporns O, Zwi JD (2004) The small world of the cerebral cortex. Neuroinformatics 2: 145-162
Spors H, Grinvald A (2002) Spatio-temporal dynamics of odor representations in the mammalian olfactory bulb. Neuron 34: 301-315
Stam CJ, Jones BF, Nolte G, Breakspear M, Scheltens P (2006) Small-world networks and functional connectivity in Alzheimer's disease. Cereb Cortex
Stanley HE (1987) Introduction to phase transitions and critical phenomena. New York: Oxford University Press
Steinbuch K (1961) Die Lernmatrix. Kybernetik 1: 36-45
Steinbuch K, Frank H (1961) [Non-digital learning matrices as perceptors]. Kybernetik 1: 117-124
Stephan KE, Marshall JC, Friston KJ, Rowe JB, Ritzl A, Zilles K, Fink GR (2003) Lateralized cognitive processes and lateralized task control in the human brain. Science 301: 384-386
Strogatz SH (1994) Nonlinear dynamics and chaos: with applications to physics, biology, chemistry, and engineering. Reading, MA: Addison-Wesley
van Vreeswijk C, Sompolinsky H (1996) Chaos in neuronal networks with balanced excitatory and inhibitory activity. Science 274: 1724-1726
Vogels TP, Rajan K, Abbott LF (2005) Neural network dynamics. Annu Rev Neurosci 28: 357-376
Wills TJ, Lever C, Cacucci F, Burgess N, O'Keefe J (2005) Attractor dynamics in the hippocampal representation of the local environment. Science 308: 873-876
Wilson MA (2002) Hippocampal memory formation, plasticity, and the role of sleep. Neurobiol Learn Mem 78: 565-569
Wilson MA, McNaughton BL (1993) Dynamics of the hippocampal ensemble code for space. Science 261: 1055-1058
Wolf A, Swift JB, Swinney HL, Vastano JA (1985) Determining Lyapunov exponents from a time series. Physica D 16: 285-317
Yeomans JM (1992) Statistical mechanics of phase transitions. Oxford; New York: Clarendon Press; Oxford University Press