ETD system

Electronic theses and dissertations repository


Thesis etd-07032016-181443

Thesis type
Master's degree thesis (tesi di laurea specialistica)
Memory retrieval in balanced neural networks with dynamical synapses
Degree program
Supervisor: Prof. Rossi, Paolo
Supervisor: Dr. Mongillo, Gianluigi
Keywords
  • attractor
  • memory
  • neural networks
  • short-term plasticity
  • synaptic dynamics
Defense session start date
Abstract
Neuronal recordings from animals performing memory tasks have revealed a phenomenon known as selective persistent activity: the presentation of stimuli to be remembered increases the level of activity of selective (i.e., stimulus-dependent) neuronal populations, and this elevated activity then persists long after stimulus offset. Such persistent activity is considered a major neuronal correlate of short-term memory.

A time-honored theoretical account of persistent activity is the attractor hypothesis, which originated from the first studies of spin-glass-inspired neural network models [1]. According to this hypothesis, the neurons within the selective populations have strong recurrent excitatory couplings. The resulting positive feedback, together with the nonlinearity of the single-cell response function, allows such populations to have two stable states of activity: one corresponding to spontaneous activity, the other (at higher rate) corresponding to the mnemonic retention of the stimulus.

Since the neuronal response function is typically S-shaped, and the recurrent input is a linear function of the activity, the stable states generically occur outside the dynamic range of the response function, that is, near extremely low or high activity levels. This constitutes a major inconsistency of the model, as experimental data show that the activity level at which cortical neurons operate is much lower than saturation.

It is well established that synapses display activity-dependent modulations of their efficacy, such as short-term depression (STD): whenever a neuron stays active, the intensity of the signals transmitted through its synapses is gradually reduced [2].
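The saturation problem described above can be seen in a minimal rate-model sketch: with a sigmoidal response function and linear recurrent input, self-consistent activity states r = f(w·r + I) generically sit near the extremes of the dynamic range. This is an illustrative toy model, not the thesis's balanced network; the logistic form of f and all parameter values are assumptions chosen for demonstration.

```python
import numpy as np

def f(x, beta=4.0):
    """Sigmoidal single-cell response function (illustrative logistic choice)."""
    return 1.0 / (1.0 + np.exp(-beta * x))

def fixed_points(w, I, beta=4.0, grid=100001):
    """Locate self-consistent rates r = f(w*r + I) via sign changes of g(r) = f - r."""
    r = np.linspace(0.0, 1.0, grid)
    g = f(w * r + I, beta) - r
    crossings = np.where(np.diff(np.sign(g)) != 0)[0]
    return r[crossings]

# With strong recurrent excitation w and a negative bias I, three crossings
# appear: two stable states (near zero activity and near saturation) flanking
# one unstable state, illustrating why the stable states fall outside the
# dynamic range of the response function.
fps = fixed_points(w=3.0, I=-1.4, beta=4.0)
```

Note that the two stable solutions land near 0 and near 1 (full saturation), exactly the inconsistency with cortical data that motivates introducing STD.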
In the thesis we examined the possibility of obtaining bistability far from saturation by making the recurrent excitatory inputs a non-linear function of the activity through STD. The study has been carried out within the framework of balanced networks [3], as this framework captures essential features of cortical network activity while remaining analytically tractable.

The first chapter, after a short summary of the relevant neurophysiological background, reviews the standard balanced network model of binary neurons and its general properties.

The second chapter illustrates a possible extension of balanced networks to the attractor framework: by suitably reinforcing the excitatory couplings between the neurons of certain populations, the system can have multiple stable attractors corresponding to memory states. This allows us to illustrate the issue of the unrealistically high level of activity in memory states.

The remaining chapters contain the original part of the work. In the third chapter, the phenomenological model which mimics STD is introduced. As a first step, a balanced network endowed with STD is considered in the absence of memory-supporting reinforcement of the couplings. A mean-field description has been derived to characterize the stationary states of the network; this has been done by neglecting the time-delayed autocorrelations of the neurons' activities, which have been approximated as a Markov process to obtain a system of equations in closed form. Numerical simulations of the network show that, despite the approximation, the mean-field theory gives an excellent quantitative prediction for the system's order parameters.

The fourth chapter implements the STD synaptic dynamics into the memory model introduced in chapter two and extends the mean-field theory to the scenario with multiple memory states.
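The mechanism by which STD makes the recurrent input a non-linear function of the activity can be sketched with a standard phenomenological depression model of the Tsodyks–Markram type: a fraction U of available synaptic resources x is consumed per transmitted spike and recovers with time constant tau_rec. The steady-state transmitted drive U·x·r then grows sublinearly with the presynaptic rate r and saturates at 1/tau_rec. This is a generic illustration; the parameter values below are assumptions, not those used in the thesis.

```python
import numpy as np

# Phenomenological short-term depression, steady-state form.
# Parameters are illustrative, not the thesis values.
U = 0.5        # fraction of resources consumed per spike
tau_rec = 0.5  # resource recovery time constant (s)

def steady_resources(r):
    """Steady-state available synaptic resources at presynaptic rate r (Hz):
    x* = 1 / (1 + U * tau_rec * r), from balancing consumption and recovery."""
    return 1.0 / (1.0 + U * tau_rec * r)

def transmitted(r):
    """Effective synaptic drive U * x* * r: sublinear in r, saturating at 1/tau_rec."""
    return U * steady_resources(r) * r

# The drive grows much more slowly than the rate itself at high activity,
# which is what can pull the stable memory state away from saturation.
rates = np.array([1.0, 10.0, 100.0])
drive = transmitted(rates)
```

The saturation of the transmitted drive is the key qualitative point: the recurrent excitatory input can no longer grow linearly with the rate, so stable self-consistent states need not sit at the extremes of the response function.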
The theoretical analysis of the model shows that it is indeed possible to produce stable states with biologically plausible levels of activity, far from saturation, and network simulations confirm the result. Moreover, the network operates in a regime in which temporal fluctuations and spatial inhomogeneities of the neuronal activity are generated by the dynamics itself (without the addition of any external noise source), reproducing the experimentally observed statistics of neural activity.