Ticchi, Alessandro (2016) Bayesian Computations in Noisy Spiking Neurons. [Dissertation thesis], Alma Mater Studiorum Università di Bologna, PhD programme in Physics, Cycle 27. DOI 10.6092/unibo/amsdottorato/7267.
Abstract
The world is stochastic and chaotic, and organisms must make decisions with access to only limited information. Brains are therefore continuously required to deal with probability distributions, and experimental evidence indicates that they handle these distributions optimally, or close to optimally, according to the rules of Bayesian probability theory. Yet a complete understanding of how these computations are implemented at the neural level is still missing. We assume that the “computational” goal of neurons is to perform Bayesian inference and to represent the state of the world efficiently. Starting from this assumption, we derive from first principles two distinct models of neural functioning, one for single neurons and one for neural populations, which account for known biophysical and molecular processes of neurons.
The proposed models suggest a novel interpretation of several neural quantities. Action potentials, usually regarded as the paramount form of communication between neurons, are reinterpreted in our single-neuron model as an internal communication channel. Intracellular calcium concentration, in contrast, is interpreted as the most explicit representation of the external world inside the neuron: specifically, we propose that the calcium level represents the log-odds probability ratio of a particular hidden state of the world. Furthermore, we reinterpret synaptic vesicle release as a sampling process that simulates the external world given all the available information. Finally, the neural population dynamics we propose interpret spontaneous neural activity as sampling from the prior world statistics, which enables the system to implement a Markov Chain Monte Carlo algorithm that performs inference by sampling.
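To make the single-neuron interpretation concrete, the following is a minimal sketch (in Python, not the thesis code) of a binary hidden state whose log-odds are accumulated by a "calcium" variable as observations arrive, with vesicle release modelled as a Bernoulli sample from the resulting posterior. The likelihood values and the input stream are hypothetical, chosen only for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Assumed likelihoods P(observation = 1 | hidden state), for illustration only.
p_obs_given_s1, p_obs_given_s0 = 0.8, 0.3
prior_log_odds = 0.0  # log P(s=1)/P(s=0): uniform prior

def log_likelihood_ratio(obs: int) -> float:
    # Evidence carried by a single binary observation (e.g. one synaptic input).
    if obs == 1:
        return np.log(p_obs_given_s1 / p_obs_given_s0)
    return np.log((1 - p_obs_given_s1) / (1 - p_obs_given_s0))

# "Calcium" accumulates the log-odds of the hidden state as evidence arrives.
calcium = prior_log_odds
observations = [1, 1, 0, 1]  # toy input stream
releases = []
for obs in observations:
    calcium += log_likelihood_ratio(obs)
    p_state = 1.0 / (1.0 + np.exp(-calcium))  # posterior P(s=1 | data so far)
    # Vesicle release as a sample drawn from the current posterior belief.
    releases.append(int(rng.random() < p_state))

print(f"final log-odds ('calcium'): {calcium:.2f}, posterior P(s=1): {p_state:.2f}")
print("release samples:", releases)

In this toy setting the release probability tracks the neuron's posterior belief, so a downstream observer receiving the release events effectively receives samples from the inferred world state.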
The proposed models generate several observable predictions, which match experimental results on synaptic vesicle release, short-term synaptic potentiation, ion channel open probabilities, intracellular calcium dynamics and propagation, spike rate adaptation, and neural receptive fields.
Document type
Doctoral thesis
Author
Ticchi, Alessandro
Supervisor
PhD programme
Physics
Doctoral school
Mathematical, Physical and Astronomical Sciences
Cycle
27
Coordinator
Disciplinary sector
Competition sector
Keywords
Spiking Neurons, Calcium, Bayesian Inference, Noise, Adaptation, Brain, Sampling, Markov Chain Monte Carlo, Neural dynamics, Computational Neuroscience, Efficient Coding, Predictive Coding, Neural Population, Synaptic Communication, Theoretical Neuroscience, Learning, Plasticity Rule
URN:NBN
DOI
10.6092/unibo/amsdottorato/7267
Date of defence
6 April 2016
URI