Lorenz, Gabriel Matias (2025). Multivariate information theoretic methods for the analysis of neural network function. [Dissertation thesis], Alma Mater Studiorum Università di Bologna. PhD programme in Data science and computation, cycle 36. DOI 10.48676/unibo/amsdottorato/12022.
Abstract
To understand brain functions, it is necessary to characterize how neural systems encode, process, and transmit information. Information theory provides multivariate analysis tools to address these questions by analyzing activity recordings from real brains or neural network models. These tools are model-independent and can be applied to any recording modality and across different scales. In this thesis, we improved and used information-theoretic tools to study the brain in several ways.
We developed the Multivariate Information in Neuroscience Toolbox (MINT), designed for analyzing neural information. MINT includes tools such as Shannon entropy, mutual information, transfer entropy, and Partial Information Decomposition (PID). It enables researchers to quantify how neural populations encode and transmit behaviorally relevant information across brain regions, enhancing investigations into neural computation. By integrating dimensionality reduction techniques and bias-correction methods, MINT allows precise analysis of high-dimensional neural datasets.
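The core quantities listed above all reduce to entropies of discrete probability distributions. As a minimal illustration of the plug-in approach to such estimates (this is a generic sketch, not the MINT API; function names are hypothetical):

```python
import numpy as np

def entropy(x):
    # Plug-in (maximum-likelihood) Shannon entropy in bits,
    # estimated from the empirical frequencies of a discrete variable.
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def mutual_information(s, r):
    # Plug-in mutual information I(S;R) = H(S) + H(R) - H(S,R),
    # e.g. between a stimulus s and a neural response r across trials.
    sr = np.stack([np.asarray(s), np.asarray(r)], axis=1)
    _, counts = np.unique(sr, axis=0, return_counts=True)
    p = counts / counts.sum()
    h_joint = float(-np.sum(p * np.log2(p)))
    return entropy(s) + entropy(r) - h_joint

# Example: a response that copies a binary stimulus carries 1 bit.
s = np.array([0, 0, 1, 1])
print(mutual_information(s, s))  # 1.0
```

With few trials and many possible responses, this plug-in estimator is biased upward, which is exactly the sampling problem the bias-correction methods in MINT address.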
A significant limitation in computing PID components from neural data is sampling bias, which particularly affects synergy: its bias grows quadratically with the number of possible neural responses, leading to overestimation. To address this, we developed bias-correction methods that improve the accuracy of PID estimates. We applied these methods to recordings from the auditory cortex, posterior parietal cortex, and hippocampus of mice performing cognitive tasks, obtaining accurate estimates of how synergy and redundancy vary across regions.
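One generic way to see how such sampling bias can be estimated and subtracted is shuffle-based correction: destroying the stimulus-response association by permuting trials, so that any residual information is pure bias. This is an illustrative sketch of that general idea, not the specific PID corrections developed in the thesis; all function names are hypothetical:

```python
import numpy as np

def plugin_mi(s, r):
    # Plug-in mutual information in bits from discrete trial data.
    def h(counts):
        p = counts / counts.sum()
        return float(-np.sum(p * np.log2(p)))
    hs = h(np.unique(s, return_counts=True)[1])
    hr = h(np.unique(r, return_counts=True)[1])
    sr = np.stack([np.asarray(s), np.asarray(r)], axis=1)
    hsr = h(np.unique(sr, axis=0, return_counts=True)[1])
    return hs + hr - hsr

def shuffle_corrected_mi(s, r, n_shuffles=200, seed=0):
    # Permuting r across trials breaks the S-R association, so the
    # average information that survives is an estimate of the
    # limited-sampling bias; subtract it from the raw estimate.
    rng = np.random.default_rng(seed)
    bias = np.mean([plugin_mi(s, rng.permutation(r))
                    for _ in range(n_shuffles)])
    return plugin_mi(s, r) - bias
```

For synergy the problem is harder because the bias involves the joint response space of several neurons, which is why dedicated corrections are needed rather than this generic scheme.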
Additionally, we used MINT to analyze simulated spiking neural network models and explore the contributions of different cortical interneuron types to information encoding. Previous models with a single interneuron type showed redundant encoding in the gamma frequency range. In contrast, our extended models showed that distinct gamma frequencies carry synergistic information about sensory inputs, suggesting that interneuron diversity enhances information encoding. Together, our methodological work and network-model findings highlight the potential of information theory for advancing our understanding of neural encoding and information transmission.
Document type: Doctoral thesis
Author: Lorenz, Gabriel Matias
PhD cycle: 36
Keywords: information theory; neural coding; neural information processing; partial information decomposition; neural population coding; noise correlations
DOI: 10.48676/unibo/amsdottorato/12022
Date of defense: 26 March 2025