Reconstruction of recurrent synaptic connectivity of thousands of neurons from simulated spiking activity.

Zaytsev YV, Morrison A, Deger M - J Comput Neurosci (2015)

Bottom Line: We describe a minimal model that is optimized for large networks and an efficient scheme for its parallelized numerical optimization on generic computing clusters. Finally, we successfully reconstruct the connectivity of a hidden synfire chain that is embedded in a random network, which requires clustering of the network connectivity to reveal the synfire groups. Our results demonstrate how synaptic connectivity could potentially be inferred from large-scale parallel spike train recordings.


Affiliation: Simulation Laboratory Neuroscience - Bernstein Facility for Simulation and Database Technology, Institute for Advanced Simulation, Jülich Aachen Research Alliance, Jülich Research Center, Jülich, Germany, yury@zaytsev.net.

ABSTRACT
Dynamics and function of neuronal networks are determined by their synaptic connectivity. Current experimental methods to analyze synaptic network structure on the cellular level, however, cover only small fractions of functional neuronal circuits, typically without a simultaneous record of neuronal spiking activity. Here we present a method for the reconstruction of large recurrent neuronal networks from thousands of parallel spike train recordings. We employ maximum likelihood estimation of a generalized linear model of the spiking activity in continuous time. For this model the point process likelihood is concave, such that a global optimum of the parameters can be obtained by gradient ascent. Previous methods, including those of the same class, did not allow recurrent networks of that order of magnitude to be reconstructed due to prohibitive computational cost and numerical instabilities. We describe a minimal model that is optimized for large networks and an efficient scheme for its parallelized numerical optimization on generic computing clusters. For a simulated balanced random network of 1000 neurons, synaptic connectivity is recovered with a misclassification error rate of less than 1 % under ideal conditions. We show that the error rate remains low in a series of example cases under progressively less ideal conditions. Finally, we successfully reconstruct the connectivity of a hidden synfire chain that is embedded in a random network, which requires clustering of the network connectivity to reveal the synfire groups. Our results demonstrate how synaptic connectivity could potentially be inferred from large-scale parallel spike train recordings.
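The estimation procedure the abstract describes, maximum likelihood for a generalized linear model of spiking whose point-process likelihood is concave and can therefore be driven to its global optimum by gradient ascent, can be illustrated with a minimal sketch. The paper works in continuous time; the toy below uses a discrete-time Poisson GLM instead, and all names (`X`, `w`, `rate`) are illustrative assumptions, not the authors' implementation:

```python
# Toy maximum-likelihood fit of a Poisson GLM of spiking via gradient ascent.
# Discrete-time sketch only; the paper's model is formulated in continuous time.
import numpy as np

rng = np.random.default_rng(0)
T, P = 2000, 5                      # time bins, presynaptic inputs
X = rng.binomial(1, 0.1, (T, P))    # binned presynaptic spike trains
w_true = rng.normal(0, 0.5, P)      # ground-truth coupling weights
rate = np.exp(-2.0 + X @ w_true)    # conditional intensity per bin
y = rng.poisson(rate)               # postsynaptic spike counts

w, b, eta = np.zeros(P), 0.0, 1e-3
for _ in range(5000):
    lam = np.exp(b + X @ w)
    # gradient of the Poisson log-likelihood sum(y*log(lam) - lam)
    w += eta * (X.T @ (y - lam))
    b += eta * np.sum(y - lam)
```

Because the log-likelihood is concave in `(b, w)`, plain gradient ascent with a small enough step converges to the unique global optimum; no restarts are needed.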


Fig. 7: Cross-validation for the ℓ1 regularization coefficient α for fixed values of τ = 10 ms and d = 1.7 ms. Each panel shows Ns = 75 parameter scans for Ne = 60 excitatory and Ni = 15 inhibitory neurons randomly selected from the complete recording of N = 1000 neurons. (a) The rescaled log-likelihood as a function of the ℓ1 regularization coefficient α, computed on the training dataset. (b) The log-likelihood as a function of α, computed on the validation dataset using the parameters estimated from the training dataset. The red star marks the average horizontal location of the peaks of all curves in the plot.




Mentions: The results of the reconstruction for a subset of the recorded neurons with different values of α on the training dataset are shown in the left panel of Fig. 7. The right panel depicts the subsequent evaluation of the log-likelihood function on the validation dataset. Note that, for optimal results, this procedure should generally be performed for all neurons, and an individual regularization coefficient selected for each cell. To save computational resources, we instead performed it only for a subpopulation of neurons and then selected the same value of α = 10 for all cells, slightly lower than the average, to prevent excessive connection pruning in neurons with a small optimal α.
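The scan in Fig. 7 amounts to: fit an ℓ1-penalized model for each candidate α on a training split, score each fit by its log-likelihood on a held-out split, and keep the α at the peak of the validation curve. A hedged sketch of that loop, using a toy discrete-time Poisson GLM fitted by proximal gradient ascent rather than the authors' continuous-time model (the α grid, splits, and all names are illustrative):

```python
# Cross-validation scan over the l1 coefficient alpha for a toy Poisson GLM.
# Illustrative only; not the paper's continuous-time estimator.
import numpy as np

rng = np.random.default_rng(1)
T, P = 4000, 20
X = rng.binomial(1, 0.1, (T, P))       # binned presynaptic spike trains
w_true = np.zeros(P)
w_true[:3] = [0.8, -0.6, 0.5]          # sparse true couplings
y = rng.poisson(np.exp(-2.0 + X @ w_true))
Xtr, ytr = X[:3000], y[:3000]          # training split
Xva, yva = X[3000:], y[3000:]          # validation split

def fit_l1(X, y, alpha, eta=1e-4, iters=3000):
    """Proximal gradient ascent on the l1-penalized Poisson log-likelihood."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(iters):
        lam = np.exp(b + X @ w)
        w += eta * (X.T @ (y - lam))   # gradient step on the log-likelihood
        b += eta * np.sum(y - lam)
        w = np.sign(w) * np.maximum(np.abs(w) - eta * alpha, 0.0)  # soft threshold
    return w, b

def loglik(X, y, w, b):
    lam = np.exp(b + X @ w)
    return np.sum(y * np.log(lam) - lam)

alphas = [0.1, 1.0, 10.0, 100.0, 1000.0]
scores = [loglik(Xva, yva, *fit_l1(Xtr, ytr, a)) for a in alphas]
best_alpha = alphas[int(np.argmax(scores))]
```

Selecting α at the peak of the held-out log-likelihood, per neuron where resources permit, is the criterion that the red star in Fig. 7 summarizes across cells; larger α prunes more connections, which is why an overly large shared value risks excessive pruning.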

