Degree Correlations Optimize Neuronal Network Sensitivity to Sub-Threshold Stimuli.

Schmeltzer C, Kihara AH, Sokolov IM, Rüdiger S - PLoS ONE (2015)

Bottom Line: We devise a method to calculate firing rates from a self-consistent system of equations taking into account the degree distribution and degree correlations in the network. We show that assortative degree correlations strongly improve the sensitivity for weak stimuli and propose that such networks possess an advantage in signal processing. We moreover find that there exists an optimum in assortativity at an intermediate level leading to a maximum in input/output mutual information.


Affiliation: Institut für Physik, Humboldt-Universität zu Berlin, Germany.

ABSTRACT
Information processing in the brain crucially depends on the topology of the neuronal connections. We investigate how the topology influences the response of a population of leaky integrate-and-fire neurons to a stimulus. We devise a method to calculate firing rates from a self-consistent system of equations taking into account the degree distribution and degree correlations in the network. We show that assortative degree correlations strongly improve the sensitivity for weak stimuli and propose that such networks possess an advantage in signal processing. We moreover find that there exists an optimum in assortativity at an intermediate level leading to a maximum in input/output mutual information.



Fig 10 (pone.0121794.g010): Schematic of a link swap that increases assortative in-degree correlations in the network without changing the in- and out-degrees of the nodes.

Mentions: For construction of the model network we employ the configuration model of Newman, Strogatz, and Watts [46], which creates random networks with the desired in- and out-degree distribution. In short, the algorithm works as follows: each neuron of the network is assigned a target in- and out-degree, drawn from the desired degree distribution. The target degrees of a neuron can be regarded as a number of ingoing and outgoing stubs. The algorithm successively connects randomly selected in- and out-stubs until all neurons in the network match their target degrees and no free stubs are left. Self-connections and multiple connections between the same neurons are removed. Then, degree correlations are imposed on the network by a Metropolis algorithm [47], which swaps links according to the in-degrees of the connected neurons. The algorithm randomly selects two links i, j, originating at neurons with in-degrees ki, kj and going into neurons with in-degrees mi, mj (Fig 10). With probability g, the targets are swapped only if the swap increases the desired in-degree correlations of the network; with probability 1 − g, the targets are swapped at random, which tends to reduce existing degree correlations. Thus, the strength of degree correlations can be adjusted by setting a value of g between 0 (uncorrelated) and 1 (maximally correlated).
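
To make the two-stage construction concrete, the following is a minimal Python sketch, not the authors' code. It assumes in- and out-degree sequences are given as lists with equal sums, and it uses an increase in the product of the in-degrees at the two ends of the swapped links as a simple proxy for "increasing assortative in-degree correlations"; the function names and parameters (configuration_model, impose_in_degree_correlations, n_trials) are illustrative.

    # Minimal sketch of the network construction described above (not the authors' code).
    import random
    from collections import Counter

    def configuration_model(in_degrees, out_degrees, rng=random):
        """Stub matching: pair random out-stubs with random in-stubs, then drop
        self-connections and multiple connections. Assumes sum(in_degrees) == sum(out_degrees)."""
        out_stubs = [i for i, k in enumerate(out_degrees) for _ in range(k)]
        in_stubs = [i for i, k in enumerate(in_degrees) for _ in range(k)]
        rng.shuffle(out_stubs)
        rng.shuffle(in_stubs)
        edges = set()
        for src, dst in zip(out_stubs, in_stubs):
            if src != dst:              # remove self-connections
                edges.add((src, dst))   # the set drops multiple connections
        return edges

    def impose_in_degree_correlations(edges, g, n_trials, rng=random):
        """Metropolis-style target swapping: with probability g accept only swaps that
        increase assortative in-degree correlations, with probability 1 - g swap at random."""
        edge_list = list(edges)
        edge_set = set(edge_list)
        # in-degrees are invariant under target swaps, so compute them once
        in_deg = Counter(dst for _, dst in edge_list)
        for _ in range(n_trials):
            a, b = rng.sample(range(len(edge_list)), 2)
            (s1, t1), (s2, t2) = edge_list[a], edge_list[b]
            new1, new2 = (s1, t2), (s2, t1)
            # reject swaps that would create self-connections or duplicate links
            if s1 == t2 or s2 == t1 or new1 in edge_set or new2 in edge_set:
                continue
            if rng.random() < g:
                # accept only if the swap raises the in-degree/in-degree product,
                # a simple proxy (assumption) for increasing assortative correlations
                old = in_deg[s1] * in_deg[t1] + in_deg[s2] * in_deg[t2]
                new = in_deg[s1] * in_deg[t2] + in_deg[s2] * in_deg[t1]
                if new <= old:
                    continue
            # otherwise (probability 1 - g) the targets are swapped at random
            edge_set.difference_update([edge_list[a], edge_list[b]])
            edge_set.update([new1, new2])
            edge_list[a], edge_list[b] = new1, new2
        return edge_set

Note that swapping only the targets of two links leaves every neuron's in- and out-degree unchanged, since each target node still receives exactly one of the two links; this is why the in-degrees used in the acceptance rule can be computed once, before the swap loop.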

