Degree Correlations Optimize Neuronal Network Sensitivity to Sub-Threshold Stimuli.

Schmeltzer C, Kihara AH, Sokolov IM, Rüdiger S - PLoS ONE (2015)

Bottom Line: We devise a method to calculate firing rates from a self-consistent system of equations taking into account the degree distribution and degree correlations in the network. We show that assortative degree correlations strongly improve the sensitivity for weak stimuli and propose that such networks possess an advantage in signal processing. We moreover find that there exists an optimum in assortativity at an intermediate level leading to a maximum in input/output mutual information.


Affiliation: Institut für Physik, Humboldt-Universität zu Berlin, Germany.

ABSTRACT
Information processing in the brain crucially depends on the topology of the neuronal connections. We investigate how the topology influences the response of a population of leaky integrate-and-fire neurons to a stimulus. We devise a method to calculate firing rates from a self-consistent system of equations taking into account the degree distribution and degree correlations in the network. We show that assortative degree correlations strongly improve the sensitivity for weak stimuli and propose that such networks possess an advantage in signal processing. We moreover find that there exists an optimum in assortativity at an intermediate level leading to a maximum in input/output mutual information.
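
The self-consistent firing-rate calculation mentioned in the abstract is not spelled out on this page. Purely as an illustration of the general idea, the sketch below iterates degree-conditional rates to a fixed point under an assumed sigmoidal transfer function and an assumed conditional degree distribution P(k′|k); the function names, the coupling constant, and the example matrix are hypothetical placeholders, not the authors' equations or code.

```python
# Minimal, hypothetical sketch (not the authors' implementation): degree-conditional
# firing rates nu_k are iterated to a self-consistent fixed point. The sigmoidal
# `transfer` function and the conditional degree distribution P(k'|k) stand in for
# the LIF rate function and the degree-correlation structure used in the paper.
import numpy as np

def rate_fixed_point(degrees, P_kprime_given_k, stimulus, coupling=0.1,
                     n_iter=200, tol=1e-10):
    """Iterate nu_k = f(s + J * k * sum_k' P(k'|k) nu_k') until convergence."""
    def transfer(x):                      # placeholder rate function mapping to [0, 1]
        return 1.0 / (1.0 + np.exp(-10.0 * (x - 1.0)))

    nu = np.zeros(len(degrees))           # start from a silent network
    for _ in range(n_iter):
        recurrent = coupling * degrees * (P_kprime_given_k @ nu)
        nu_new = transfer(stimulus + recurrent)
        if np.max(np.abs(nu_new - nu)) < tol:
            return nu_new
        nu = nu_new
    return nu

# Usage: three degree classes with mildly assortative mixing (rows sum to 1).
degrees = np.array([5.0, 10.0, 20.0])
P = np.array([[0.6, 0.3, 0.1],
              [0.3, 0.4, 0.3],
              [0.1, 0.3, 0.6]])
print(rate_fixed_point(degrees, P, stimulus=0.5))
```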




pone.0121794.g007: Information transfer of assortative networks. (Top row) Mutual information I = H − H_noise of the input/output relation is optimized for intermediate assortativity. (Bottom row) Response variability H (stars) and noise entropy H_noise (open symbols). Network output is quantified by the average firing rate of n randomly chosen neurons. (A) Numerical simulations of networks with N = 10^5 neurons and (B) calculations with the population-density approach, where N_kk′ is sampled from the adjacency matrices of the correlated networks.

Mentions: We make the simplifying assumption that all sub-threshold stimuli are equally likely, P(s) = const for 0 < s < 1. The mutual information I = H − H_noise then follows from the average response function (Fig 4) and the associated sampling noise. We found that I is optimized for networks with an intermediate degree of assortativity, p ∼ 0.6 (Fig 7, top row). First, some amount of assortativity increases the sensitivity to weak stimuli, which is related to the fact that the response curves in Fig 5 assume a more linear functional dependence. Second, in extremely assortative networks, the network response approaches an almost constant value for different stimuli and thus does not contain much information about the inputs. Additionally, the increase in noise entropy with increasing assortativity reduces I even further (Fig 7, bottom row), indicating that the network response is too noisy to reliably transmit information about the stimulus.
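
To make the I = H − H_noise bookkeeping concrete, here is a rough sketch, under stated assumptions, of how such an estimate can be formed: stimuli are drawn uniformly on (0, 1), noisy responses are discretized into bins, and the stimulus-conditioned (noise) entropy is subtracted from the total response entropy. The toy response model, bin count, and function names are illustrative choices, not the authors' procedure.

```python
# Hypothetical sketch of an I = H - H_noise estimate (not the authors' code):
# uniform stimuli, binned noisy responses, total entropy minus mean conditional entropy.
import numpy as np

rng = np.random.default_rng(0)

def entropy(counts):
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(response_fn, n_stimuli=50, n_trials=200, n_bins=30):
    stimuli = np.linspace(0.0, 1.0, n_stimuli)             # P(s) = const on (0, 1)
    responses = np.array([[response_fn(s) for _ in range(n_trials)]
                          for s in stimuli])
    bins = np.linspace(responses.min(), responses.max(), n_bins + 1)
    # Total response entropy H from the stimulus-marginalized histogram.
    H = entropy(np.histogram(responses, bins=bins)[0])
    # Noise entropy H_noise: conditional entropy averaged over stimuli.
    H_noise = np.mean([entropy(np.histogram(r, bins=bins)[0]) for r in responses])
    return H - H_noise

# Toy response: saturating rate plus sampling noise from averaging n neurons.
toy_response = lambda s: np.clip(s / (s + 0.3) + rng.normal(0, 0.05), 0, 1)
print(mutual_information(toy_response))
```

With this kind of estimator, a flat response curve (as in extremely assortative networks) drives H toward H_noise and hence I toward zero, which matches the qualitative argument in the paragraph above.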

