Degree Correlations Optimize Neuronal Network Sensitivity to Sub-Threshold Stimuli.

Schmeltzer C, Kihara AH, Sokolov IM, Rüdiger S - PLoS ONE (2015)

Bottom Line: We devise a method to calculate firing rates from a self-consistent system of equations taking into account the degree distribution and degree correlations in the network. We show that assortative degree correlations strongly improve the sensitivity for weak stimuli and propose that such networks possess an advantage in signal processing. We moreover find that there exists an optimum in assortativity at an intermediate level leading to a maximum in input/output mutual information.

View Article: PubMed Central - PubMed

Affiliation: Institut für Physik, Humboldt-Universität zu Berlin, Germany.

ABSTRACT
Information processing in the brain crucially depends on the topology of the neuronal connections. We investigate how the topology influences the response of a population of leaky integrate-and-fire neurons to a stimulus. We devise a method to calculate firing rates from a self-consistent system of equations taking into account the degree distribution and degree correlations in the network. We show that assortative degree correlations strongly improve the sensitivity for weak stimuli and propose that such networks possess an advantage in signal processing. We moreover find that there exists an optimum in assortativity at an intermediate level leading to a maximum in input/output mutual information.
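The self-consistent system of equations is not reproduced on this page. As a hedged illustration of the general idea, the sketch below iterates a mean-field fixed point for firing rates in a degree-heterogeneous, uncorrelated network: a neuron's rate depends on its in-degree k and on the mean rate of its presynaptic neighbours. The power-law exponent matches the paper's notation, but the sigmoidal rate function, synaptic weight J, and external drive are stand-in assumptions (the paper derives rates from leaky integrate-and-fire neurons, not from this toy transfer function).

```python
import numpy as np

# Toy self-consistent firing-rate calculation for a network with a
# power-law degree distribution P(k) ~ k^alpha (uncorrelated case).
# All parameter values below are illustrative assumptions, not the paper's.
degrees = np.arange(1, 51)                      # in-degrees k
alpha = -2.0                                    # power-law exponent
pk = degrees.astype(float) ** alpha
pk /= pk.sum()                                  # degree distribution P(k)

J = 0.1          # synaptic weight (assumed)
mu_ext = 0.2     # weak external drive (assumed, sub-threshold)

def f(mu):
    """Stand-in sigmoidal rate function (Hz); the paper uses the LIF model."""
    return 50.0 / (1.0 + np.exp(-8.0 * (mu - 1.0)))

# In an uncorrelated network, a random neighbour has degree k' with
# probability k' P(k') / <k> (degree-biased sampling).
q = degrees * pk / (degrees * pk).sum()

rates = np.zeros_like(degrees, dtype=float)
for _ in range(500):                            # fixed-point iteration
    nu_in = (q * rates).sum()                   # mean presynaptic rate
    rates = f(mu_ext + J * degrees * nu_in)     # rate of a degree-k neuron

print("population mean rate (Hz):", (pk * rates).sum())
```

With degree correlations, the neighbour-degree distribution q would be replaced by a conditional distribution P(k'|k), coupling the equations across degree classes; the uncorrelated case above is the simplest closure.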

No MeSH data available.


Information transfer of the correlated networks. (Top row) Mutual information I = H − Hnoise of the input/output relation of the networks. (Bottom row) Entropy H (stars) and noise entropy Hnoise (open symbols). (A) The network with large negative exponent α = −2.3 has an optimum in information transfer at a slightly higher value of p than the other networks. Its signal transmission capabilities are also poorer, as indicated by lower values of I resulting from a small entropy H due to low firing rates. (B) The network with intermediate exponent α = −2 has its signal transmission optimized at an intermediate value of assortativity, p ≃ 0.6. (C) The network with small negative exponent α = −1.7 exhibits the most efficient signal transmission. The mutual information peaks at a relatively low value of assortativity, p ≃ 0.4, which means that the uncorrelated network is already quite efficient in signal transmission.

pone.0121794.g009: Information transfer of the correlated networks.

Mentions: In the following we compare the three networks with respect to their input-output mutual information (Fig 9). First, the maximum value of mutual information increases slightly with increasing α (top row in Fig 9), primarily because increasing α increases the mean degree, leading to larger recurrent activity and higher firing rates of the neurons. Increased firing rates correspond to a larger response variability of the networks; e.g., the range of mean responses for α = −2.3 (0–35 Hz) is smaller than for α = −2 (0–50 Hz), Fig 8A and 8B. However, increasing α also increases the variance of the degree distribution and hence the noise of the network response to a signal. The larger response variability and noise are reflected in a larger entropy H and noise entropy Hnoise, respectively (bottom row in Fig 9). Second, increasing α shifts the peak position of the mutual information to lower p. This indicates that networks with a larger mean degree and degree variance require less assortativity to optimize information transfer. We conclude that there is a range of network configurations whose signal transmission is optimized by assortative degree correlations, but the optimal level of assortativity depends on the degree distribution of the network. Finally, the optimum vanishes for degree distributions with extremely small or extremely large mean and variance (data not shown). In the former case the mutual information of assortative networks is greatly reduced by excessive noise, because only very few strongly connected neurons sustain firing for small inputs and the vast majority of neurons do not fire. In the latter case, most of the neurons already sustain firing for low inputs in the uncorrelated network, so that assortative degree correlations do not increase the response variability.
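The decomposition I = H − Hnoise used above can be estimated directly from binned input/output data: H is the entropy of the pooled response distribution, and Hnoise is the conditional response entropy at fixed stimulus, averaged over stimuli. The sketch below demonstrates this on surrogate data (Gaussian responses to five stimulus levels are an assumption for illustration; the paper's responses come from the network simulations).

```python
import numpy as np

rng = np.random.default_rng(0)

# Surrogate data: rows = 5 stimulus levels, columns = trials; entries are
# population firing rates (Hz). Purely illustrative, not the paper's data.
responses = np.clip(
    rng.normal(loc=np.arange(5)[:, None] * 8.0, scale=4.0, size=(5, 2000)),
    0.0, 40.0)
bins = np.linspace(0.0, 40.0, 21)               # 2-Hz response bins

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution p."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Total response entropy H: responses pooled over all stimuli.
counts, _ = np.histogram(responses, bins=bins)
H = entropy(counts / counts.sum())

# Noise entropy Hnoise: entropy at fixed stimulus, averaged over stimuli
# (equal trial counts per stimulus, so a plain mean is the correct average).
H_noise = np.mean([entropy(np.histogram(row, bins=bins)[0] / row.size)
                   for row in responses])

I = H - H_noise                                 # mutual information (bits)
print(f"H = {H:.2f} bits, Hnoise = {H_noise:.2f} bits, I = {I:.2f} bits")
```

Note that with this plug-in estimator I is bounded above by the stimulus entropy (here log2 5 ≈ 2.32 bits), and the bin width trades off bias against resolution.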

