Scalability of Asynchronous Networks Is Limited by One-to-One Mapping between Effective Connectivity and Correlations.

van Albada SJ, Helias M, Diesmann M - PLoS Comput. Biol. (2015)

Bottom Line: The one-to-one correspondence between effective connectivity and the temporal structure of pairwise averaged correlations implies that network scalings should preserve the effective connectivity if pairwise averaged correlations are to be held constant.Changes in effective connectivity can even push a network from a linearly stable to an unstable, oscillatory regime and vice versa.Our results therefore show that the reducibility of asynchronous networks is fundamentally limited.

View Article: PubMed Central - PubMed

Affiliation: Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany.

ABSTRACT
Network models are routinely downscaled compared to nature in terms of numbers of nodes or edges because of a lack of computational resources, often without explicit mention of the limitations this entails. While reliable methods have long existed to adjust parameters such that the first-order statistics of network dynamics are conserved, here we show that limitations already arise if also second-order statistics are to be maintained. The temporal structure of pairwise averaged correlations in the activity of recurrent networks is determined by the effective population-level connectivity. We first show that in general the converse is also true and explicitly mention degenerate cases when this one-to-one relationship does not hold. The one-to-one correspondence between effective connectivity and the temporal structure of pairwise averaged correlations implies that network scalings should preserve the effective connectivity if pairwise averaged correlations are to be held constant. Changes in effective connectivity can even push a network from a linearly stable to an unstable, oscillatory regime and vice versa. On this basis, we derive conditions for the preservation of both mean population-averaged activities and pairwise averaged correlations under a change in numbers of neurons or synapses in the asynchronous regime typical of cortical networks. We find that mean activities and correlation structure can be maintained by an appropriate scaling of the synaptic weights, but only over a range of numbers of synapses that is limited by the variance of external inputs to the network. Our results therefore show that the reducibility of asynchronous networks is fundamentally limited.
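The limitation stated at the end of the abstract — that the admissible range of synapse numbers is bounded by the variance of the external inputs — can be illustrated with a toy calculation. The sketch below assumes, as a simplification, that the total input variance is the sum of an internal part proportional to J²K and an external part, and that J is scaled as 1/K so that the effective connectivity (proportional to JK) is preserved. All numerical values are illustrative assumptions, not values from the paper.

```python
# Toy illustration of the downscaling limit: hold the total input variance
# fixed while scaling J as 1/K (preserving the effective connectivity J*K).
# J0, K0, and var_total are arbitrary reference values for illustration only.
J0, K0 = 0.1, 1000.0
var_total = 100.0                 # fixed total input variance (arbitrary units)

def required_external_variance(K):
    """External variance needed to keep the total fixed when J ~ 1/K."""
    J = J0 * K0 / K               # scaled weight, J proportional to 1/K
    return var_total - J**2 * K   # total minus internal variance J^2 * K

for K in (1000, 500, 200, 100, 50):
    v = required_external_variance(K)
    status = "ok" if v >= 0 else "impossible (negative variance)"
    print(f"K={K:5d}  required external variance = {v:7.1f}  {status}")
```

With these assumed numbers, the internal variance J²K = (J0 K0)²/K grows as the network is downscaled, so the external drive must supply less and less variance; below some K the required external variance turns negative, marking the point where the scaling can no longer preserve both first- and second-order statistics.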


pcbi.1004490.g002: Transforming synaptic strengths J with the square root of the number of incoming synapses per neuron K (the in-degree) upon scaling of network size N changes the correlation structure when the mean and variance of the input current are maintained. A reference network of 10,000 inhibitory leaky integrate-and-fire neurons is scaled up to 50,000 neurons, fixing the connection probability and adjusting the external Poisson drive to keep the mean and variance of the total (external plus internal) inputs fixed. Single-neuron parameters and connection probability are as in Table 2. Delays are 1 ms, the mean and standard deviation of the total inputs are 15 mV and 10 mV, respectively, and the reference network has J = 0.1 mV. Each network is simulated for 50 s. A Onset of oscillations induced by scaling of network size N, visualized by changes in the poles z of the covariance function in the frequency domain. Re(z) determines the frequency of the oscillations and Im(z) their damping, such that -Im(z) > 0 means that small deviations from the fixed-point activity of the network grow with time [cf. Eq (76)]. The transformation J ∝ 1/K preserves the poles, while J ∝ 1/√K induces a Hopf bifurcation, so that the scaled network is outside the linearly stable regime. B The covariance in the network where the coupling strength J is scaled with the in-degree K as J ∝ 1/K matches that in the reference network, whereas large oscillations appear in the network scaled with J ∝ 1/√K. Colors as in A.

Mentions: A few suggestions have been made for adjusting synaptic weights to numbers of synapses. In the balanced random network model, the asynchronous irregular (AI) firing often observed in cortex is explained by a dominance of inhibition, which causes a mean membrane potential below spike threshold, together with fluctuations large enough to trigger spikes [46]. To achieve such an AI state over a large range of network sizes, one choice is to ensure that the input fluctuations remain similar in size, and to adjust the threshold or a DC drive to maintain the mean distance to threshold. As the fluctuations are proportional to J²K for independent inputs, this suggests the scaling J ∝ 1/√K (1) proposed in [46]. Since the mean input to a neuron is proportional to JK, Eq (1) leads, all else being equal, to an increase of the population feedback with √K, changing the correlation structure of the network, as illustrated in Fig 2 for a simple network of inhibitory leaky integrate-and-fire neurons (note that in this example we fix the connection probability). This suggests the alternative [42, 44, 45] J ∝ 1/K, (2) where now the variance of the external drive needs to be adjusted to maintain the total input variance onto the neurons in the network.
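The trade-off between the two scalings can be checked numerically. The sketch below (with arbitrary reference values J0 and K0, which are illustrative assumptions) evaluates how the mean input, proportional to JK, and the variance of the fluctuations, proportional to J²K for independent inputs, transform when the in-degree K is rescaled under each rule.

```python
# Illustrative comparison of the two weight scalings discussed above.
# J0 and K0 are arbitrary reference values, not parameters from the paper.
J0, K0 = 0.1, 1000.0              # reference synaptic weight and in-degree

def stats(J, K):
    """Mean input ~ J*K; variance of fluctuations ~ J^2*K (independent inputs)."""
    return J * K, J**2 * K

for factor in (0.2, 1.0, 5.0):    # downscale, reference, upscale
    K = factor * K0
    J_sqrt = J0 * (K0 / K) ** 0.5  # Eq (1): J proportional to 1/sqrt(K)
    J_lin = J0 * (K0 / K)          # Eq (2): J proportional to 1/K
    m1, v1 = stats(J_sqrt, K)
    m2, v2 = stats(J_lin, K)
    print(f"K={K:7.0f}  1/sqrt(K): mean={m1:7.1f} var={v1:6.2f}   "
          f"1/K: mean={m2:7.1f} var={v2:6.2f}")
```

Under Eq (1) the variance J²K is invariant but the population feedback JK grows with √K, which is what distorts the correlation structure; under Eq (2) the feedback JK is invariant, so the external drive must instead absorb the change in the internal variance J²K.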

