Intrinsic neuronal properties switch the mode of information transmission in networks.

Gjorgjieva J, Mease RA, Moody WJ, Fairhall AL - PLoS Comput. Biol. (2014)

Bottom Line: Depending on the neurons' intrinsic properties, noise plays different roles in modulating neuronal input-output curves, which can dramatically impact network transmission. The developmental change in intrinsic properties supports a transformation of a network's function from the propagation of network-wide information to one in which computations are scaled to local activity. This work underscores the significance of simple changes in conductance parameters in governing how neurons represent and propagate information, and suggests a role for background synaptic noise in switching the mode of information transmission.


Affiliation: Center for Brain Science, Harvard University, Cambridge, Massachusetts, United States of America.

ABSTRACT
Diverse ion channels and their dynamics endow single neurons with complex biophysical properties. These properties determine the heterogeneity of cell types that make up the brain, as constituents of neural circuits tuned to perform highly specific computations. How do biophysical properties of single neurons impact network function? We study a set of biophysical properties that emerge in cortical neurons during the first week of development, eventually allowing these neurons to adaptively scale the gain of their response to the amplitude of the fluctuations they encounter. During the same time period, these same neurons participate in large-scale waves of spontaneously generated electrical activity. We investigate the potential role of experimentally observed changes in intrinsic neuronal properties in determining the ability of cortical networks to propagate waves of activity. We show that such changes can strongly affect the ability of multi-layered feedforward networks to represent and transmit information on multiple timescales. With properties modeled on those observed at early stages of development, neurons are relatively insensitive to rapid fluctuations and tend to fire synchronously in response to wave-like events of large amplitude. Following developmental changes in voltage-dependent conductances, these same neurons become efficient encoders of fast input fluctuations over a few layers, but lose the ability to transmit slower, population-wide input variations across many layers. Depending on the neurons' intrinsic properties, noise plays different roles in modulating neuronal input-output curves, which can dramatically impact network transmission. The developmental change in intrinsic properties supports a transformation of a network's function from the propagation of network-wide information to one in which computations are scaled to local activity. This work underscores the significance of simple changes in conductance parameters in governing how neurons represent and propagate information, and suggests a role for background synaptic noise in switching the mode of information transmission.
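The propagation behavior summarized above can be caricatured at the level of population firing rates: each layer maps its mean input rate through an input-output (transfer) curve and adds independent noise. The sketch below is only a minimal illustration of that idea, not the conductance-based neuron model used in the study; the transfer functions, noise level, and layer count are assumptions chosen to show how a saturating, GS-like curve pulls all mean inputs toward a single rate across layers, whereas a near-linear, NGS-like curve preserves differences in the mean input.

```python
import numpy as np

# Rate-level caricature of mean-input propagation through a feedforward chain.
# NOTE: these transfer functions and parameters are illustrative assumptions,
# not the conductance-based model from the paper.

def gs_like(r):
    """Saturating transfer curve with a single attracting fixed point."""
    return 60.0 / (1.0 + np.exp(-(r - 10.0) / 20.0))

def ngs_like(r):
    """Near-linear transfer curve: approximately preserves the mean input."""
    return np.clip(r, 0.0, 100.0)

def propagate(mean_inputs, transfer, n_layers=10, noise_sd=2.0, seed=None):
    """Propagate a set of mean input rates through n_layers,
    adding independent Gaussian noise at each layer."""
    rng = np.random.default_rng(seed)
    rates = np.array(mean_inputs, dtype=float)
    trajectory = [rates.copy()]
    for _ in range(n_layers):
        rates = transfer(rates) + rng.normal(0.0, noise_sd, size=rates.shape)
        trajectory.append(rates.copy())
    return np.array(trajectory)  # shape: (n_layers + 1, n_inputs)

if __name__ == "__main__":
    inputs = [10.0, 30.0, 50.0, 70.0]  # distinct mean DC-like inputs (Hz)
    gs_traj = propagate(inputs, gs_like, seed=0)
    ngs_traj = propagate(inputs, ngs_like, seed=0)
    # GS-like: different inputs collapse onto nearly the same rate in deep layers.
    # NGS-like: the spread between inputs is largely preserved across layers.
    print("GS-like  final layer:", np.round(gs_traj[-1], 1))
    print("NGS-like final layer:", np.round(ngs_traj[-1], 1))
```

Running the script prints the final-layer rates for four distinct mean inputs; the GS-like chain collapses them to nearly the same value, while the NGS-like chain keeps them separated.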


pcbi-1003962-g007: Mutual information about the mean stimulus transmitted by GS and NGS networks. The mutual information as a function of layer number for A. weakly connected GS ( pS/µm² and pS/µm²), B. strongly connected NGS ( pS/µm² and pS/µm²), and C. strongly connected GS networks ( pS/µm² and pS/µm²), as shown in Figure 6, for different noise levels indicated by the shade of gray.

Mentions: Given the predicted signal propagation dynamics, we now directly compute the mutual information between the mean DC input injected into layer 1 and the population firing rates at a given layer for each magnitude of the independent noise (Figure 7). This measures how distinguishable the network firing rate outputs at each layer are for different initial mean inputs. The convergence of population firing rates across layers to a single value in the GS networks leads to a drop in information towards zero for both the weakly (Figure 6A,B) and strongly connected GS networks (Figure 6E,F) as a function of layer number and for a wide range of network noise (Figure 7A,C). NGS networks can transmit a range of mean DC inputs without distortion (Figure 6C,D); thus, the information between the input DC and the population firing rate remains relatively constant in subsequent layers (Figure 7B). The information slightly increases in deeper layers due to the emergence of synchronization, which locks the network output into a specific distribution of population firing rates. As the noise amplitude increases, the selected input-output curve becomes tangent to the linear input-output relationship over a larger range of input firing rates (Figure 6C,D); hence, a larger range of inputs is stably transmitted across network layers. Counterintuitively, this suggests that increasing noise in the NGS networks can serve to increase the information such networks carry about a distribution of mean inputs.
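For concreteness, the quantity computed here, the mutual information between a discrete set of mean DC inputs and the population firing rate at a given layer, can be estimated with a simple plug-in (joint-histogram) estimator. The snippet below is a minimal sketch of that standard estimator under assumed conditions (equally likely mean inputs, simulated trial-by-trial rates, an arbitrary number of rate bins); it is not the authors' analysis code, and the variable names and binning choices are illustrative.

```python
import numpy as np

def mutual_information(mean_inputs, layer_rates, n_bins=30):
    """Plug-in estimate of I(mean input; population rate) in bits.

    mean_inputs : 1-D array of the mean DC input used on each trial.
    layer_rates : 1-D array of the population firing rate observed at the
                  layer of interest on the corresponding trial.
    Assumes each distinct mean input is presented equally often.
    """
    mean_inputs = np.asarray(mean_inputs)
    layer_rates = np.asarray(layer_rates)
    inputs = np.unique(mean_inputs)
    edges = np.histogram_bin_edges(layer_rates, bins=n_bins)

    # Joint distribution P(input, rate bin) from counts.
    joint = np.zeros((inputs.size, n_bins))
    for i, s in enumerate(inputs):
        counts, _ = np.histogram(layer_rates[mean_inputs == s], bins=edges)
        joint[i] = counts
    joint /= joint.sum()

    p_s = joint.sum(axis=1, keepdims=True)   # marginal over inputs
    p_r = joint.sum(axis=0, keepdims=True)   # marginal over rate bins
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (p_s @ p_r)[nz])))

if __name__ == "__main__":
    # Toy example: 4 mean inputs, 500 trials each, noisy rate responses.
    rng = np.random.default_rng(1)
    stim = np.repeat([10.0, 30.0, 50.0, 70.0], 500)
    rates = stim + rng.normal(0.0, 5.0, size=stim.size)        # rate tracks input
    collapsed = 54.0 + rng.normal(0.0, 5.0, size=stim.size)    # GS-like collapse
    print("informative layer:", round(mutual_information(stim, rates), 2), "bits")
    print("collapsed layer:  ", round(mutual_information(stim, collapsed), 2), "bits")
```

In the toy example, a layer whose rate still tracks the mean input yields close to the maximal log2(4) = 2 bits, while a layer whose rates have collapsed to a single value, as in the deep layers of the GS networks, yields close to 0 bits.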

