Estimating Neuronal Information: Logarithmic Binning of Neuronal Inter-Spike Intervals.

Dorval AD - Entropy (Basel) (2011)

Bottom Line: Thus, discretizing the logarithm of inter-spike intervals, as compared to the inter-spike intervals themselves, yields histograms that enable more accurate entropy and information estimates for fewer bins and less data. Additionally, as distribution parameters vary, the entropy and information calculated from the logarithm of the inter-spike intervals are substantially better behaved, e.g., entropy is independent of mean rate, and information is equally affected by rate gains and divisions. Thus, when compiling neuronal data for subsequent information analysis, the logarithm of the inter-spike intervals is preferred, over the untransformed inter-spike intervals, because it yields better information estimates and is likely more similar to the construction used by nature herself.

View Article: PubMed Central - PubMed

Affiliation: Department of Bioengineering and the Brain Institute, University of Utah, Salt Lake City, UT 84108, USA.

ABSTRACT
Neurons communicate via the relative timing of all-or-none biophysical signals called spikes. For statistical analysis, the time between spikes can be accumulated into inter-spike interval histograms. Information theoretic measures have been estimated from these histograms to assess how information varies across organisms, neural systems, and disease conditions. Because neurons are computational units that, to the extent they process time, work not by discrete clock ticks but by the exponential decays of numerous intrinsic variables, we propose that neuronal information measures scale more naturally with the logarithm of time. For the types of inter-spike interval distributions that best describe neuronal activity, the logarithm of time enables fewer bins to capture the salient features of the distributions. Thus, discretizing the logarithm of inter-spike intervals, as compared to the inter-spike intervals themselves, yields histograms that enable more accurate entropy and information estimates for fewer bins and less data. Additionally, as distribution parameters vary, the entropy and information calculated from the logarithm of the inter-spike intervals are substantially better behaved, e.g., entropy is independent of mean rate, and information is equally affected by rate gains and divisions. Thus, when compiling neuronal data for subsequent information analysis, the logarithm of the inter-spike intervals is preferred, over the untransformed inter-spike intervals, because it yields better information estimates and is likely more similar to the construction used by nature herself.
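The binning comparison the abstract describes can be made concrete with a short numerical sketch: build one histogram on the raw intervals with linearly spaced edges and one with logarithmically spaced edges (equivalent to binning the logarithm of the ISIs), then compare plug-in entropy estimates. This is an illustrative sketch only; the gamma parameterization, sample size, and 100-bin count are assumptions, not values taken from the paper.

```python
# A minimal sketch, assuming gamma-distributed ISIs and a plug-in (maximum-
# likelihood) entropy estimator; the shape/scale parameters, sample size, and
# bin count are illustrative assumptions, not the paper's values.
import numpy as np

rng = np.random.default_rng(0)

def plugin_entropy_bits(samples, edges):
    """Plug-in entropy, in bits, of the distribution obtained by binning the samples."""
    counts, _ = np.histogram(samples, bins=edges)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Synthetic inter-spike intervals (milliseconds) with a 25 ms mean.
isi = rng.gamma(shape=2.0, scale=12.5, size=5000)

n_bins = 100
lin_edges = np.linspace(isi.min(), isi.max(), n_bins + 1)                      # linear bins
log_edges = np.logspace(np.log10(isi.min()), np.log10(isi.max()), n_bins + 1)  # logarithmic bins

print("linear-bin entropy (bits):", plugin_entropy_bits(isi, lin_edges))
print("log-bin entropy (bits):   ", plugin_entropy_bits(isi, log_edges))
```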

No MeSH data available.




Figure 4: Entropy (top) and information (bottom) calculations as a function of ISI mean for the linear (grey-black) and logarithmic (red-brown) PDFs discretized into 100 bins each. Circle markers denote parameterizations (a) and (b) from Table 3. (a) Entropies of the logarithmic calculations are independent of mean ISI; (b) information about an input selecting either the 25 ms mean parameterization (a) or another distribution with 1/10th to 10 times the mean ISI. Linear PDFs yield inferior information for short ISIs.
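The information panel can be read as a mutual-information calculation between which distribution generated the intervals and the binned ISI value. Below is a hedged sketch of that idea for two equiprobable inputs; the gamma parameterizations, bin edges, and sample sizes are assumptions used to illustrate the calculation, not the paper's exact procedure.

```python
# Mutual information, in bits, between a two-valued input (which ISI
# distribution fired) and the log-binned ISI. All parameters here are
# illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(2)

def binned_pmf(samples, edges):
    counts, _ = np.histogram(samples, bins=edges)
    return counts / counts.sum()

def entropy_bits(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def information_bits(samples_a, samples_b, edges):
    """I(input; binned ISI) for two equiprobable inputs a and b."""
    pa, pb = binned_pmf(samples_a, edges), binned_pmf(samples_b, edges)
    mix = 0.5 * (pa + pb)  # unconditional ISI distribution over both inputs
    return entropy_bits(mix) - 0.5 * entropy_bits(pa) - 0.5 * entropy_bits(pb)

edges = np.logspace(np.log10(0.1), np.log10(2500.0), 101)    # 100 logarithmic bins (assumed range)
ref   = rng.gamma(shape=2.0, scale=12.5, size=20000)          # 25 ms mean reference distribution
other = rng.gamma(shape=2.0, scale=125.0, size=20000)         # 10x the mean ISI
print("information (bits):", information_bits(ref, other, edges))
```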

Mentions: To begin, we assessed how maintaining a constant CV while spanning the ISI mean across two orders of magnitude affected entropy and information measures. To generate experimental parameterizations, the CVs of all four distributions were held at their original values (a), while the means were varied from 2.5 ms to 250 ms; recall that the original distributions had means of 25 ms. To achieve this range of means, the scale parameters varied as follows: power law t0 ranged from 1.5 to 150 ms; exponential t1 ranged from 1 to 100 ms, where λ equaled 2/3 t1; gamma θ ranged from 5/8 to 500/8 ms; and periodic log-normal μ ranged from 1 to 100 ms. Entropy was calculated from both the linear and logarithmic PDFs for all values (Figure 4, top). Sixteen circle markers identify parameterizations (a) and (b) from Table 3 for both discretizations of all four distributions. (Upon close inspection, the markers in Figure 4 match the values reported in Table 3, save for the entropies from the linear PDFs of the power law distribution; Table 3 was constructed with bins starting at 0.1 ms, whereas Figures 3–5 were averaged from dithered bins. The alignment of t0 with the bin edges has a profound effect on power law entropy.) While entropies of the linear PDFs increased approximately linearly with the logarithm of the ISI mean, entropies of the logarithmic PDFs were independent of mean, suggesting time-warp invariance.
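A rough numerical check of the sweep described above, restricted to the gamma case: the shape k fixes the CV (CV = 1/sqrt(k)), so sweeping the scale θ sweeps the mean from 2.5 to 250 ms at constant CV. The fixed bin ranges, sample size, and shape value below are assumptions of this sketch, and it uses fixed rather than dithered bin edges.

```python
# Constant-CV mean sweep for a gamma ISI distribution: compare entropies from
# fixed linear versus logarithmic binning. Bin ranges, sample size, and the
# gamma shape are assumptions of this sketch (the paper dithers its bin edges).
import numpy as np

rng = np.random.default_rng(1)

def plugin_entropy_bits(samples, edges):
    counts, _ = np.histogram(samples, bins=edges)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

k = 2.0                                                # gamma shape; CV = 1/sqrt(k) held constant
means = np.logspace(np.log10(2.5), np.log10(250), 9)   # ISI means in ms, two orders of magnitude
n_bins = 100
lin_edges = np.linspace(0.1, 2500.0, n_bins + 1)
log_edges = np.logspace(np.log10(0.1), np.log10(2500.0), n_bins + 1)

for mean in means:
    isi = rng.gamma(shape=k, scale=mean / k, size=20000)   # scale θ = mean / k sets the mean
    h_lin = plugin_entropy_bits(isi, lin_edges)
    h_log = plugin_entropy_bits(isi, log_edges)
    print(f"mean {mean:7.1f} ms   linear H = {h_lin:5.2f} bits   log H = {h_log:5.2f} bits")
```

In this sketch the linear-bin entropy grows with the mean while the log-bin entropy stays roughly flat, mirroring the mean-independence reported for the logarithmic PDFs in Figure 4.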

