Estimating Neuronal Information: Logarithmic Binning of Neuronal Inter-Spike Intervals.

Dorval AD - Entropy (Basel) (2011)

Bottom Line: Thus, discretizing the logarithm of inter-spike intervals, as compared to the inter-spike intervals themselves, yields histograms that enable more accurate entropy and information estimates for fewer bins and less data. Additionally, as distribution parameters vary, the entropy and information calculated from the logarithm of the inter-spike intervals are substantially better behaved, e.g., entropy is independent of mean rate, and information is equally affected by rate gains and divisions. Thus, when compiling neuronal data for subsequent information analysis, the logarithm of the inter-spike intervals is preferred, over the untransformed inter-spike intervals, because it yields better information estimates and is likely more similar to the construction used by nature herself.
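
A compact justification for the rate claims above (standard probability, not spelled out in this summary): a change in mean rate scales every interval by some gain c, which acts on the logarithm as a pure translation, and entropy is invariant to translation.

```latex
% With X = \log T, an interval gain c is a pure shift of the log-ISI variable:
\[
T \to cT \;\Longrightarrow\; X \to X + \log c,
\qquad
H[X + \log c] = H[X].
\]
```

Hence the entropy of the log-ISI distribution does not depend on the mean rate, and a rate gain c and a rate division 1/c translate the distribution by -log c and +log c respectively, affecting information symmetrically.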


Affiliation: Department of Bioengineering and the Brain Institute, University of Utah, Salt Lake City, UT 84108, USA.

ABSTRACT
Neurons communicate via the relative timing of all-or-none biophysical signals called spikes. For statistical analysis, the time between spikes can be accumulated into inter-spike interval histograms. Information theoretic measures have been estimated from these histograms to assess how information varies across organisms, neural systems, and disease conditions. Because neurons are computational units that, to the extent they process time, work not by discrete clock ticks but by the exponential decays of numerous intrinsic variables, we propose that neuronal information measures scale more naturally with the logarithm of time. For the types of inter-spike interval distributions that best describe neuronal activity, the logarithm of time enables fewer bins to capture the salient features of the distributions. Thus, discretizing the logarithm of inter-spike intervals, as compared to the inter-spike intervals themselves, yields histograms that enable more accurate entropy and information estimates for fewer bins and less data. Additionally, as distribution parameters vary, the entropy and information calculated from the logarithm of the inter-spike intervals are substantially better behaved, e.g., entropy is independent of mean rate, and information is equally affected by rate gains and divisions. Thus, when compiling neuronal data for subsequent information analysis, the logarithm of the inter-spike intervals is preferred, over the untransformed inter-spike intervals, because it yields better information estimates and is likely more similar to the construction used by nature herself.
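
To make the binning comparison concrete, here is a minimal sketch (an assumption for illustration, not the paper's code): the two discretizations differ only in whether the bin edges are evenly spaced in t or in log t, and a plug-in entropy can be read off either histogram. The gamma ISI model, parameter values, and the helper name histogram_entropy are illustrative assumptions.

```python
# Minimal sketch (assumptions: gamma-distributed ISIs, 100 bins, plug-in
# entropy estimator). Compares entropy estimates from linearly vs.
# logarithmically binned inter-spike intervals (ISIs).
import numpy as np

rng = np.random.default_rng(0)
isis = rng.gamma(shape=2.0, scale=0.05, size=5000)  # ISIs in seconds

def histogram_entropy(samples, edges):
    """Plug-in (maximum-likelihood) entropy of a binned sample, in bits."""
    counts, _ = np.histogram(samples, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

n_bins = 100
lin_edges = np.linspace(isis.min(), isis.max(), n_bins + 1)
log_edges = np.logspace(np.log10(isis.min()), np.log10(isis.max()), n_bins + 1)

# Binning raw ISIs with log-spaced edges is equivalent to binning log(ISI)
# with evenly spaced edges.
print(f"linear-binned entropy: {histogram_entropy(isis, lin_edges):.2f} bits")
print(f"log-binned entropy:    {histogram_entropy(isis, log_edges):.2f} bits")
```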


Figure 2: The linear (left) and logarithmic (right) binned PDFs for four parameterizations of each distribution, identified by distribution name and a letter (a–d) corresponding to the identifiers provided in Table 3. Note pure shifts in the logarithmic PDFs of a:b and c:d.

Mentions: As before, the linear and logarithmic PDFs were divided into 100 bins (Figure 2). Note that the distributions with equivalent CVs [i.e., (a):(b) and (c):(d)] have logarithmic PDFs with identical shapes, shifted along the abscissa. The CV quantifies the width of the logarithmic PDF exactly as the standard deviation quantifies the width of the linear PDF. If the distribution variabilities were independent of their means, the linear PDFs would shift with standard deviation just as the logarithmic PDFs shift with CV. But distributions with equal standard deviations [i.e., (a):(d)] do not have identically shaped linear PDFs.
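
The pure-shift observation is easy to verify numerically. Below is an illustrative check (not from the paper; the gamma parameters and bin layout are assumptions): scaling every interval by a gain leaves the CV, and therefore the shape of the log-binned PDF, unchanged, and translates the PDF by the logarithm of the gain, bin for bin.

```python
# Illustrative check of the pure-shift property seen in Figure 2 (a:b, c:d).
# Assumptions: gamma ISIs, gain of 10, 150 bins of width 0.04 on log10(ISI).
import numpy as np

rng = np.random.default_rng(1)
isis = rng.gamma(shape=2.0, scale=0.02, size=200_000)  # CV = 1/sqrt(2)
gain = 10.0  # scaling intervals changes the mean but not the CV

edges = np.linspace(-4.0, 2.0, 151)  # 150 bins of width 0.04 on log10(ISI)
pdf_a, _ = np.histogram(np.log10(isis), bins=edges, density=True)
pdf_b, _ = np.histogram(np.log10(gain * isis), bins=edges, density=True)

# The scaled PDF is the original translated by log10(gain) = 1.0, i.e., 25 bins.
shift = int(round(np.log10(gain) / 0.04))
print(np.allclose(pdf_a[:-shift], pdf_b[shift:], atol=1e-3))  # True
```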

