Measuring information-transfer delays.

Wibral M, Pampu N, Priesemann V, Siebenhühner F, Seiwert H, Lindner M, Lizier JT, Vicente R - PLoS ONE (2013)

Bottom Line: In complex networks such as gene networks, traffic systems or brain circuits it is important to understand how long it takes for the different parts of the network to effectively influence one another. We also show the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops. While evaluated on neuroscience data, we expect the estimator to be useful in other fields dealing with network dynamics.


Affiliation: MEG Unit, Brain Imaging Center, Goethe University, Frankfurt, Germany. wibral@em.uni-frankfurt.de

ABSTRACT
In complex networks such as gene networks, traffic systems or brain circuits it is important to understand how long it takes for the different parts of the network to effectively influence one another. In the brain, for example, axonal delays between brain areas can amount to several tens of milliseconds, adding an intrinsic component to any timing-based processing of information. Inferring neural interaction delays is thus needed to interpret the information transfer revealed by any analysis of directed interactions across brain structures. However, a robust estimation of interaction delays from neural activity faces several challenges if modeling assumptions on interaction mechanisms are wrong or cannot be made. Here, we propose a robust estimator for neuronal interaction delays rooted in an information-theoretic framework, which allows a model-free exploration of interactions. In particular, we extend transfer entropy to account for delayed source-target interactions, while crucially retaining the conditioning on the embedded target state at the immediately previous time step. We prove that this particular extension is indeed guaranteed to identify interaction delays between two coupled systems and is the only relevant option in keeping with Wiener's principle of causality. We demonstrate the performance of our approach in detecting interaction delays on finite data by numerical simulations of stochastic and deterministic processes, as well as on local field potential recordings. We also show the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops. While evaluated on neuroscience data, we expect the estimator to be useful in other fields dealing with network dynamics.
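
The estimator described here amounts to scanning an assumed delay u in a delay-aware transfer entropy, TE(X -> Y, u) = I(Y_t ; X_{t-u} | target history up to t-1), while always conditioning on the target's own immediately preceding state, and reading off the interaction delay as the u that maximises this quantity. The following is a minimal plug-in sketch for discrete-valued data, not the authors' implementation (their analyses of continuous recordings rely on more sophisticated estimators); the function names and the simple length-k target history are our own simplifications.

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y, u, k=1):
    """Plug-in estimate of a delay-aware transfer entropy,
    TE(X -> Y, u) = I(Y_t ; X_{t-u} | last k samples of Y),
    for discrete-valued sequences x, y and an assumed delay u >= 1.
    The conditioning is always on the target history ending at t-1,
    whatever the assumed source-target delay u."""
    joint, cond, marg, hist = Counter(), Counter(), Counter(), Counter()
    n = 0
    for t in range(max(u, k), len(y)):
        h = tuple(y[t - k:t])            # embedded target state before t
        joint[(y[t], x[t - u], h)] += 1  # (y_t, x_{t-u}, history)
        cond[(x[t - u], h)] += 1
        marg[(y[t], h)] += 1
        hist[h] += 1
        n += 1
    # TE = sum p(y_t, x_{t-u}, h) * log[ p(y_t | x_{t-u}, h) / p(y_t | h) ]
    return sum((c / n) * log2(c * hist[h] / (cond[(xu, h)] * marg[(yt, h)]))
               for (yt, xu, h), c in joint.items())

def estimate_delay(x, y, max_u, k=1):
    """Reconstructed interaction delay: the assumed delay u that
    maximises TE(X -> Y, u)."""
    return max(range(1, max_u + 1), key=lambda u: transfer_entropy(x, y, u, k))
```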



pone-0055809-g003: Test case (Ia), comparison of MIT and TE. Analytic and empirical measurements of (a) transfer entropy and (b) momentary information transfer as a function of the memory noise parameter, for the discrete-valued process with short-term source memory and a true interaction delay of 1. Each measure is plotted for assumed delays of 1 (red) and 2 (green). The correct causal interaction delay corresponds to an assumed delay of 1, so we expect an appropriate measure to always return a higher value for delay 1 than for delay 2, i.e. the red curve should always lie above the green curve. Nevertheless, there is potential for 2 to be identified erroneously as the delay, due to the presence of memory in the source, and MIT indeed finds this result for a range of the memory noise parameter (below 0.1).

Mentions: Figure 3 shows the results of measuring TE and MIT, for assumed delays of 1 and 2, as a function of the source noise parameter. We see that, in line with our earlier proof regarding this situation of unidirectional coupling, TE consistently identifies the correct delay of 1, since its value at delay 1 exceeds its value at delay 2 for all noise levels. On the other hand, for a significant range of the noise parameter, MIT is deceived by the source memory into incorrectly identifying 2 as the relevant delay.
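
As a qualitative illustration of this comparison (not the paper's exact process definition for test case (Ia)), one can drive a binary target by a binary source with short-term memory at a true delay of 1 and scan the assumed delay, reusing the transfer_entropy and estimate_delay helpers from the sketch above; the parameter names and noise levels here are our own.

```python
import random

# Hypothetical stand-in for test case (Ia); the paper defines the exact
# process in its Methods. Here the binary source X repeats its previous
# value with probability 1 - gamma (gamma standing in for the memory
# noise parameter) and drives Y at the true delay 1, with 10% bit flips.
random.seed(1)
gamma, delay, n = 0.3, 1, 50000
x = [random.randint(0, 1)]
for _ in range(n - 1):
    x.append(x[-1] if random.random() > gamma else random.randint(0, 1))
y = [0] * delay + [b if random.random() > 0.1 else 1 - b
                   for b in x[:n - delay]]

for u in (1, 2):
    print(f"TE(u={u}) = {transfer_entropy(x, y, u):.4f} bits")
print("estimated delay:", estimate_delay(x, y, max_u=3))
# Expected: TE(u=1) > TE(u=2), hence an estimated delay of 1, even though
# the source memory makes x_{t-2} informative about y_t as well.
```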

