Measuring information-transfer delays.

Wibral M, Pampu N, Priesemann V, Siebenhühner F, Seiwert H, Lindner M, Lizier JT, Vicente R - PLoS ONE (2013)

Bottom Line: In complex networks such as gene networks, traffic systems or brain circuits it is important to understand how long it takes for the different parts of the network to effectively influence one another. We also show the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops. While evaluated on neuroscience data, we expect the estimator to be useful in other fields dealing with network dynamics.

View Article: PubMed Central - PubMed

Affiliation: MEG Unit, Brain Imaging Center, Goethe University, Frankfurt, Germany. wibral@em.uni-frankfurt.de

ABSTRACT
In complex networks such as gene networks, traffic systems or brain circuits it is important to understand how long it takes for the different parts of the network to effectively influence one another. In the brain, for example, axonal delays between brain areas can amount to several tens of milliseconds, adding an intrinsic component to any timing-based processing of information. Inferring neural interaction delays is thus needed to interpret the information transfer revealed by any analysis of directed interactions across brain structures. However, a robust estimation of interaction delays from neural activity faces several challenges if modeling assumptions on interaction mechanisms are wrong or cannot be made. Here, we propose a robust estimator for neuronal interaction delays rooted in an information-theoretic framework, which allows a model-free exploration of interactions. In particular, we extend transfer entropy to account for delayed source-target interactions, while crucially retaining the conditioning on the embedded target state at the immediately previous time step. We prove that this particular extension is indeed guaranteed to identify interaction delays between two coupled systems and is the only relevant option in keeping with Wiener's principle of causality. We demonstrate the performance of our approach in detecting interaction delays on finite data by numerical simulations of stochastic and deterministic processes, as well as on local field potential recordings. We also show the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops. While evaluated on neuroscience data, we expect the estimator to be useful in other fields dealing with network dynamics.
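
The quantity described in the abstract, a delay-dependent transfer entropy that conditions on the embedded past of the target, can be written schematically as follows. The notation here is reconstructed for illustration from the abstract alone; embedding parameters are suppressed, and the paper's methods section should be consulted for the exact definition:

```latex
% Delay-dependent transfer entropy from source X to target Y, scanned over
% the assumed delay u; \mathbf{y}_{t-1} and \mathbf{x}_{t-u} denote embedded
% (delay-vector) states of target and source, respectively.
TE(X \to Y, u) \;=\; \sum p\!\left(y_t, \mathbf{y}_{t-1}, \mathbf{x}_{t-u}\right)
\log \frac{p\!\left(y_t \mid \mathbf{y}_{t-1}, \mathbf{x}_{t-u}\right)}
          {p\!\left(y_t \mid \mathbf{y}_{t-1}\right)}
```

The interaction delay is then estimated as the value of u that maximizes TE(X → Y, u); crucially, the conditioning on the target's own immediately preceding state is retained for every assumed delay u.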


pone-0055809-g006: Test case (III). Transfer entropy values and significance as a function of the assumed delay for two unidirectionally coupled autoregressive systems with multiple delays. The simulated delays were 15, 20, 25, 30 and 35 sampling points. The rest of the parameters and criteria used are the same as those in Figure 5.

Mentions: In test case (III), we investigated two unidirectionally coupled AR processes in which multiple interaction delays were present. Figure 6 reveals that these can be readily detected by scanning the assumed delay. Well-separated peaks indicate the presence of multiple delays around values of ∼14, 19, 25, and 30 sampling units for the simulated direction of interaction. The curve displays an additional shoulder at a longer assumed delay. Nominal delays in the simulations were 15, 20, 25, 30 and 35, and thus all but the longest delay were correctly detected. The longest delay is most likely not detected because much of the information from the relevant source state has already been communicated to the target over the several shorter delays, owing to the inherent memory of the AR(10) process, so the source no longer provides enough novel information, given the past state of the target, to evoke a clear peak. However, the transfer entropy values were indeed statistically significant up to an assumed delay of 35 units, in line with the maximal delay simulated.
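
The delay-scanning procedure can be sketched with a simple plug-in (binned) estimator of the delay-dependent transfer entropy I(Y_t ; X_{t−u} | Y_{t−1}). This toy example uses two AR(1) processes with a single coupling delay and scalar (non-embedded) conditioning, so all variable names and parameter values are illustrative only; the paper's own analyses rely on state-space embedding and more sophisticated estimators:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two unidirectionally coupled AR(1) processes, X -> Y with delay DELTA.
DELTA, N = 5, 20000
x = np.zeros(N)
y = np.zeros(N)
for t in range(1, N):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    drive = 0.8 * x[t - DELTA] if t >= DELTA else 0.0
    y[t] = 0.5 * y[t - 1] + drive + 0.1 * rng.normal()

def binned(v, nbins=8):
    """Discretize a signal into roughly equiprobable bins."""
    edges = np.quantile(v, np.linspace(0, 1, nbins + 1)[1:-1])
    return np.digitize(v, edges)

def te(x, y, u, nbins=8):
    """Plug-in estimate of TE(u) = I(Y_t ; X_{t-u} | Y_{t-1}) in bits."""
    yt = binned(y[u:])          # target present, Y_t
    yp = binned(y[u - 1:-1])    # target past, Y_{t-1}
    xs = binned(x[:-u])         # delayed source, X_{t-u}
    joint, _ = np.histogramdd(np.column_stack([yt, yp, xs]), bins=nbins)
    p = joint / joint.sum()

    def H(pm):  # Shannon entropy of a (marginal) probability array
        nz = pm[pm > 0]
        return -(nz * np.log2(nz)).sum()

    # I(A;C|B) = H(A,B) + H(B,C) - H(B) - H(A,B,C), axes: 0=Y_t, 1=Y_{t-1}, 2=X_{t-u}
    return H(p.sum(2)) + H(p.sum(0)) - H(p.sum((0, 2))) - H(p)

delays = range(1, 11)
te_u = [te(x, y, u) for u in delays]
u_hat = list(delays)[int(np.argmax(te_u))]  # expected to peak near the simulated delay
```

Scanning u and taking the argmax recovers the coupling delay here because the conditioning on Y_{t−1} removes the target's self-predictable information, so only the correctly aligned source state contributes a clear peak.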

