When and How-Long: A Unified Approach for Time Perception.

Maniadakis M, Trahanias P - Front Psychol (2016)

Bottom Line: This information, although rather standard in humans, is largely missing from artificial cognitive systems. In this work we consider how a time perception model based on neural networks and the Striatal Beat Frequency (SBF) theory can be extended so that, besides the duration of events, it also encodes their time of occurrence in memory. The extended model is capable of supporting skills assumed in temporal cognition and of answering time-related questions about the unfolded events.


Affiliation: Computational Vision and Robotics Laboratory, Institute of Computer Science, Foundation for Research and Technology Hellas, Heraklion, Greece.

ABSTRACT
The representation of the environment assumes the encoding of four basic dimensions in the brain, that is, 3D space and time. The vital role of time in cognition is a topic that has recently attracted increasing research interest. Surprisingly, the scientific community investigating mind-time interactions has mainly focused on interval timing, paying less attention to the encoding and processing of distant moments. The present work highlights two basic capacities that are necessary for developing temporal cognition in artificial systems. In particular, the seamless integration of agents into the environment assumes that they are able to consider when events have occurred and how long they have lasted. This information, although rather standard in humans, is largely missing from artificial cognitive systems. In this work we consider how a time perception model based on neural networks and the Striatal Beat Frequency (SBF) theory can be extended so that, besides the duration of events, it also encodes their time of occurrence in memory. The extended model is capable of supporting skills assumed in temporal cognition and of answering time-related questions about the unfolded events.
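The core idea behind the SBF theory mentioned above can be illustrated with a minimal numerical sketch. This is an illustration only, not the paper's implementation: the oscillator frequencies, the readout, and all variable names are assumed. A bank of oscillators is reset at event onset; the joint phase pattern at any later moment uniquely labels the elapsed time within the beat period, so a "coincidence detector" trained on the pattern at a target duration responds maximally when that duration elapses again.

```python
import numpy as np

# Assumed oscillator bank (Hz); frequencies chosen only for illustration.
freqs = np.array([5.0, 5.5, 6.0, 6.5, 7.0])

def phase_pattern(t):
    """Full phase state (cos and sin components) t seconds after a reset."""
    ph = 2 * np.pi * freqs * t
    return np.concatenate([np.cos(ph), np.sin(ph)])

target = 1.8                     # seconds: the duration to be encoded
weights = phase_pattern(target)  # store the phase snapshot at the target

# Readout: the detector's response peaks when the stored pattern recurs,
# i.e., at the trained duration (and again one full beat period later).
ts = np.linspace(0.0, 3.0, 3001)
response = np.array([weights @ phase_pattern(t) for t in ts])
estimate = ts[np.argmax(response)]
print(round(estimate, 3))  # -> 1.8
```

Because the frequencies here are all multiples of 0.5 Hz, the joint pattern repeats every 2 s; durations are decodable unambiguously only within that beat period, which is why SBF-style models use many oscillators with incommensurate frequencies to extend the range.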




Figure 6: A summary of the internal dynamics (neural activations) developed in the model over time. Panel (A) shows neural activation in tSen1, the first, receiving component of the recurrent TimeSense module. Panel (B) shows neural activation in tSen2, the second, output component of the TimeSense module. Panel (C) shows neural activation in the t-Duration module, which supports interval timing. Arrows indicate the times at which events were experienced. The width of each peak is proportional to the duration of the corresponding event, thereby enabling the accurate duration estimation shown in Figure 5A. Panel (D) shows neural activation in the t-Distance module, which supports past perception. The neural activities shown in yellow and cyan implement internal tracking of elapsed time, as illustrated by the log-shaped dotted black lines following their peaks.

Mentions: The development of temporal processing internally in the model is shown in Figures 6A–D. The four plots show neural activity in the t-Sen1, t-Sen2, t-Duration, and t-Distance modules over the whole period of perceiving the six events. In the first stage of processing (Figure 6A), neural activity is mainly driven by the input oscillatory signals. Subsequently (Figure 6B), oscillations are mixed to produce a complex, temporally structured neural activity. The first event occurs at approximately time 150. This event appears to trigger a more structured oscillation fusion in t-Sen2, resulting in neural activity that resembles oscillation multiplexing. While the present model was not implemented on the basis of integrating oscillations that correspond to the known brain rhythms (delta band to gamma band), our results show that combining input signals at different frequencies may contribute significantly to the sense of time, as also suggested by Kononowicz and van Wassenhove (2016).
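Why mixing oscillations at different frequencies should help time-keeping can be made concrete with a small sketch (the frequencies and names below are hypothetical, not taken from the model): a single oscillator confuses any two moments that are one period apart, whereas the joint state of two oscillators repeats only at their much longer common (beat) period.

```python
import numpy as np

# Two assumed oscillators; each alone repeats in under 0.17 s, but their
# joint state repeats only every 1 s (the reciprocal of the 1 Hz spacing).
f1, f2 = 6.0, 7.0  # Hz

def state(t):
    """Joint phase state of both oscillators at time t."""
    return np.array([np.cos(2*np.pi*f1*t), np.sin(2*np.pi*f1*t),
                     np.cos(2*np.pi*f2*t), np.sin(2*np.pi*f2*t)])

# A single oscillator cannot tell t = 0.1 s from t = 0.1 + 1/f1 s ...
t_a, t_b = 0.1, 0.1 + 1/f1
single = abs(np.cos(2*np.pi*f1*t_a) - np.cos(2*np.pi*f1*t_b))
# ... but the mixed two-oscillator state separates them clearly.
joint = np.linalg.norm(state(t_a) - state(t_b))
print(single < 1e-9, joint > 0.5)  # -> True True
```

The same logic underlies the multiplexed activity described above: superimposing signals at different frequencies extends the range of moments that can be told apart from the instantaneous network state alone.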

