The chronotron: a neuron that learns to fire temporally precise spike patterns.

Florian RV - PLoS ONE (2012)

Bottom Line: When the input is noisy, the classification also leads to noise reduction. The chronotrons can model neurons that encode information in the time of the first spike relative to the onset of salient stimuli, or neurons in oscillatory networks that encode information in the phases of spikes relative to the background oscillation. Our results show that firing one spike per cycle optimizes memory capacity in neurons encoding information in the phase of firing relative to a background rhythm.


Affiliation: Center for Cognitive and Neural Studies, Romanian Institute of Science and Technology, Cluj-Napoca, Romania. florian@rist.ro

ABSTRACT
In many cases, neurons process information carried by the precise timings of spikes. Here we show how neurons can learn to generate specific temporally precise output spikes in response to input patterns of spikes having precise timings, thus processing and memorizing information that is entirely temporally coded, both as input and as output. We introduce two new supervised learning rules for spiking neurons with temporal coding of information (chronotrons), one that provides high memory capacity (E-learning), and one that has a higher biological plausibility (I-learning). With I-learning, the neuron learns to fire the target spike trains through synaptic changes that are proportional to the synaptic currents at the timings of real and target output spikes. We study these learning rules in computer simulations where we train integrate-and-fire neurons. Both learning rules allow neurons to fire at the desired timings, with sub-millisecond precision. We show how chronotrons can learn to classify their inputs, by firing identical, temporally precise spike trains for different inputs belonging to the same class. When the input is noisy, the classification also leads to noise reduction. We compute lower bounds for the memory capacity of chronotrons and explore the influence of various parameters on chronotrons' performance. The chronotrons can model neurons that encode information in the time of the first spike relative to the onset of salient stimuli or neurons in oscillatory networks that encode information in the phases of spikes relative to the background oscillation. Our results show that firing one spike per cycle optimizes memory capacity in neurons encoding information in the phase of firing relative to a background rhythm.
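The abstract describes I-learning as synaptic changes proportional to the synaptic currents at the timings of actual and target output spikes. The sketch below illustrates that idea for a current-based integrate-and-fire neuron with exponential synaptic kernels; the function names, time constants, and the learning rate `gamma` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def synaptic_current(t, input_spikes, tau_s=5.0):
    """Synaptic current of one synapse at time t (ms): a sum of
    exponential kernels triggered by that synapse's past input spikes."""
    dt = t - np.asarray(input_spikes, dtype=float)
    dt = dt[dt >= 0.0]            # only spikes that arrived before t contribute
    return np.exp(-dt / tau_s).sum()

def i_learning_update(weights, inputs, actual_spikes, target_spikes,
                      gamma=0.01):
    """I-learning-style update (hedged sketch): each synapse changes in
    proportion to its synaptic current at the output spike timings --
    potentiation at target spike times, depression at the actual
    (erroneous) output spike times."""
    dw = np.zeros_like(weights)
    for i, spikes_i in enumerate(inputs):
        dw[i] += gamma * sum(synaptic_current(tf, spikes_i)
                             for tf in target_spikes)
        dw[i] -= gamma * sum(synaptic_current(tf, spikes_i)
                             for tf in actual_spikes)
    return weights + dw
```

With this shape of update, a synapse that was active shortly before a missed target spike is strengthened, pulling the neuron toward firing at the desired timing on the next presentation.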


pone-0040233-g017: The performance of learning rules when their parameters were optimized for fast learning at a relatively low load. (A) The number of learning epochs required for correct learning as a function of the load. Correct learning was not achieved for I-learning and ReSuMe for loads larger than 0.03. (B) The number of learning epochs required for correct learning as a function of the number of input synapses. Correct learning was not achieved for I-learning for numbers of input synapses larger than 6,000. Averages and standard deviations over 500 realizations. The arrows indicate the conditions for which the parameters were optimized.

Mentions: In Fig. 17, parameters were optimized to minimize the average number of learning epochs needed for correct learning in a setup with a relatively low load. For this optimized setup and the optimal parameters, ReSuMe learned fastest (16.75 ± 7.43 epochs), followed by I-learning (23.39 ± 6.87 epochs) and E-learning (36.48 ± 7.61 epochs). However, the advantages of the first two learning rules over E-learning disappeared for setups with higher loads or higher numbers of input synapses than the optimized setup, when the other parameters were kept the same.

