A Unified Framework for Reservoir Computing and Extreme Learning Machines based on a Single Time-delayed Neuron.

Ortín S, Soriano MC, Pesquera L, Brunner D, San-Martín D, Fischer I, Mirasso CR, Gutiérrez JM - Sci Rep (2015)

Bottom Line: The reservoir is built within the delay line, employing a number of "virtual" neurons. One key advantage of this approach is that it can be implemented efficiently in hardware. We show that the reservoir computing implementation, in this case optoelectronic, is also capable of realizing extreme learning machines, demonstrating the unified framework for both schemes in software as well as in hardware.

View Article: PubMed Central - PubMed

Affiliation: Instituto de Física de Cantabria, CSIC-Universidad de Cantabria, E-39005 Santander, Spain.

ABSTRACT
In this paper we present a unified framework for extreme learning machines and reservoir computing (echo state networks), which can be physically implemented using a single nonlinear neuron subject to delayed feedback. The reservoir is built within the delay line, employing a number of "virtual" neurons. These virtual neurons receive random projections from the input layer containing the information to be processed. One key advantage of this approach is that it can be implemented efficiently in hardware. We show that the reservoir computing implementation, in this case optoelectronic, is also capable of realizing extreme learning machines, demonstrating the unified framework for both schemes in software as well as in hardware.
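To make the delay-line architecture concrete, the following is a minimal numerical sketch of a reservoir built from a single nonlinear node with delayed feedback: a scalar input is multiplied by a random mask so that each of the N "virtual" neurons along the delay line receives its own random projection of the input. All parameter values here (N, the input scaling, the feedback strength, the tanh nonlinearity) are illustrative assumptions, not taken from the paper's optoelectronic setup.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50        # number of "virtual" neurons along the delay line (assumed)
steps = 200   # number of input samples to process
eta = 0.5     # input scaling (assumed value)
gamma = 0.8   # delayed-feedback strength (assumed value)

mask = rng.uniform(-1, 1, N)   # random input projections ("mask")
u = rng.uniform(-1, 1, steps)  # scalar input sequence

states = np.zeros((steps, N))  # reservoir state: one row per input sample
x = np.zeros(N)                # virtual-neuron values along the delay line

for t in range(steps):
    for k in range(N):
        # each virtual neuron combines the masked input with its own
        # delayed value from one delay period earlier
        x[k] = np.tanh(gamma * x[k] + eta * mask[k] * u[t])
    states[t] = x
```

A linear readout trained on `states` (e.g. by ridge regression) then plays the role of the output layer in both the reservoir computing and the extreme learning machine configurations.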

No MeSH data available.


Memory function for the numerical (dashed lines) and experimental (solid lines) realizations of (a) ESN with d = 1 and (b) ELM with d = 3, 6 and 12, respectively.
© Copyright Policy - open-access


Mentions: In order to compare the experimental and numerical results, we measure the signal-to-noise ratio of the experimental realizations and use the corresponding system and quantization noises in the numerical simulations. We find that the signal-to-noise ratio (SNR) of a single measurement is ~24 dB, although the experimental SNR can be increased to 40 dB by averaging the detection over ten repetitions of the measurement. Figure 3(a) shows the memory function m(i) for the ESN implementation. We find excellent agreement between numerics and experiments. Experimental memory functions with SNR ≈ 40 dB and SNR ≈ 24 dB yield memory capacities of MC = 8.5 and MC = 6, respectively. Since noise degrades the memory capacity of the system, we also show the numerical results without noise, which yield an MC of around 12. The memory capacity of the noise-free system is thus approximately twice that of the system with 24 dB of noise.
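The memory function and memory capacity discussed above can be estimated as follows: for each delay i, a linear readout is trained (here with ridge regression) to reconstruct the input i steps in the past, m(i) is the squared correlation between the reconstruction and the delayed input, and the memory capacity is MC = Σ m(i). The toy reservoir below is a generic stand-in with fading memory, not the paper's delay-line or optoelectronic system; all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
T, N, max_lag = 1000, 50, 30

# Toy reservoir with fading memory: each neuron mixes its previous value
# (with its own leak rate) with a randomly weighted copy of the input.
u = rng.uniform(-1, 1, T)
w_in = rng.uniform(-1, 1, N)         # random input weights (assumed)
a = rng.uniform(0.2, 0.95, N)        # per-neuron feedback strengths (assumed)
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(a * x + w_in * u[t])
    X[t] = x

def memory_function(X, u, i, ridge=1e-6):
    """m(i): squared correlation between a trained linear readout
    and the input delayed by i steps."""
    Xi, target = X[i:], u[: len(u) - i]
    # ridge-regularized least-squares readout
    w = np.linalg.solve(Xi.T @ Xi + ridge * np.eye(Xi.shape[1]), Xi.T @ target)
    return np.corrcoef(Xi @ w, target)[0, 1] ** 2

m = [memory_function(X, u, i) for i in range(1, max_lag + 1)]
MC = sum(m)  # memory capacity: area under the memory function
```

In this idealized noise-free setting m(i) decays with the delay i; adding measurement noise to `X` before training, as in the experimental comparison above, lowers the resulting MC.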


A Unified Framework for Reservoir Computing and Extreme Learning Machines based on a Single Time-delayed Neuron.

Ortín S, Soriano MC, Pesquera L, Brunner D, San-Martín D, Fischer I, Mirasso CR, Gutiérrez JM - Sci Rep (2015)

Memory function for the numerical (dashed lines) and experimental (solid lines) realizations of (a) ESN with d = 1 and (b) ELM with d = 3, 6 and 12, respectively.
© Copyright Policy - open-access
Related In: Results  -  Collection

License
Show All Figures
getmorefigures.php?uid=PMC4597340&req=5

f3: Memory function for the numerical (dashed lines) and experimental (solid lines) realizations of (a) ESN with d = 1 and (b) ELM with d = 3, 6 and 12, respectively.
Mentions: In order to compare the experimental and numerical results, we measure the signal to noise ratio of the experimental realizations and use the corresponding system and quantization noises in the numerical simulations. We find that the signal to noise ratio (SNR) of a single measurement is ~24 dB, although the experimental SNR can be increased to 40 dB by averaging the detection over ten repetitions of the measurement. Figure 3(a) shows the memory function, m(i) for the ESN implementation. We find excellent agreement between numerics and experiments. Experimental memory functions with SNR ≈40 dB and SNR ≈24 dB yield a memory capacity of MC = 8.5 and 6, respectively. Since noise degrades the memory capacity of the system, we also show the numerical results without noise, which yield a MC around 12. The memory capacity of the noise-free system is approximately twice the MC of the system with 24 dB noise.

Bottom Line: The reservoir is built within the delay-line, employing a number of "virtual" neurons.One key advantage of this approach is that it can be implemented efficiently in hardware.We show that the reservoir computing implementation, in this case optoelectronic, is also capable to realize extreme learning machines, demonstrating the unified framework for both schemes in software as well as in hardware.

View Article: PubMed Central - PubMed

Affiliation: Instituto de Física de Cantabria, CSIC-Universidad de Cantabria, E-39005 Santander, Spain.

ABSTRACT
In this paper we present a unified framework for extreme learning machines and reservoir computing (echo state networks), which can be physically implemented using a single nonlinear neuron subject to delayed feedback. The reservoir is built within the delay-line, employing a number of "virtual" neurons. These virtual neurons receive random projections from the input layer containing the information to be processed. One key advantage of this approach is that it can be implemented efficiently in hardware. We show that the reservoir computing implementation, in this case optoelectronic, is also capable to realize extreme learning machines, demonstrating the unified framework for both schemes in software as well as in hardware.

No MeSH data available.