Designing Artificial Neural Networks Using Particle Swarm Optimization Algorithms.

Garro BA, Vázquez RA - Comput Intell Neurosci (2015)

Bottom Line: These functions are based on the mean square error (MSE) and the classification error (CER) and implement a strategy to avoid overtraining and to reduce the number of connections in the ANN. In addition, the ANN designed with the proposed methodology is compared with those designed manually using the well-known Back-Propagation and Levenberg-Marquardt Learning Algorithms. Finally, the accuracy of the method is tested with different nonlinear pattern classification problems.

View Article: PubMed Central - PubMed

Affiliation: Instituto de Investigaciones en Matemáticas Aplicadas y en Sistemas, Universidad Nacional Autónoma de México, Ciudad Universitaria, 04510 Mexico City, DF, Mexico.

ABSTRACT
Artificial Neural Network (ANN) design is a complex task because its performance depends on the architecture, the selected transfer function, and the learning algorithm used to train the set of synaptic weights. In this paper we present a methodology that automatically designs an ANN using particle swarm optimization algorithms such as Basic Particle Swarm Optimization (PSO), Second Generation of Particle Swarm Optimization (SGPSO), and a New Model of PSO called NMPSO. The aim of these algorithms is to evolve, at the same time, the three principal components of an ANN: the set of synaptic weights, the connections or architecture, and the transfer functions for each neuron. Eight different fitness functions were proposed to evaluate the fitness of each solution and find the best design. These functions are based on the mean square error (MSE) and the classification error (CER) and implement a strategy to avoid overtraining and to reduce the number of connections in the ANN. In addition, the ANN designed with the proposed methodology is compared with those designed manually using the well-known Back-Propagation and Levenberg-Marquardt Learning Algorithms. Finally, the accuracy of the method is tested with different nonlinear pattern classification problems.
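
The abstract does not give the particle encoding or the exact form of the penalty, so the following Python sketch is illustrative only: it assumes a flat real-valued particle split into synaptic weights, a connection mask, and per-neuron transfer-function indices for a single hidden layer, and a fitness that adds a connection-count penalty to the MSE. The layout, the penalty weight lambda_conn, and the transfer-function pool are hypothetical, not taken from the paper.

import numpy as np

# Illustrative sketch only: particle layout, penalty weight, and
# transfer-function pool are assumptions, not the paper's encoding.
TRANSFER_FUNCS = [np.tanh, lambda x: 1.0 / (1.0 + np.exp(-x)), lambda x: x]

def decode_particle(particle, n_in, n_hidden, n_out):
    # Split a flat particle vector into weights, a connection mask,
    # and per-neuron transfer-function indices (hypothetical layout).
    n_w = n_in * n_hidden + n_hidden * n_out
    w = particle[:n_w]
    mask = particle[n_w:2 * n_w] > 0.0  # active connections
    tf_idx = np.abs(particle[2 * n_w:2 * n_w + n_hidden]).astype(int) % len(TRANSFER_FUNCS)
    w_ih = (w[:n_in * n_hidden] * mask[:n_in * n_hidden]).reshape(n_in, n_hidden)
    w_ho = (w[n_in * n_hidden:] * mask[n_in * n_hidden:]).reshape(n_hidden, n_out)
    return w_ih, w_ho, tf_idx, mask

def fitness(particle, X, T, n_in, n_hidden, n_out, lambda_conn=0.01):
    # MSE plus a penalty on the number of active connections, mirroring the
    # idea of combining the error measure with pressure to reduce connections.
    w_ih, w_ho, tf_idx, mask = decode_particle(particle, n_in, n_hidden, n_out)
    H = np.column_stack([TRANSFER_FUNCS[tf_idx[j]](X @ w_ih[:, j]) for j in range(n_hidden)])
    Y = H @ w_ho
    mse = np.mean((Y - T) ** 2)
    return mse + lambda_conn * mask.sum()

A CER-based variant of the same idea would simply replace the MSE term with the fraction of misclassified patterns.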

No MeSH data available.



alg1: New Model of PSO pseudocode.

Mentions: The NMPSO combines varying schemes for the inertia weight ω and the acceleration coefficients c1 and c2, velocity resetting, crossover and mutation operators, and dynamic random neighbourhoods [13]. The NMPSO algorithm is described in Algorithm 1.
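
Algorithm 1 itself is not reproduced in this record. As a rough illustration of the varying-coefficient idea only, a canonical global-best PSO with linearly scheduled ω, c1, and c2 might look like the Python sketch below; the schedules, ranges, and the name pso_minimize are assumptions, and the velocity resetting, crossover/mutation operators, and dynamic neighbourhoods of the full NMPSO [13] are omitted.

import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200,
                 w_max=0.9, w_min=0.4, c1_max=2.5, c1_min=0.5,
                 c2_max=2.5, c2_min=0.5, bounds=(-1.0, 1.0)):
    # Canonical global-best PSO with linearly varying inertia weight and
    # acceleration coefficients (illustrative schedules, not the NMPSO of [13]).
    lo, hi = bounds
    x = np.random.uniform(lo, hi, (n_particles, dim))  # positions
    v = np.zeros((n_particles, dim))                    # velocities
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()                # global best

    for t in range(iters):
        frac = t / iters
        w = w_max - (w_max - w_min) * frac              # inertia decreases over time
        c1 = c1_max - (c1_max - c1_min) * frac          # cognitive term decreases
        c2 = c2_min + (c2_max - c2_min) * frac          # social term increases
        r1 = np.random.rand(n_particles, dim)
        r2 = np.random.rand(n_particles, dim)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

Tying this to the ANN design task would amount to calling pso_minimize with the particle fitness sketched after the abstract, for example pso_minimize(lambda p: fitness(p, X, T, n_in, n_hidden, n_out), dim=...), with dim set to the particle length; the concrete dimension is left unspecified here.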

