Designing Artificial Neural Networks Using Particle Swarm Optimization Algorithms.

Garro BA, Vázquez RA - Comput Intell Neurosci (2015)

Bottom Line: These functions are based on the mean square error (MSE) and the classification error (CER) and implement a strategy to avoid overtraining and to reduce the number of connections in the ANN. In addition, the ANN designed with the proposed methodology is compared with those designed manually using the well-known Back-Propagation and Levenberg-Marquardt Learning Algorithms. Finally, the accuracy of the method is tested with different nonlinear pattern classification problems.


Affiliation: Instituto de Investigaciones en Matemáticas Aplicadas y en Sistemas, Universidad Nacional Autónoma de México, Ciudad Universitaria, 04510 Mexico City, DF, Mexico.

ABSTRACT
Artificial Neural Network (ANN) design is a complex task because its performance depends on the architecture, the selected transfer function, and the learning algorithm used to train the set of synaptic weights. In this paper we present a methodology that automatically designs an ANN using particle swarm optimization algorithms such as Basic Particle Swarm Optimization (PSO), Second Generation of Particle Swarm Optimization (SGPSO), and a New Model of PSO called NMPSO. The aim of these algorithms is to evolve, at the same time, the three principal components of an ANN: the set of synaptic weights, the connections or architecture, and the transfer functions for each neuron. Eight different fitness functions were proposed to evaluate the fitness of each solution and find the best design. These functions are based on the mean square error (MSE) and the classification error (CER) and implement a strategy to avoid overtraining and to reduce the number of connections in the ANN. In addition, the ANN designed with the proposed methodology is compared with those designed manually using the well-known Back-Propagation and Levenberg-Marquardt Learning Algorithms. Finally, the accuracy of the method is tested with different nonlinear pattern classification problems.
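
For readers who want a concrete picture of what evolving the weights, the connections, and the transfer functions "at the same time" can look like, the sketch below encodes all three pieces in a single flat particle vector and optimizes it with basic PSO against an MSE-based fitness. This is a minimal illustration with assumed names, dimensions, and decoding rules; it is not the authors' exact representation, nor one of their eight fitness functions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed pool of candidate transfer functions a particle can pick per neuron.
TRANSFER = [np.tanh, lambda x: 1.0 / (1.0 + np.exp(-x)), lambda x: x]

def decode(particle, n_in, n_hidden, n_out):
    """Split a flat particle into synaptic weights, a binary connection mask
    (pruned architecture), and transfer-function indices (one per hidden
    neuron plus one shared by the output layer)."""
    n_w = n_in * n_hidden + n_hidden * n_out
    w = particle[:n_w]
    mask = (particle[n_w:2 * n_w] > 0).astype(float)        # > 0 keeps the connection
    tf_idx = (np.abs(particle[2 * n_w:]) % len(TRANSFER)).astype(int)
    W1 = (w[:n_in * n_hidden] * mask[:n_in * n_hidden]).reshape(n_in, n_hidden)
    W2 = (w[n_in * n_hidden:] * mask[n_in * n_hidden:]).reshape(n_hidden, n_out)
    return W1, W2, tf_idx

def forward(X, W1, W2, tf_idx):
    n_hidden = W1.shape[1]
    H = np.column_stack([TRANSFER[tf_idx[j]](X @ W1[:, j]) for j in range(n_hidden)])
    return TRANSFER[tf_idx[n_hidden]](H @ W2)                # last index: output layer

def mse_fitness(particle, X, Y, n_in, n_hidden, n_out):
    W1, W2, tf_idx = decode(particle, n_in, n_hidden, n_out)
    return np.mean((Y - forward(X, W1, W2, tf_idx)) ** 2)    # lower is better

def basic_pso(fitness, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Standard PSO update: inertia plus attraction to personal and global bests."""
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([fitness(p) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

# Hypothetical usage on toy data (shapes assumed, not one of the paper's benchmarks):
n_in, n_hidden, n_out = 4, 5, 1
dim = 2 * (n_in * n_hidden + n_hidden * n_out) + n_hidden + 1
Xd = rng.random((100, n_in))
Yd = (Xd.sum(axis=1, keepdims=True) > 2.0).astype(float)
best, best_err = basic_pso(lambda p: mse_fitness(p, Xd, Yd, n_in, n_hidden, n_out), dim)
```

The SGPSO and NMPSO variants compared in the paper change the swarm dynamics rather than this encode-decode-evaluate loop, so the same decoding idea carries over to them.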

No MeSH data available.



Some ANNs designed using the NMPSO algorithm. (a) The best architecture for the liver disorders problem. (b) The best architecture for the object recognition problem. (c) The best architecture for the wine problem. (d) The best architecture for the glass problem.

Mentions: Figure 7 shows some of the best ANNs generated with the NMPSO algorithm. The fitness function used with the NMPSO algorithm was the CER function.
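
For reference, the CER mentioned here is simply the fraction of misclassified patterns. A minimal sketch of how such a fitness could be computed follows; the winner-takes-all decoding and the optional penalty on active connections (included because the abstract also targets fewer connections) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def cer_fitness(y_true, y_pred, mask=None, penalty=0.01):
    """Classification error rate (CER): fraction of misclassified patterns,
    decoded by winner-takes-all over the output neurons. The connection-count
    penalty is an assumed way to favor sparser architectures."""
    pred = np.argmax(y_pred, axis=1)
    true = np.argmax(y_true, axis=1)
    cer = np.mean(pred != true)
    if mask is not None:
        cer += penalty * mask.sum() / mask.size   # fraction of active connections
    return cer
```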

