A novel multiobjective evolutionary algorithm based on regression analysis.

Song Z, Wang M, Dai G, Vasile M - ScientificWorldJournal (2015)

Bottom Line: The results show that MMEA-RA outperforms RM-MEDA and NSGA-II on the test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA are also identified and discussed in this paper.

View Article: PubMed Central - PubMed

Affiliation: School of Computer, China University of Geosciences, Wuhan 430074, China.

ABSTRACT
As is known, the Pareto set of a continuous multiobjective optimization problem with m objective functions is, under some mild conditions, a piecewise continuous (m − 1)-dimensional manifold in the decision space. How to exploit this regularity in the design of multiobjective optimization algorithms has become a research focus. In this paper, based on this regularity, a model-based multiobjective evolutionary algorithm with regression analysis (MMEA-RA) is put forward to solve continuous multiobjective optimization problems with variable linkages. In the algorithm, the promising area in the decision space is modelled by a probability distribution whose centroid is an (m − 1)-dimensional piecewise continuous manifold. The least squares method is used to construct such a model. A selection strategy based on nondominated sorting is used to choose the individuals for the next generation. The new algorithm is tested and compared with NSGA-II and RM-MEDA. The results show that MMEA-RA outperforms RM-MEDA and NSGA-II on the test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA are also identified and discussed in this paper.
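The selection step the abstract mentions, nondominated sorting, can be sketched briefly. The following Python snippet is a minimal illustration in the NSGA-II style the paper compares against; the function names and example population are assumptions made for the illustration, not the authors' implementation.

```python
# Minimal sketch of nondominated sorting (NSGA-II-style selection idea).
# Names and data are illustrative; this is not the paper's actual code.

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_sort(objs):
    """Partition objective vectors into Pareto fronts.

    Returns a list of fronts, each a list of indices into objs;
    front 0 is the nondominated set.
    """
    n = len(objs)
    dominated_by = [[] for _ in range(n)]  # solutions each i dominates
    dom_count = [0] * n                    # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objs[i], objs[j]):
                dominated_by[i].append(j)
            elif dominates(objs[j], objs[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        k += 1
        fronts.append(nxt)
    return fronts[:-1]  # drop the trailing empty front

# Example: two objectives, four candidate solutions.
pop = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
print(nondominated_sort(pop))  # [[0, 1, 3], [2]]
```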


Figure 4: Four data points and two different curves.

Mentions: How can we choose the coefficients so that the curve best fits the data? The idea of the least squares approach is to find a curve that minimizes the error between the data y and the fitting curve f(x). As shown in Figure 4, we can first add up the lengths of all the solid and dashed vertical lines and then pick the curve with the minimum total error. The general expression for the error under the least squares approach is

$$\mathrm{err} = \sum_{i=1}^{n} d_i^{2} = \bigl(y_1 - f(x_1)\bigr)^{2} + \bigl(y_2 - f(x_2)\bigr)^{2} + \cdots + \bigl(y_n - f(x_n)\bigr)^{2}. \tag{7}$$

We want to minimize the error err in expression (7). Replacing f(x) in expression (7) with expression (6) gives

$$\mathrm{err} = \sum_{i=1}^{n} \Biggl(y_i - \sum_{k=0}^{j} a_k x_i^{k}\Biggr)^{2}, \tag{8}$$

where n is the number of data points given, i indexes the data point being summed, and j is the polynomial order. Finding the best-fitting curve thus means minimizing the squared distance error between the curve and the data points, that is, finding the set of coefficients a0, a1, …, aj that minimizes expression (8).
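To make the minimization of expression (8) concrete, here is a small Python sketch that fits a polynomial of order j by linear least squares over a Vandermonde design matrix. The data points are invented for the example (Figure 4's actual values are not given in the text), and the code is an illustration of the standard technique, not the paper's implementation.

```python
# Illustration of minimizing expression (8): fit an order-j polynomial to
# n data points by linear least squares. The data below are made up for
# the example; they are not Figure 4's actual points.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])  # four data points, as in Figure 4
y = np.array([1.0, 2.7, 5.8, 6.6])
j = 2                               # polynomial order

# Design matrix V[i, k] = x_i**k, so V @ a stacks f(x_i) = sum_k a_k x_i^k.
V = np.vander(x, j + 1, increasing=True)

# Solve min_a ||y - V a||^2, i.e. expression (8), by linear least squares.
a, *_ = np.linalg.lstsq(V, y, rcond=None)

err = np.sum((y - V @ a) ** 2)      # the minimized error of expression (7)
print("coefficients a0..aj:", a)
print("residual err:", err)
```

Solving the problem through the Vandermonde matrix makes explicit that, once the polynomial form (6) is substituted in, the fit is linear in the coefficients a0, …, aj, which is why a closed-form least squares solution exists.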

