The q-G method: A q-version of the Steepest Descent method for global optimization.

Soterroni AC, Galski RL, Scarabello MC, Ramos FM - Springerplus (2015)

Bottom Line: The q-gradient vector, or simply the q-gradient, is a generalization of the classical gradient vector based on the concept of Jackson's derivative from the q-calculus. Its use provides the algorithm with an effective mechanism for escaping from local minima. The algorithm has three free parameters and is implemented so that the search process gradually shifts from global exploration in the beginning to local exploitation in the end.


Affiliation: Laboratory of Computing and Applied Mathematics, National Institute for Space Research, São José dos Campos, Brazil.

ABSTRACT
In this work, the q-Gradient (q-G) method, a q-version of the Steepest Descent method, is presented. The main idea behind the q-G method is the use of the negative of the q-gradient vector of the objective function as the search direction. The q-gradient vector, or simply the q-gradient, is a generalization of the classical gradient vector based on the concept of Jackson's derivative from the q-calculus. Its use provides the algorithm with an effective mechanism for escaping from local minima. The q-G method reduces to the Steepest Descent method when the parameter q tends to 1. The algorithm has three free parameters and is implemented so that the search process gradually shifts from global exploration in the beginning to local exploitation in the end. We evaluated the q-G method on 34 test functions, and compared its performance with 34 optimization algorithms, including derivative-free algorithms and the Steepest Descent method. Our results show that the q-G method is competitive and has great potential for solving multimodal optimization problems.
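The q-gradient at the heart of the method can be sketched numerically. The following is a minimal Python illustration, not the authors' code: the function names and the component-wise construction shown here are assumptions based only on the definition of Jackson's derivative, D_q f(x) = (f(qx) - f(x)) / ((q - 1)x), applied per coordinate with its own parameter q_i.

```python
import numpy as np

def jackson_derivative(f, x, q):
    """Jackson's q-derivative: D_q f(x) = (f(q x) - f(x)) / ((q - 1) x),
    defined for x != 0 and q != 1; as q -> 1 it recovers f'(x)."""
    return (f(q * x) - f(x)) / ((q - 1.0) * x)

def q_gradient(f, x, q):
    """q-gradient of f: R^n -> R; the i-th component is the Jackson
    derivative with respect to x_i, each with its own parameter q_i."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        xq = x.copy()
        xq[i] = q[i] * x[i]          # perturb only the i-th coordinate
        g[i] = (f(xq) - fx) / ((q[i] - 1.0) * x[i])
    return g

# For f(x) = x1^2 + x2^2 the q-gradient component is (q_i + 1) * x_i,
# so with q close to 1 it approaches the classical gradient (2 x_i).
f = lambda x: x[0]**2 + x[1]**2
print(q_gradient(f, [1.0, 2.0], [1.001, 1.001]))  # ≈ [2.001, 4.002]
```

When q tends to 1 the q-gradient reduces to the classical gradient, which is the sense in which the q-G method reduces to Steepest Descent.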


Fig. 3: Behavior of the q-G method for different values of the parameter q along the iterations k: a (local search) and b (global search).

Mentions: Figure 3 illustrates the points sampled by the q-G method, with two different strategies to generate the parameter q, over a multimodal function. The initial point of the search and the strategy adopted to define the step length are the same in both cases. Note that the function f has one local minimum and one global minimum. In Fig. 3a, the parameter q is fixed and close to 1 along all the iterations k and, consequently, the q-G method performs a local search, converging to the local minimum nearest to the starting point. In Fig. 3b, the parameters are different from 1 in the beginning and tend to 1 along the iterative procedure, with the q-G method performing a global search: exploration of the search space followed by exploitation of the vicinity of the global minimum. As can be seen in Fig. 3, the possibility of having values of the parameter q that are not close to 1 enables the method to move against the local descent direction, escape from the local minimum and sample other regions of the search space. In other words, when the values of q are sufficiently different from 1, the q-gradient vector can point to any region of the search space. This potentially allows the q-G method to move towards the global minimum.
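The global-to-local transition described above can be sketched in code. This is an illustrative Python sketch, not the authors' implementation: the log-normal draw for q, the decay factor beta, the fixed-length step and all parameter values are assumptions chosen only to show the mechanism, namely that the spread of q starts large (exploration) and decays toward q = 1, where the method behaves like Steepest Descent.

```python
import numpy as np

rng = np.random.default_rng(0)

def q_gradient(f, x, q):
    """Component-wise q-gradient via Jackson's derivative, with a classical
    finite-difference fallback where the denominator (q_i - 1) x_i vanishes."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        denom = (q[i] - 1.0) * x[i]
        xi = x.copy()
        if abs(denom) < 1e-12:            # near x_i = 0 or q_i = 1
            xi[i] += 1e-8
            g[i] = (f(xi) - fx) / 1e-8
        else:
            xi[i] = q[i] * x[i]
            g[i] = (f(xi) - fx) / denom
    return g

def qg_minimize(f, x0, sigma0=0.5, beta=0.995, step=0.05, iters=500):
    """q-G style search: step along the negative q-gradient, drawing each q_i
    from a log-normal centred at 1 whose spread sigma shrinks every iteration,
    so the search shifts from global exploration to local exploitation."""
    x = np.asarray(x0, dtype=float)
    sigma = sigma0
    best_x, best_f = x.copy(), f(x)
    for _ in range(iters):
        q = np.exp(rng.normal(0.0, sigma, size=x.size))  # q -> 1 as sigma -> 0
        g = q_gradient(f, x, q)
        x = x - step * g / (np.linalg.norm(g) + 1e-12)   # fixed-length step
        fx = f(x)
        if fx < best_f:                                  # keep the best point seen
            best_x, best_f = x.copy(), fx
        sigma *= beta                                    # anneal toward Steepest Descent
    return best_x, best_f

# Sanity check on a convex bowl: the search should settle near the minimum.
x_best, f_best = qg_minimize(lambda x: np.sum(x**2), [3.0, 4.0])
print(x_best, f_best)
```

Early on, q far from 1 lets the search direction deviate from (even oppose) the local descent direction, which is what allows escapes from local minima; as sigma decays the draws concentrate near q = 1 and the iteration exploits the best basin found.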

