The q-G method: A q-version of the Steepest Descent method for global optimization.

Soterroni AC, Galski RL, Scarabello MC, Ramos FM - Springerplus (2015)

Bottom Line: The q-G method reduces to the Steepest Descent method when the parameter q tends to 1. We evaluated the q-G method on 34 test functions, and compared its performance with 34 optimization algorithms, including derivative-free algorithms and the Steepest Descent method. Our results show that the q-G method is competitive and has a great potential for solving multimodal optimization problems.

View Article: PubMed Central - PubMed

Affiliation: Laboratory of Computing and Applied Mathematics, National Institute for Space Research, São José dos Campos, Brazil.

ABSTRACT
In this work, the q-Gradient (q-G) method, a q-version of the Steepest Descent method, is presented. The main idea behind the q-G method is the use of the negative of the q-gradient vector of the objective function as the search direction. The q-gradient vector, or simply the q-gradient, is a generalization of the classical gradient vector based on the concept of Jackson's derivative from the q-calculus. Its use provides the algorithm an effective mechanism for escaping from local minima. The q-G method reduces to the Steepest Descent method when the parameter q tends to 1. The algorithm has three free parameters and it is implemented so that the search process gradually shifts from global exploration in the beginning to local exploitation in the end. We evaluated the q-G method on 34 test functions, and compared its performance with 34 optimization algorithms, including derivative-free algorithms and the Steepest Descent method. Our results show that the q-G method is competitive and has a great potential for solving multimodal optimization problems.
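The descent loop described in the abstract can be sketched in Python. This is a minimal illustrative sketch, not the authors' implementation: the log-normal sampling of q, the fixed step size `alpha`, the shrink rate `decay`, and the finite-difference fallback are all assumptions chosen only to show the idea that q is drawn away from 1 early on (global exploration) and concentrates near 1 later (local exploitation, where the method behaves like Steepest Descent).

```python
import random

def q_gradient(f, x, q, h=1e-8):
    """Numerical q-gradient of a multivariate f: the i-th component is
    Jackson's q-derivative with respect to x_i, i.e. the slope of the
    secant through (x_i, f) and (q*x_i, f). Falls back to a classical
    forward difference when x_i is near zero or q is near 1."""
    g = []
    for i in range(len(x)):
        xq = list(x)
        if abs(x[i]) > 1e-12 and abs(q - 1.0) > 1e-12:
            xq[i] = q * x[i]
            g.append((f(xq) - f(x)) / (xq[i] - x[i]))
        else:
            xq[i] = x[i] + h
            g.append((f(xq) - f(x)) / h)
    return g

def qg_descent(f, x0, steps=200, alpha=0.05, sigma0=0.5, decay=0.97):
    """Illustrative q-G loop: step against the q-gradient while the
    spread of q shrinks, so q -> 1 and the search gradually shifts
    from global exploration to local, Steepest-Descent-like steps."""
    x, sigma = list(x0), sigma0
    for _ in range(steps):
        q = random.lognormvariate(0.0, sigma)  # concentrates near 1 as sigma -> 0
        g = q_gradient(f, x, q)
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
        sigma *= decay  # hypothetical annealing schedule
    return x
```

For example, on the convex sphere function `f(x) = sum(x_i**2)` the iterates contract towards the minimizer at the origin, since each q-gradient component is a positive multiple of x_i.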

No MeSH data available.



Fig1: Geometric interpretation of the classical derivative (dotted line) and the q-derivative for different values of the parameter q

Mentions: For a function of one variable f(x), the gradient is simply the derivative. Geometrically, it is the slope of the tangent line at a given point x; see Fig. 1. Similarly, the q-gradient of f is the q-derivative, which also has a straightforward geometric interpretation: it is the slope of the secant line passing through the points [x, f(x)] and [qx, f(qx)]. It is immediately evident that the sign of the q-derivative can be either positive or negative, depending on the value of the parameter q. For some values of q (Fig. 1), the sign of the q-derivative is positive and the q-G method moves to the left, as the Steepest Descent method would do. However, for other values of q the sign of the q-derivative is negative, which potentially allows the q-G method to move to the right, towards the global minimum of f.
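The secant-slope interpretation above translates directly into code. The sketch below is a minimal, hedged illustration of Jackson's q-derivative for a one-variable function; it assumes x ≠ 0 and q ≠ 1 so the secant is well defined.

```python
def q_derivative(f, x, q):
    """Jackson's q-derivative: the slope of the secant line through
    (x, f(x)) and (q*x, f(q*x)). Assumes x != 0 and q != 1."""
    return (f(q * x) - f(x)) / (q * x - x)

# For f(x) = x**2 the classical derivative at x = 1 is 2.
f = lambda x: x ** 2
print(q_derivative(f, 1.0, 0.5))    # → 1.5 (secant slope, differs from 2)
print(q_derivative(f, 1.0, 0.999))  # close to 2: recovers the tangent as q -> 1
```

As q tends to 1 the secant collapses onto the tangent, which is the geometric counterpart of the q-G method reducing to Steepest Descent in that limit.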


The q-G method : A q-version of the Steepest Descent method for global optimization.

Soterroni AC, Galski RL, Scarabello MC, Ramos FM - Springerplus (2015)

Geometric interpretation of the classical derivative (dotted line) and the q-derivative for different values of the parameter q
© Copyright Policy - OpenAccess
Related In: Results  -  Collection

License
Show All Figures
getmorefigures.php?uid=PMC4628006&req=5

Fig1: Geometric interpretation of the classical derivative (dotted line) and the q-derivative for different values of the parameter q
Mentions: The gradient of one-variable function f(x) is simply the derivative. Geometrically, it is the slope of the tangent line at a given point x; see Fig. 1. Similarly, the q-gradient of f is the q-derivative that has also a straightforward geometric interpretation as the slope of the secant line passing through the points [x, f(x)] and [qx, f(qx)]. It is immediately evident that the sign of the q-derivative can be either positive or negative, depending on the value of the parameter q. For and (Fig. 1), the sign of the q-derivative is positive and the q-G method will move to the left as the Steepest Descent method would do. However, for the sign of the q-derivative is negative which potentially allows the q-G method to move to the right direction, towards the global minimum of f.Fig. 1

Bottom Line: The q-G method reduces to the Steepest Descent method when the parameter q tends to 1.We evaluated the q-G method on 34 test functions, and compared its performance with 34 optimization algorithms, including derivative-free algorithms and the Steepest Descent method.Our results show that the q-G method is competitive and has a great potential for solving multimodal optimization problems.

View Article: PubMed Central - PubMed

Affiliation: Laboratory of Computing and Applied Mathematics, National Institute for Space Research, São José dos Campos, Brazil.

ABSTRACT
In this work, the q-Gradient (q-G) method, a q-version of the Steepest Descent method, is presented. The main idea behind the q-G method is the use of the negative of the q-gradient vector of the objective function as the search direction. The q-gradient vector, or simply the q-gradient, is a generalization of the classical gradient vector based on the concept of Jackson's derivative from the q-calculus. Its use provides the algorithm an effective mechanism for escaping from local minima. The q-G method reduces to the Steepest Descent method when the parameter q tends to 1. The algorithm has three free parameters and it is implemented so that the search process gradually shifts from global exploration in the beginning to local exploitation in the end. We evaluated the q-G method on 34 test functions, and compared its performance with 34 optimization algorithms, including derivative-free algorithms and the Steepest Descent method. Our results show that the q-G method is competitive and has a great potential for solving multimodal optimization problems.

No MeSH data available.