Evaluation and Generalization of the Cyclic Coordinate, Hooke-Jeeves, and Rosenbrock Methods

In optimization problems, we look for the best point for our problem. Problems are classified as one-variable or n-variable, and as constrained or unconstrained. In this paper, we evaluate three methods, Cyclic Coordinate, Hooke-Jeeves, and Rosenbrock, on n-dimensional unconstrained problems, and we discuss the advantages and disadvantages of these three methods. We offer solutions for some of the disadvantages and give generalized algorithms that improve the methods.


Introduction
In optimization problems, we look for the best point for our problem. Problems are classified as one-variable or n-variable, and as constrained or unconstrained. There are several methods for optimization problems, such as the Cyclic Coordinate method, the Hooke-Jeeves method, and the Rosenbrock method for n-dimensional unconstrained problems.
Here, we consider the problem of minimizing a function f of several variables without using derivatives. The methods described here proceed in the following manner: given a vector x, a suitable direction d is first determined, and then f is minimized from x in the direction d by one of the techniques we will describe.
Direct search methods are suitable tools for maximizing or minimizing functions that are non-smooth and non-differentiable. The comprehensive book on non-linear programming [7] describes three multidimensional search methods that do not use derivatives: the cyclic coordinate method, the Hooke-Jeeves method, and the Rosenbrock method. The cyclic coordinate method alters the value of one decision variable at a time, i.e., it uses the coordinate axes as the search directions. The Hooke-Jeeves method uses two search modes: an exploratory search in the directions of the coordinate axes (one decision variable altered at a time) and a pattern search in other directions (more than one decision variable altered simultaneously). The Rosenbrock method searches along m orthogonal directions (m is the number of decision variables being optimized), which coincide with the coordinate axes only at the first iteration. Rosenbrock search is a numerical optimization algorithm applicable to problems in which the objective function is inexpensive to compute and the derivative either does not exist or cannot be computed efficiently.
Another type of method, namely dynamic programming, has been used extensively in stand management optimization [8]. It differs from direct search methods in that, instead of seeking optimal values for continuous decision variables, it finds the optimal sequence of stand states defined by classified values of several growing stock characteristics (for instance, basal area, mean diameter, and stand age).
Direct search methods can be used in both even-aged and uneven-aged forestry [9]. Uneven-aged management can also be optimized using linear programming and a transition matrix model [10]. Of the direct search methods mentioned in [7], the Hooke-Jeeves algorithm has been used widely in forestry, at least in Finland [11,12].
In this paper, we first introduce these three methods, then discuss their advantages and disadvantages, and finally give generalized algorithms that improve them.

The Method of Cyclic Coordinate
In this method we search along the coordinate axes, moving a step λ along each axis, until the optimal point is reached [1, 4, 6].

Algorithm for Cyclic Coordinate Method
Initialization: Choose a scalar ε > 0 to be used in terminating the algorithm, and let d_1, ..., d_n be the coordinate directions. Choose a starting point x_1, let y_1 = x_1, let k = j = 1, and go to the main step.
Main step:
1. Let λ_j be an optimal solution of the problem to minimize f(y_j + λ d_j) over λ, and let y_{j+1} = y_j + λ_j d_j. If j < n, replace j by j + 1 and repeat step 1. Otherwise, if j = n, go to step 2.
2. Let x_{k+1} = y_{n+1}. If ||x_{k+1} − x_k|| < ε, then stop. Otherwise, let y_1 = x_{k+1}, let j = 1, replace k by k + 1, and repeat step 1.
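As an illustration of the steps above, the following is a minimal C# sketch of the Cyclic Coordinate method (the paper's implementations are in C#; see the section on programs at the end). The inner one-variable minimization is done with a simple golden-section line search over a fixed bracket; the bracket (-4, 4), the tolerance, and the iteration limit are illustrative assumptions, not part of the method itself.

using System;

// Minimal sketch of the Cyclic Coordinate method described above.
static class CyclicCoordinate
{
    public static double[] Minimize(Func<double[], double> f, double[] x1,
                                    double eps = 1e-6, int maxIter = 1000)
    {
        int n = x1.Length;
        double[] x = (double[])x1.Clone();
        for (int k = 0; k < maxIter; k++)
        {
            double[] xPrev = (double[])x.Clone();
            // Step 1: minimize along each coordinate direction d_j in turn.
            for (int j = 0; j < n; j++)
            {
                int jj = j;                       // capture the index for the lambda
                double lambda = GoldenSection(
                    t => { var y = (double[])x.Clone(); y[jj] += t; return f(y); },
                    -4.0, 4.0);
                x[jj] += lambda;                  // y_{j+1} = y_j + lambda_j d_j
            }
            // Step 2: stop when ||x_{k+1} - x_k|| < eps.
            double dist = 0;
            for (int i = 0; i < n; i++) dist += (x[i] - xPrev[i]) * (x[i] - xPrev[i]);
            if (Math.Sqrt(dist) < eps) break;
        }
        return x;
    }

    // Golden-section search for the one-variable subproblem on [a, b].
    public static double GoldenSection(Func<double, double> g, double a, double b, double tol = 1e-6)
    {
        const double alpha = 0.618;
        double lam = a + (1 - alpha) * (b - a), mu = a + alpha * (b - a);
        while (b - a > tol)
        {
            if (g(lam) > g(mu)) { a = lam; lam = mu; mu = a + alpha * (b - a); }
            else { b = mu; mu = lam; lam = a + (1 - alpha) * (b - a); }
        }
        return (a + b) / 2;
    }
}

As a hypothetical usage example (not the paper's test function), Minimize(v => Math.Pow(v[0] - 2, 2) + Math.Pow(v[1] - 1, 2), new[] { 0.0, 0.0 }) returns a point close to (2, 1).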

Advantages of Cyclic Coordinate Method
1. The algorithm is simple to implement. 2. The algorithm can be applied both to differentiable and to non-differentiable functions.

Disadvantages of Cyclic Coordinate Method
1. When this method is applied to a differentiable function, it converges to a point with zero gradient. When the function is not differentiable, however, the method may stall at a non-optimal point. 2. The method converts an n-variable problem into a sequence of one-variable problems; each one-variable problem is solved by a one-variable algorithm and its optimal point is substituted back into the n-variable function. This increases the overhead of the algorithm. 3. A one-variable problem is solved at each step, and attention must be paid to its domain: if the domain does not contain a minimum point, the step does not produce a good solution. The simplex method can be used to address the second problem (note, however, that the complexity of the simplex method is not better than that of this approach).

Problem 3
Consider the following problem, which we solve with the Cyclic Coordinate method.
We use the Golden Section method (see Appendix) to solve the one-variable subproblems. The method starts with an interval of uncertainty that is known to contain the minimum point, and at each step it shrinks this interval until it is small enough. Now imagine that this interval contains several extremum points, or contains no minimum point at all. We examine this special case in the following example.
In the first step, the one-variable function and its values are as shown in Figure 1.
In the second step, the one-variable function and its values are as shown in Figure 2. Now, if we run the algorithm with the initial interval (-4, 4) for the Golden Section method [2], we get the data in Table 1; the answer is found at iteration 4. But if we choose the initial interval (1, 6) for the Golden Section method, we get the data in Table 2.
As Table 2 shows, the iterates not only fail to approach the minimum point but reach a larger value at each step and move away from the optimal point. At the thirtieth iteration x is (33.51, 34.49), which is far from the values in the previous table. The algorithm does not halt at this point, and the iterations can be continued to iteration 31 and beyond.
Looking at Table 2, at each iteration the Golden Section method returns λ ≈ 1.08612, which is essentially a point at the very beginning of the search interval.
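The following hypothetical C# snippet (not the paper's objective function) reproduces this behaviour: when the interval of uncertainty does not contain the minimizer of the one-variable cut, the golden-section search converges toward the nearer end of the interval, so the outer iteration keeps adding a positive step and drifts away, as in Table 2. It reuses the GoldenSection helper from the Cyclic Coordinate sketch above.

using System;

// Hypothetical one-variable cut whose minimizer (t = -1) lies outside (1, 6).
Func<double, double> g = t => (t + 1) * (t + 1);
double step = CyclicCoordinate.GoldenSection(g, 1, 6);
Console.WriteLine(step);   // about 1, the lower end of (1, 6), so the outer step is still positive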

Modifications to the Algorithm of the Cyclic Coordinate Method (Solution to Problem 3)
To solve this problem, we would have to find the critical points for the Golden Section algorithm. Finding a critical point requires differentiating the function and examining its behaviour just before and after the points where the gradient is zero. We cannot do this, for the following reasons: 1. The function we are working with may not be differentiable. 2. At each step we would have to apply another algorithm to identify a critical region for the Golden Section method, which increases the running time of the main algorithm and may not be cost effective. 3. Differentiating the function is complex. 4. If we could differentiate the function and find the minimum point where the gradient is zero, we would not need to run the algorithm at all, because we would already have the minimum point.

The Method of Hooke-Jeeves
In this method we move along the coordinate axes, together with a pattern direction, until the optimal point is reached [5, 6, 7].

Algorithm for the Hooke-Jeeves Method
Initialization: Choose a scalar ε > 0 to be used in terminating the algorithm, and let d_1, ..., d_n be the coordinate directions. Choose a starting point x_1, let y_1 = x_1, let k = j = 1, and go to the main step.
Main step:
1. Let λ_j be an optimal solution of the problem to minimize f(y_j + λ d_j) over λ, and let y_{j+1} = y_j + λ_j d_j. If j < n, replace j by j + 1 and repeat step 1. Otherwise, if j = n, let x_{k+1} = y_{n+1}. If ||x_{k+1} − x_k|| < ε, then stop; otherwise, go to step 2.
2. Let d = x_{k+1} − x_k, and find an optimal solution λ̂ of the problem to minimize f(x_{k+1} + λ d) over λ. Let y_1 = x_{k+1} + λ̂ d, let j = 1, replace k by k + 1, and repeat step 1.

As mentioned in problem 1 of the Cyclic Coordinate method, a purely coordinate-wise search may stop at a "hole", that is, a non-optimal point at which the function is not differentiable. The acceleration step of step 2 is used to solve this problem: after each pass along the coordinate directions, we also search along the direction x_{k+1} − x_k. This speeds up progress toward the optimal solution; see Figure 3.
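The following minimal C# sketch illustrates the algorithm above, including the acceleration (pattern) step of step 2. The one-variable subproblems are delegated to a caller-supplied line search; the termination tolerance and iteration limit are illustrative assumptions rather than part of the method.

using System;

// Minimal sketch of the Hooke-Jeeves method with the acceleration step.
static class HookeJeeves
{
    public static double[] Minimize(Func<double[], double> f, double[] x1,
                                    Func<Func<double, double>, double> lineSearch,
                                    double eps = 1e-6, int maxIter = 1000)
    {
        int n = x1.Length;
        double[] x = (double[])x1.Clone();     // x_k
        double[] y = (double[])x.Clone();      // y_j
        for (int k = 0; k < maxIter; k++)
        {
            // Step 1: exploratory search along the coordinate directions.
            for (int j = 0; j < n; j++)
            {
                int jj = j;
                double lambda = lineSearch(
                    t => { var z = (double[])y.Clone(); z[jj] += t; return f(z); });
                y[jj] += lambda;
            }
            double[] xNext = (double[])y.Clone();          // x_{k+1} = y_{n+1}
            double dist = 0;
            for (int i = 0; i < n; i++) dist += (xNext[i] - x[i]) * (xNext[i] - x[i]);
            if (Math.Sqrt(dist) < eps) return xNext;       // then stop

            // Step 2: acceleration (pattern) search along d = x_{k+1} - x_k.
            double[] d = new double[n];
            for (int i = 0; i < n; i++) d[i] = xNext[i] - x[i];
            double lamHat = lineSearch(t =>
            {
                var z = new double[n];
                for (int i = 0; i < n; i++) z[i] = xNext[i] + t * d[i];
                return f(z);
            });
            for (int i = 0; i < n; i++) y[i] = xNext[i] + lamHat * d[i];  // y_1 for the next pass
            x = xNext;
        }
        return x;
    }
}

The lineSearch argument takes the one-variable function f(y + λ d), viewed as a function of λ, and returns the chosen λ; for example, g => CyclicCoordinate.GoldenSection(g, -4.0, 4.0) reuses the helper from the Cyclic Coordinate sketch.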

Advantages
By introducing the acceleration step, the method does not get trapped in a "hole" (a non-optimal point at which the function is not differentiable).

Disadvantages
1. As noted, in step 2 we also search along the direction x_{k+1} − x_k in order to speed up the algorithm, but we do not check the halting condition after moving in this direction. The algorithm may already satisfy the halting condition after this move, yet we still run the next iteration; this increases the running time of the algorithm. 2. In the Cyclic Coordinate and Hooke-Jeeves algorithms we always search along the coordinate directions. Would the algorithms be faster if we searched along other vectors? 3. The method converts an n-variable function into one-variable subproblems; each subproblem is solved by a one-variable algorithm and the resulting optimal point is substituted back into the n-variable function. This increases the running time of the algorithm. 4. We must pay attention to the domain of the one-variable subproblem when solving it: if the minimum is not in this domain, we do not obtain a proper solution.

Checking the Problems of the Hooke-Jeeves Algorithm
We examine the Hooke-Jeeves algorithm on the following example.
The Golden Section method is used for solving the one-variable subproblems. The initial interval of uncertainty for the inner Golden Section method [2] is (-4, 4).

Generalized Algorithm for the Hooke-Jeeves Method (Algorithm 3)
In the previous algorithms we move along the coordinate directions in order to reach an optimal point. Do we still reach an optimal solution if we move along other linearly independent vectors?
For the example considered on the space ℝ², we move in the directions of the orthogonal, linearly independent vectors (1, 2) and (-2, 1) in order to reach an optimal point. Running the algorithm along these two directions yields the point (1.82, 0.93). This point differs noticeably from the known optimal point (2, 1), and it is not better than the points obtained by the previous algorithms.
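A hypothetical sketch of this generalization is given below: the exploratory pass of the Hooke-Jeeves sketch is rewritten so that any set of linearly independent directions, such as (1, 2) and (-2, 1) above, can be searched instead of the coordinate axes. The method name, the direction array, and the line-search delegate are illustrative assumptions carried over from the earlier sketches.

using System;

// Sketch of an exploratory pass along arbitrary linearly independent directions.
static class GeneralizedExploratoryStep
{
    public static double[] Search(Func<double[], double> f, double[] y,
                                  double[][] dirs,
                                  Func<Func<double, double>, double> lineSearch)
    {
        double[] z = (double[])y.Clone();
        foreach (double[] d in dirs)
        {
            // Minimize f(z + lambda * d) over lambda with the supplied line search.
            double lambda = lineSearch(t =>
            {
                var w = (double[])z.Clone();
                for (int i = 0; i < w.Length; i++) w[i] += t * d[i];
                return f(w);
            });
            for (int i = 0; i < z.Length; i++) z[i] += lambda * d[i];   // move along d
        }
        return z;
    }
}

For the example above, dirs would be new[] { new[] { 1.0, 2.0 }, new[] { -2.0, 1.0 } }.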

The Method of Rosenbrock
In this method, in each iteration we search along n linearly independent, orthogonal vectors. When a new point is reached at the end of an iteration, a new set of orthogonal directions is constructed [3, 6].
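A common textbook construction for the new directions is the Gram-Schmidt procedure: the step lengths λ_j moved along the old directions d_j are first combined into vectors a_j (a_j = d_j if λ_j = 0, otherwise a_j = λ_j d_j + ... + λ_n d_n), which are then orthogonalized and normalized. The C# sketch below shows this construction; it is an illustration of the usual procedure, not necessarily the exact variant used in the paper's implementation, and it assumes the non-degenerate case in which every intermediate vector is nonzero.

using System;

// Sketch of the Rosenbrock direction update (Gram-Schmidt re-orthogonalization).
static class RosenbrockDirections
{
    // d: the n old orthonormal directions; lambda: the step lengths moved along them.
    public static double[][] Update(double[][] d, double[] lambda)
    {
        int n = d.Length;
        var a = new double[n][];
        for (int j = 0; j < n; j++)
        {
            if (lambda[j] == 0.0) { a[j] = (double[])d[j].Clone(); continue; }
            a[j] = new double[n];
            for (int i = j; i < n; i++)                  // a_j = sum over i >= j of lambda_i d_i
                for (int c = 0; c < n; c++) a[j][c] += lambda[i] * d[i][c];
        }
        var dNew = new double[n][];
        for (int j = 0; j < n; j++)
        {
            double[] b = (double[])a[j].Clone();         // remove components along earlier directions
            for (int i = 0; i < j; i++)
            {
                double dot = 0;
                for (int c = 0; c < n; c++) dot += a[j][c] * dNew[i][c];
                for (int c = 0; c < n; c++) b[c] -= dot * dNew[i][c];
            }
            double norm = 0;
            for (int c = 0; c < n; c++) norm += b[c] * b[c];
            norm = Math.Sqrt(norm);                      // assumed nonzero (non-degenerate case)
            dNew[j] = new double[n];
            for (int c = 0; c < n; c++) dNew[j][c] = b[c] / norm;
        }
        return dNew;
    }
}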

Running the Rosenbrock Algorithm
We examine the Rosenbrock algorithm on the following example. We use the Golden Section method for solving the one-variable subproblems; the initial interval of uncertainty for the inner Golden Section method [2] is (-4, 4).
Data for this example are given in Table 6. Note that the direction vectors along which the Rosenbrock method moves toward the optimal point must be linearly independent: if the optimization is carried out in n-dimensional space, we want to be able to move toward the optimal point along every dimension, and if a pair of direction vectors is linearly dependent, we may be unable to move toward the optimal point in some dimensions and therefore cannot truly reach it.

A Numerical Example for the Three Methods and a Comparison
In this section we examine another problem with the three methods, Cyclic Coordinate, Hooke-Jeeves, and Rosenbrock, and compare them. Problem: we use the Golden Section method for solving the one-variable subproblems.
The initial interval of uncertainty for the inner Golden Section method is (-0.5, 0.5).
A diagram of the problem is shown in Figure 4. When we try to solve this problem with the Rosenbrock method, the iterates do not move toward an optimal solution; at each stage they move further away from the answer.
For most problems, the three methods, Cyclic Coordinate, Hooke-Jeeves, and Rosenbrock, have the same complexity and give the same solution. Note that these algorithms use a one-variable method, such as the Golden Section method, for solving the one-variable subproblems. A one-variable problem requires an interval of uncertainty that contains the minimum point, and finding such an interval is often hard, because it requires knowledge of the critical region of the problem; for non-differentiable problems this is impossible.
Data obtained with the Cyclic Coordinate method are given in Table 7:

Conclusions
There are many algorithms for solving optimization problems, each with its own advantages and disadvantages. The generalized algorithms and corrections presented in this paper can be used to improve other algorithms, to obtain more accurate answers, and to handle more complex functions.

Golden Section Method
The following is a summary of the Golden Section method for minimizing a strictly quasiconvex function f over an interval [a_1, b_1].

Initialization: Choose an allowable final length of uncertainty l > 0. Let [a_1, b_1] be the initial interval of uncertainty, and let λ_1 = a_1 + (1 − α)(b_1 − a_1) and μ_1 = a_1 + α(b_1 − a_1), where α = 0.618. Evaluate f(λ_1) and f(μ_1), let k = 1, and go to the main step.
Main step:
1. If b_k − a_k < l, stop; the optimal point lies in [a_k, b_k]. Otherwise, if f(λ_k) > f(μ_k), go to step 2, and if f(λ_k) ≤ f(μ_k), go to step 3.
2. Let a_{k+1} = λ_k and b_{k+1} = b_k. Furthermore, let λ_{k+1} = μ_k and μ_{k+1} = a_{k+1} + α(b_{k+1} − a_{k+1}). Evaluate f(μ_{k+1}) and go to step 4.
3. Let a_{k+1} = a_k and b_{k+1} = μ_k. Furthermore, let μ_{k+1} = λ_k and λ_{k+1} = a_{k+1} + (1 − α)(b_{k+1} − a_{k+1}). Evaluate f(λ_{k+1}) and go to step 4.
4. Replace k by k + 1 and go to step 1.
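The following is a minimal C# sketch of this summary, assuming a strictly quasiconvex function g on the initial interval [a, b] and an allowable final length of uncertainty l; it returns the midpoint of the final interval.

using System;

// Minimal sketch of the Golden Section method summarized above.
static class GoldenSectionSearch
{
    public static double Minimize(Func<double, double> g, double a, double b, double l = 1e-6)
    {
        const double alpha = 0.618;
        double lam = a + (1 - alpha) * (b - a);
        double mu = a + alpha * (b - a);
        double gLam = g(lam), gMu = g(mu);
        while (b - a >= l)                               // step 1: check the interval length
        {
            if (gLam > gMu)                              // step 2: new interval [lam_k, b_k]
            {
                a = lam; lam = mu; gLam = gMu;
                mu = a + alpha * (b - a); gMu = g(mu);
            }
            else                                         // step 3: new interval [a_k, mu_k]
            {
                b = mu; mu = lam; gMu = gLam;
                lam = a + (1 - alpha) * (b - a); gLam = g(lam);
            }
        }                                                // step 4: repeat with k := k + 1
        return (a + b) / 2;
    }
}

For example, Minimize(t => (t - 2) * (t - 2), -4, 4) returns a value close to 2.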

Programs for the Four Methods
We implemented the Golden Section, Cyclic Coordinate, Hooke-Jeeves, and Rosenbrock methods in the C# programming language.