A New Spectral Conjugate Gradient Method for Unconstrained Nonlinear Optimization.

The spectral conjugate gradient (SCG) method is an effective method for large-scale unconstrained nonlinear optimization. In this work, a new spectral conjugate gradient method with a strong Wolfe-Powell (SWP) line search is proposed. The new formula is obtained by comparing the proposed algorithm with previously published conjugate gradient algorithms. Under the usual assumptions, the sufficient descent property and the global convergence of the proposed method are proved, and numerical experiments confirm its effectiveness.


Introduction:
Conjugate gradient (CG) and spectral conjugate gradient (SCG) methods are among the most effective classes of methods for solving large-scale nonlinear unconstrained optimization problems, because they combine fast convergence, low storage requirements, and simple iterations [1] [2] [3]. Consider the nonlinear unconstrained optimization problem [4]:

min f(x), x ∈ R^n,    (1)
where f : R^n → R is a smooth function whose gradient is denoted g(x) = ∇f(x). Starting from an initial point x_0 ∈ R^n, the method generates iterates by

x_{k+1} = x_k + α_k d_k,    (2)

where the search direction is given by

d_{k+1} = -g_{k+1} + β_k d_k,  d_0 = -g_0,    (3)

β_k ∈ R is the conjugacy parameter, and α_k > 0 is the step size generated by an inexact line search (ILS). In this work, we use the strong Wolfe-Powell (SWP) conditions, defined by:

f(x_k + α_k d_k) ≤ f(x_k) + δ α_k g_k^T d_k,
|g(x_k + α_k d_k)^T d_k| ≤ σ |g_k^T d_k|,  0 < δ < σ < 1.

The proposed method is based on averaging nonlinear parameters.
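The two SWP conditions above can be checked numerically for a candidate step size. A minimal sketch in Python (the test function, point, and parameter values below are illustrative, not taken from the paper):

```python
import numpy as np

def satisfies_swp(f, grad, x, d, alpha, delta=1e-4, sigma=0.1):
    """Check the strong Wolfe-Powell (SWP) conditions for step size alpha:
       f(x + alpha*d) <= f(x) + delta*alpha*g^T d     (sufficient decrease)
       |g(x + alpha*d)^T d| <= sigma*|g^T d|          (strong curvature)
    """
    g0_d = grad(x) @ d
    sufficient_decrease = f(x + alpha * d) <= f(x) + delta * alpha * g0_d
    strong_curvature = abs(grad(x + alpha * d) @ d) <= sigma * abs(g0_d)
    return bool(sufficient_decrease and strong_curvature)

# Example on f(x) = x^T x with the steepest-descent direction d = -g:
f = lambda x: x @ x
grad = lambda x: 2.0 * x
x = np.array([1.0, -2.0])
d = -grad(x)
print(satisfies_swp(f, grad, x, d, alpha=0.5))  # -> True (exact minimizer along d)
```

A too-small step such as alpha=0.05 fails the curvature condition here, which is exactly what the second SWP inequality is designed to rule out.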

New Algorithm and the Descent Property:
The SCG method is obtained by combining the CG search direction with a scalar spectral parameter, relying on the sufficient descent requirement proposed by Liu Jinkui, Du Xianglin, and Wang Kairong [5] and [6].
We start from the parameters proposed by Basim A. Hassan and Hameed M. Sadeq [10], substitute them into the general form of the search direction, and after substitution obtain the new spectral parameter.

Algorithm of SCG:
Step 1: Choose an initial point x_0 ∈ R^n and a tolerance ε > 0; set d_0 = -g_0 and k = 0.
Step 2: Test convergence: if ∥g_k∥ ≤ ε, stop; otherwise continue.
Step 3: Compute the step size α_k by the SWP line search.
Step 4: Generate the new point x_{k+1} = x_k + α_k d_k and calculate the gradient g_{k+1} = g(x_{k+1}). If ∥g_{k+1}∥ ≤ ε, stop; otherwise calculate the spectral parameter θ_{k+1} and the conjugacy parameter β_k, compute the direction d_{k+1} = -θ_{k+1} g_{k+1} + β_k d_k, set k := k + 1, and return to Step 2.
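The steps above can be sketched as a loop. Since the paper's θ New and β formulas are not reproduced in this text, the sketch below substitutes a Barzilai-Borwein-style spectral parameter and the Fletcher-Reeves β, and a simple Armijo backtracking stands in for the full SWP line search; all three are illustrative stand-ins, not the proposed method:

```python
import numpy as np

def backtracking(f, grad, x, d, c=1e-4, rho=0.5):
    """Simple Armijo backtracking; a stand-in for the SWP line search."""
    alpha = 1.0
    while alpha > 1e-12 and f(x + alpha * d) > f(x) + c * alpha * (grad(x) @ d):
        alpha *= rho
    return alpha

def scg(f, grad, x0, eps=1e-6, max_iter=500):
    """Generic spectral conjugate gradient loop (Steps 1-4 above)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # Step 1: d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:          # Step 2: convergence test
            break
        alpha = backtracking(f, grad, x, d)   # Step 3: step size
        x_new = x + alpha * d                 # Step 4: new point
        g_new = grad(x_new)
        if np.linalg.norm(g_new) <= eps:
            x, g = x_new, g_new
            break
        s, y = x_new - x, g_new - g
        theta = (s @ s) / (s @ y)             # illustrative spectral parameter
        beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves beta (stand-in)
        d = -theta * g_new + beta * d         # spectral CG direction
        x, g = x_new, g_new
    return x

# Quadratic test problem (illustrative):
f = lambda x: x @ x
grad = lambda x: 2.0 * x
print(scg(f, grad, [1.0, 2.0]))  # converges to the minimizer [0, 0]
```

On this quadratic the very first backtracked step lands on the exact minimizer, so the loop terminates at Step 4's gradient test.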

The Proposed SCG:
After deriving the spectral conjugate gradient coefficient θ_k^New, we verify the sufficient descent property under the strong Wolfe-Powell line search [11] [10].

Assumption (A):
1. f(x) is bounded below on the level set Ψ = {x ∈ R^n : f(x) ≤ f(x_0)}.
2. f(x) is continuously differentiable in a neighborhood N of Ψ, and its gradient is Lipschitz continuous, i.e., there is a constant L > 0 such that ∥g(x) − g(y)∥ ≤ L∥x − y∥ for all x, y ∈ N.

Theorem 1:
Suppose that Assumption (A) holds and that the sequences {g_k} and {d_k} are generated by the SCG algorithm with step size α_k obtained by the SWP line search. Then the proposed method produces sufficient descent directions [13].
Proof: We use mathematical induction. For k = 0, d_0 = -g_0 gives g_0^T d_0 = -∥g_0∥^2, so the relation holds. Now suppose the relation holds for some k ≥ 0. Multiplying both sides of equation (9) by g_{k+1}^T, we get

g_{k+1}^T d_{k+1} = -θ_{k+1} ∥g_{k+1}∥^2 + β_k g_{k+1}^T d_k.

Substituting the restart condition of Powell [14] into this expression, it follows that the proposed algorithm satisfies the sufficient descent condition under the SWP conditions. Using the first part of Assumption (A) on the objective function f, and from equation (30), we obtain the stated bound.
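The sufficient descent condition proved above, g^T d ≤ -c ∥g∥^2 for some c > 0, is easy to test numerically for any candidate direction. A small sketch (the direction parameters θ and β below are hypothetical values, not the paper's θ New):

```python
import numpy as np

def sufficient_descent(g, d, c=0.1):
    """True if d satisfies g^T d <= -c * ||g||^2 (sufficient descent)."""
    return bool(g @ d <= -c * (g @ g))

# A spectral direction d = -theta*g + beta*d_prev with illustrative parameters:
g = np.array([1.0, -3.0])
d_prev = np.array([-0.5, 2.0])
theta, beta = 1.2, 0.05          # hypothetical values for illustration
d = -theta * g + beta * d_prev
print(sufficient_descent(g, d))  # -> True
```

When β is large relative to θ the condition can fail, which is why the proof has to bound the β_k g_{k+1}^T d_k term.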

The Global Convergence Property:
In this section, we prove another important property, namely global convergence. In the following lemma, we recall the well-known Zoutendijk condition [15], which plays an important role in the global convergence analysis of the SCG method [16], [17].

Lemma [18]: Let Assumption (A) hold. Consider any iteration of the form (2) and (3), where α_k is obtained by the SWP line search. Then

Σ_{k≥0} (g_k^T d_k)^2 / ∥d_k∥^2 < ∞.

To apply the lemma, we must show that ∥d_{k+1}∥ is bounded from above. Taking ∥ · ∥ on both sides of equation (9) and substituting (26) and (28) into (24), we conclude that the new proposed algorithm achieves global convergence.
(Kirkuk Univ. J. Sci. Stud., Vol. 18, Iss. 2, p. 24-31, 2023.)
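The Zoutendijk condition can be illustrated numerically: on a quadratic with exact steepest-descent steps, the terms (g_k^T d_k)^2 / ∥d_k∥^2 decay geometrically, so their partial sums stay bounded. A sketch under these illustrative assumptions (this is not the paper's method, just a demonstration of the summability):

```python
import numpy as np

# Illustrate the Zoutendijk quantity (g_k^T d_k)^2 / ||d_k||^2
# on f(x) = 0.5 x^T A x with steepest-descent directions and exact steps.
A = np.diag([1.0, 10.0])
x = np.array([4.0, 1.0])
terms = []
for k in range(80):
    g = A @ x
    d = -g
    terms.append((g @ d) ** 2 / (d @ d))  # equals ||g_k||^2 for d = -g
    alpha = (g @ g) / (g @ A @ g)         # exact minimizing step along d
    x = x + alpha * d
print(sum(terms))  # the partial sums stay bounded, so the series converges
```

The geometric decay of the terms is what makes the series finite; the lemma asserts this summability for any method of the form (2)-(3) under the SWP line search.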

Results and Discussion:
The algorithm was tested in practice on standard unconstrained optimization test functions with nonlinear problems of various dimensions. Table 1 and Figures 1, 2, and 3 contain the numerical results, obtained with ρ = 0.9. The comparison is between the proposed algorithm and the conjugate gradient algorithms with the First original θ [5], the Second original θ [7], and the Third original θ [10]. The stopping criterion for all algorithms was ∥g_{k+1}∥ ≤ 10^{-6}. We report the number of iterations (NOI), the number of function evaluations (NOF), and the CPU time (CPU).
All codes were written in double-precision FORTRAN 77 and compiled with Visual Fortran 6.6 (default compiler settings). Table 1 lists the names of the test functions used and the numerical results for the [5], [7], and [10] algorithms and the SCG algorithm.

Conclusion:
In this work, we proved the sufficient descent property and the global convergence property of the new spectral conjugate gradient method with a strong Wolfe-Powell line search. The results show that the SCG algorithm is superior to the conjugate gradient methods with the First original θ [5], Second original θ [7], and Third original θ [10] in terms of the number of iterations and the number of function evaluations. We also compute the percentage improvement of the proposed algorithm over the classical algorithms used in the comparison.

Theorem 2:
Consider that Assumption (A) is satisfied, that the sequences {x_k} and {d_k} are generated by the SCG algorithm, that α_k is obtained by the SWP line search, and that d_k is a descent direction. Then

lim inf_{k→∞} ∥g_{k+1}∥ = 0.

Proof: Since the algorithm fulfills the sufficient descent condition and g_{k+1} ≠ 0, the result follows from the Zoutendijk condition [19] [20].

In the improvement-percentage tables, (Z) denotes the NOI for θ New, (P) denotes the NOI for one of the classical methods, and (D) denotes the resulting percentage; the percentages for the remaining algorithms, and for NOF and CPU, are obtained in the same way. Table 2 shows the percentage improvement of the θ New algorithm over the classical algorithms with respect to NOI, Table 3 shows the NOF improvement percentages, and Table 4 shows the CPU improvement percentages.
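The paper does not print the formula used to compute (D). A common convention, shown here as an assumption, is D = 100 (P − Z) / P, i.e., the relative reduction in count achieved by the proposed method:

```python
def improvement_percent(classic, proposed):
    """Percentage improvement of the proposed algorithm over a classic one,
    assuming the usual convention D = 100*(P - Z)/P, where P is the classic
    count (NOI, NOF, or CPU) and Z is the proposed algorithm's count."""
    return 100.0 * (classic - proposed) / classic

# Hypothetical NOI totals, for illustration only (not from Table 1):
print(improvement_percent(classic=1200, proposed=900))  # -> 25.0
```

A positive D means the proposed algorithm needed fewer iterations, function evaluations, or seconds than the classical one.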

Table 1 .
The comparison is based on the total number of iterations, denoted (NOI), the total number of function evaluations, denoted (NOF), and the total computation time, denoted (CPU).

Table 2 .
NOI improvement percentage for the algorithm θ New .