An Extension Algorithm for Solving Multidimensional Unconstrained Optimization Problems

  • Mohammed Shakir Mahdi ZABIBA Kufa University
  • Nofl Sh Al-Shimari
  • Ahmed Abdulhussein Jabbar

Abstract

This study presents an extension of the Golden Section Search (GSS) algorithm to solve multidimensional unconstrained optimization problems without relying on derivative information. The proposed method adapts the classical GSS—a univariate optimization technique—to higher dimensions by integrating a trust-region framework, enabling efficient exploration of the search space. The algorithm iteratively evaluates trial points along coordinate directions, dynamically adjusts the trust-region radius based on improvement ratios, and applies 1D GSS along promising directions to locate local optima. Numerical experiments demonstrate its effectiveness in minimizing functions with one, two, three, and n variables, achieving convergence with fewer iterations compared to traditional methods. For instance, in a 3D test case, the algorithm converged to a solution within 20 iterations, while a 2D problem required 16 iterations. The method eliminates larger portions of the feasible region per iteration, enhancing computational efficiency, particularly for problems with flat or shallow optima. By transforming the feasible region into a hypercube and leveraging linear transformations, the algorithm generalizes seamlessly to n-dimensional spaces. This gradient-free approach is particularly advantageous for non-differentiable or complex functions, offering robustness and speed. The results highlight its potential for applications in engineering, data science, and other fields requiring rapid optimization of high-dimensional, derivative-free problems. All implementations were validated using Python, underscoring the algorithm's practical accessibility.
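To make the idea concrete, the following is a minimal Python sketch of the approach the abstract describes: classical 1D golden-section search applied along each coordinate direction, combined with a simple trust-region-style radius that grows on improvement and shrinks otherwise. The function names (`gss_1d`, `gss_nd`) and the specific radius-update rule are illustrative assumptions, not the authors' published implementation.

```python
import math

PHI = (math.sqrt(5) - 1) / 2  # inverse golden ratio, ~0.618


def gss_1d(f, a, b, tol=1e-8):
    """Classical golden-section search for a unimodal f on [a, b]."""
    c = b - PHI * (b - a)
    d = a + PHI * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - PHI * (b - a)
        else:
            a, c = c, d
            d = a + PHI * (b - a)
    return (a + b) / 2


def gss_nd(f, x0, radius=1.0, tol=1e-6, max_iter=200):
    """Coordinate-wise GSS with a trust-region-style radius update.

    Illustrative sketch: each coordinate is minimized by 1D GSS over
    [x_i - r, x_i + r]; r grows (capped at the initial radius) after a
    successful sweep and halves after an unsuccessful one.
    """
    x = list(x0)
    r = radius
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            def line(t, i=i):  # restrict f to the i-th coordinate
                y = x[:]
                y[i] = t
                return f(y)
            t_best = gss_1d(line, x[i] - r, x[i] + r)
            if line(t_best) < f(x) - 1e-15:  # accept only real improvement
                x[i] = t_best
                improved = True
        r = min(2 * r, radius) if improved else r / 2
        if r < tol:  # trust region has collapsed: converged
            break
    return x
```

For example, minimizing the sphere function `f(v) = sum(t*t for t in v)` from `[3.0, -2.0]` drives both coordinates toward zero without any derivative evaluations, matching the gradient-free behavior the abstract emphasizes.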

Published
2026-04-13
Section
Conf. Issue: Advances in Algebra, Analysis, Optimization, and Modeling