Open Access

A comparison of the convergence rates of Hestenes’ conjugate Gram-Schmidt method without derivatives with other numerical optimization methods

   | June 3, 2024


This article describes the conjugate Gram-Schmidt method, which estimates gradients and Hessians from function evaluations and difference quotients, applies the Gram-Schmidt conjugate direction algorithm to minimize functions, and compares it with other techniques for solving ∇f = 0. The convergence rates of comparable minimization algorithms are characterized using the quotient and root convergence factors of Ortega and Rheinboldt, in order to determine which minimization technique attains results closest to those of Newton's method. A survey of the existing literature comparing Hestenes' derivative-free Gram-Schmidt conjugate direction approach with other minimization methods is conducted, and further analytical and computational details are provided.
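As a minimal illustration of the difference-quotient derivative estimates the abstract refers to, the sketch below approximates a gradient and Hessian from function evaluations alone using central differences (the quadratic test function, step sizes, and helper names are illustrative assumptions, not taken from the article):

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Estimate the gradient of f at x with central difference quotients,
    using only function evaluations (no analytic derivatives)."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def fd_hessian(f, x, h=1e-4):
    """Estimate the Hessian of f at x with second-order central
    difference quotients of function values."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)
    return H

# Illustrative quadratic: f(x) = 0.5 x^T A x - b^T x,
# so the exact gradient is A x - b and the exact Hessian is A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x

x0 = np.array([1.0, -1.0])
g = fd_gradient(f, x0)   # should be close to A @ x0 - b = [1, -2]
H = fd_hessian(f, x0)    # should be close to A
```

For a quadratic the central differences are exact up to rounding error, which is why such estimates can drive a conjugate direction method without analytic derivatives.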

eISSN:
2956-7068
Language:
English
Publication timeframe:
2 issues per year
Journal subjects:
Computer Science, other; Engineering, Introductions and Overviews; Mathematics, General; Physics