Open Access

A comparison of the convergence rates of Hestenes’ conjugate Gram-Schmidt method without derivatives with other numerical optimization methods

ABOUT THIS ARTICLE


This article describes the conjugate Gram-Schmidt method, which estimates gradients and Hessians from function evaluations and difference quotients, applies the Gram-Schmidt conjugate direction algorithm to minimize functions, and compares it with other techniques for solving ∇f = 0. The convergence rates of the Gram-Schmidt approach and of comparable minimization algorithms are characterized using the quotient and root convergence factors of Ortega and Rheinboldt, in order to determine which minimization technique comes closest to the performance of Newton's method. A survey of the existing literature comparing Hestenes' derivative-free Gram-Schmidt conjugate direction approach with other minimization methods is conducted, and further analytical and computational details are provided.
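To illustrate the ideas summarized above, the following is a minimal sketch (not the article's own implementation) of a derivative-free conjugate-direction minimizer: the gradient and the Hessian bilinear form are estimated purely from difference quotients of function values, and the coordinate directions are conjugated by a Gram-Schmidt step with respect to the estimated Hessian. All function names, step sizes, and the test problem are illustrative assumptions; the routine is exact for quadratics up to rounding error.

```python
import numpy as np

def grad_fd(f, x, h=1e-5):
    """Central-difference gradient estimate (no analytic derivatives)."""
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def hess_quad(f, x, p, q, h=1e-4):
    """Difference-quotient estimate of the bilinear form p^T H q,
    using a mixed second difference of function values."""
    return (f(x + h * p + h * q) - f(x + h * p)
            - f(x + h * q) + f(x)) / h**2

def cgs_minimize(f, x0):
    """Sketch of a conjugate Gram-Schmidt minimization without
    derivatives: conjugate the coordinate directions against the
    difference-quotient Hessian, then take an exact line step
    along each direction (exact for quadratics up to rounding)."""
    n = x0.size
    x = x0.astype(float)
    dirs = []
    for i in range(n):
        d = np.eye(n)[i].copy()
        # Gram-Schmidt conjugation w.r.t. the estimated Hessian
        for p in dirs:
            d -= (hess_quad(f, x, p, np.eye(n)[i])
                  / hess_quad(f, x, p, p)) * p
        dirs.append(d)
        g = grad_fd(f, x)
        curv = hess_quad(f, x, d, d)      # estimated d^T H d
        x = x + (-(g @ d) / curv) * d     # exact step along d
    return x
```

For a quadratic f(x) = ½xᵀAx − bᵀx the loop reaches the minimizer A⁻¹b after n steps, since the constructed directions are A-conjugate and each line search is exact; this is the sense in which the method's convergence is compared against Newton's method.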

eISSN:
2956-7068
Language:
English
Publication frequency:
Twice a year
Journal subjects:
Computer Sciences, other, Engineering, Introductions and Overviews, Mathematics, General Mathematics, Physics