Global Convergence of a Curvilinear Search for Non-Convex Optimization

Michael Bartholomew-Biggs, Salah Beddiaf, Bruce Christianson

Research output: Contribution to journal › Article › peer-review


Abstract

For a non-convex function f : R^n → R with gradient g and Hessian H, define a step vector p(μ,x) as a function of scalar parameter μ and position vector x by the equation (H(x) + μI)p(μ, x) = −g(x).
Under mild conditions on f, we construct criteria for selecting μ so as to ensure that the iteration x := x + p(μ, x) descends to a second-order stationary point of f and avoids saddle points.
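The step defined in the abstract can be sketched numerically: solve (H(x) + μI)p = −g(x), with μ chosen large enough that H(x) + μI is positive definite so that p is a descent direction even where H is indefinite. The toy function below and the particular rule for choosing μ are illustrative assumptions, not the selection criteria developed in the paper.

```python
import numpy as np

# Toy non-convex function (an assumption for illustration):
# f(x0, x1) = x0^4 - x0^2 + x1^2, which has a saddle point at the origin.
def f(x):
    return x[0]**4 - x[0]**2 + x[1]**2

def grad(x):
    # Gradient g of f
    return np.array([4*x[0]**3 - 2*x[0], 2*x[1]])

def hess(x):
    # Hessian H of f
    return np.array([[12*x[0]**2 - 2, 0.0],
                     [0.0,            2.0]])

def step(x, mu):
    """Solve (H(x) + mu*I) p = -g(x) for the step vector p."""
    H = hess(x)
    return np.linalg.solve(H + mu * np.eye(H.shape[0]), -grad(x))

# Near the saddle point, H has a negative eigenvalue, so shift mu past
# the most negative eigenvalue (a simple stand-in for the paper's criteria):
x = np.array([0.1, 0.5])
lam_min = np.linalg.eigvalsh(hess(x)).min()
mu = max(0.0, -lam_min) + 1.0
x_new = x + step(x, mu)
```

With this shift, H(x) + μI is positive definite and the step reduces f, illustrating why controlling μ lets a Newton-like iteration make progress on non-convex problems.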
Original language: English
Journal: Numerical Algorithms
Publication status: Published - 26 Aug 2022

Keywords

  • Nonlinear optimization
  • Newton-like methods
  • Non-convex functions

