optimization-0.1.9: Numerical optimization

Safe Haskell: None
Language: Haskell2010

Optimization.LineSearch.ConjugateGradient

Conjugate gradient methods

conjGrad

    conjGrad
      :: (Num a, RealFloat a, Additive f, Metric f)
      => LineSearch f a   -- ^ line search method
      -> Beta f a         -- ^ beta expression
      -> (f a -> f a)     -- ^ gradient of function
      -> f a              -- ^ starting point
      -> [f a]            -- ^ iterates

Conjugate gradient method with the given beta expression and line search method.

The conjugate gradient method avoids the trouble encountered by the steepest descent method on poorly conditioned problems (e.g., those whose Hessian has a wide range of eigenvalues). It does this by choosing directions that satisfy an A-orthogonality (conjugacy) condition, ensuring that successive steps are orthogonal in the "unstretched" search space.
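Below is a brief usage sketch, not taken from the package's own documentation: it assumes constantSearch from Optimization.LineSearch (a fixed-step line search) and V2 from the linear package, and the step size and iteration count are illustrative only.

    import Linear (V2(..))
    import Optimization.LineSearch (constantSearch)
    import Optimization.LineSearch.ConjugateGradient

    -- Gradient of f(x, y) = x^2 + 4 y^2, a poorly scaled quadratic.
    grad :: V2 Double -> V2 Double
    grad (V2 x y) = V2 (2 * x) (8 * y)

    main :: IO ()
    main = do
      -- conjGrad yields a lazy list of iterates; inspect a late one,
      -- which should lie near the minimizer at the origin.
      let iterates = conjGrad (constantSearch 0.1) fletcherReeves grad (V2 10 7)
      print (iterates !! 50)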

Step size methods

Beta expressions

    type Beta f a = f a -> f a -> f a -> a

A beta expression beta df0 df1 p is an expression for the conjugate direction contribution, given the gradient df0 and direction p of iteration k and the gradient df1 of iteration k+1.
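To show where a Beta enters the algorithm, here is a hedged sketch of the standard conjugate gradient direction update; nextDirection is a hypothetical helper for illustration, not the library's internals.

    import Linear (Additive, negated, (^+^), (*^))
    import Optimization.LineSearch.ConjugateGradient (Beta)

    -- Hypothetical helper: the next search direction is the steepest
    -- descent direction plus a beta-weighted contribution of the
    -- previous direction, p' = -df1 + beta df0 df1 p *^ p.
    nextDirection :: (Num a, Additive f) => Beta f a -> f a -> f a -> f a -> f a
    nextDirection beta df0 df1 p = negated df1 ^+^ (beta df0 df1 p *^ p)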

    fletcherReeves :: (Num a, RealFloat a, Metric f) => Beta f a

Fletcher-Reeves expression for beta
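The textbook Fletcher-Reeves rule is beta = <df1, df1> / <df0, df0>. A hedged sketch of that formula in terms of linear's quadrance follows; the library's own definition may differ in detail.

    import Linear (Metric, quadrance)
    import Optimization.LineSearch.ConjugateGradient (Beta)

    -- Fletcher-Reeves: ratio of the squared norms of the new and old
    -- gradients; the previous direction is unused.
    fletcherReevesBeta :: (RealFloat a, Metric f) => Beta f a
    fletcherReevesBeta df0 df1 _p = quadrance df1 / quadrance df0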

    polakRibiere :: (Num a, RealFloat a, Metric f) => Beta f a

Polak-Ribière expression for beta
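The textbook Polak-Ribière rule is beta = <df1, df1 - df0> / <df0, df0> (implementations often clamp it at zero to restore descent). A hedged sketch, again possibly differing from the library's exact definition:

    import Linear (Metric, dot, quadrance, (^-^))
    import Optimization.LineSearch.ConjugateGradient (Beta)

    -- Polak-Ribière: dots the new gradient with the gradient change,
    -- normalised by the old squared gradient norm.
    polakRibiereBeta :: (RealFloat a, Metric f) => Beta f a
    polakRibiereBeta df0 df1 _p = (df1 `dot` (df1 ^-^ df0)) / quadrance df0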

    hestenesStiefel :: (Num a, RealFloat a, Metric f) => Beta f a

Hestenes-Stiefel expression for beta
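The textbook Hestenes-Stiefel rule is beta = <df1, df1 - df0> / <p, df1 - df0>; sign conventions vary with how the search direction is oriented, so this hedged sketch may differ from the library's definition.

    import Linear (Metric, dot, (^-^))
    import Optimization.LineSearch.ConjugateGradient (Beta)

    -- Hestenes-Stiefel: numerator and denominator both use the
    -- gradient change dg = df1 - df0.
    hestenesStiefelBeta :: (RealFloat a, Metric f) => Beta f a
    hestenesStiefelBeta df0 df1 p = (df1 `dot` dg) / (p `dot` dg)
      where dg = df1 ^-^ df0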