optimization-0.1.3: Numerical optimization

Safe Haskell: Safe-Inferred

Optimization.LineSearch.ConjugateGradient


Conjugate gradient methods

conjGrad

Arguments

  :: (Num a, RealFloat a, Additive f, Metric f)
  => LineSearch f a   -- line search method
  -> Beta f a         -- beta expression
  -> (f a -> f a)     -- gradient of function
  -> f a              -- starting point
  -> [f a]            -- iterates

Conjugate gradient method with the given beta expression and line search method.

The conjugate gradient method avoids the trouble encountered by the steepest descent method on poorly conditioned problems (e.g. those whose Hessian has a wide range of eigenvalues). It does this by choosing search directions that satisfy a condition of A-orthogonality (conjugacy with respect to the Hessian A), ensuring that steps taken in the "unstretched" search space are mutually orthogonal; on a quadratic problem, progress made along one direction is therefore not undone by later steps.
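
As a usage sketch, the following minimises the quadratic f(x,y) = 4x^2 + y^2, whose gradient is written by hand. It assumes the V2 type from the linear package and a constantSearch step-size helper from Optimization.LineSearch; both names are assumptions about the surrounding API, so substitute whatever line search your version of the package exports.

  import Linear (V2(..))
  import Optimization.LineSearch (constantSearch)  -- assumed export
  import Optimization.LineSearch.ConjugateGradient

  -- Gradient of f(x,y) = 4*x^2 + y^2
  df :: V2 Double -> V2 Double
  df (V2 x y) = V2 (8 * x) (2 * y)

  main :: IO ()
  main = do
    -- a fixed step of 0.1 keeps the example small; a backtracking or
    -- Wolfe line search would normally be preferable
    let xs = conjGrad (constantSearch 0.1) fletcherReeves df (V2 1 1)
    mapM_ print (take 10 xs)

Each successive iterate should move toward the minimum at (0, 0).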

Step size methods

Beta expressions

type Beta f a = f a -> f a -> f a -> a

A beta expression beta df0 df1 p computes the conjugate-direction contribution: given the gradient df0 and search direction p of iteration k, and the gradient df1 of iteration k+1, it returns the scalar beta used to form the next search direction.
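
To make the role of beta concrete, here is a minimal sketch of the conventional direction update p' = -df1 + beta * p. The helper name nextDirection is illustrative, not part of this module:

  import Linear (Additive, negated, (^+^), (*^))
  import Optimization.LineSearch.ConjugateGradient (Beta)

  -- Hypothetical helper: build the next conjugate direction from a Beta.
  nextDirection :: (Num a, Additive f) => Beta f a -> f a -> f a -> f a -> f a
  nextDirection beta df0 df1 p = negated df1 ^+^ (beta df0 df1 p *^ p)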

fletcherReeves :: (Num a, RealFloat a, Metric f) => Beta f a

Fletcher-Reeves expression for beta

polakRibiere :: (Num a, RealFloat a, Metric f) => Beta f a

Polak-Ribiere expression for beta

hestenesStiefel :: (Num a, RealFloat a, Metric f) => Beta f a

Hestenes-Stiefel expression for beta
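
For reference, the textbook formulas these names usually denote, with y = df1 - df0, are sketched below against the Beta type above. Sign and normalisation conventions vary between presentations, so treat these as illustrative reimplementations rather than a transcription of this module's source:

  import Linear (Metric, dot, (^-^))
  import Optimization.LineSearch.ConjugateGradient (Beta)

  -- Fletcher-Reeves: |df1|^2 / |df0|^2
  fletcherReevesBeta :: (RealFloat a, Metric f) => Beta f a
  fletcherReevesBeta df0 df1 _ = dot df1 df1 / dot df0 df0

  -- Polak-Ribiere: df1 . (df1 - df0) / |df0|^2
  polakRibiereBeta :: (RealFloat a, Metric f) => Beta f a
  polakRibiereBeta df0 df1 _ = dot df1 (df1 ^-^ df0) / dot df0 df0

  -- Hestenes-Stiefel: df1 . y / (p . y)  where y = df1 - df0
  hestenesStiefelBeta :: (RealFloat a, Metric f) => Beta f a
  hestenesStiefelBeta df0 df1 p = dot df1 y / dot p y
    where y = df1 ^-^ df0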