Safe Haskell | None |
---|---|
Language | Haskell2010 |

- conjGrad :: (Num a, RealFloat a, Additive f, Metric f) => LineSearch f a -> Beta f a -> (f a -> f a) -> f a -> [f a]
- module Optimization.LineSearch
- type Beta f a = f a -> f a -> f a -> a
- fletcherReeves :: (Num a, RealFloat a, Metric f) => Beta f a
- polakRibiere :: (Num a, RealFloat a, Metric f) => Beta f a
- hestenesStiefel :: (Num a, RealFloat a, Metric f) => Beta f a

# Conjugate gradient methods

conjGrad | |
---|---|
:: (Num a, RealFloat a, Additive f, Metric f) | |
=> LineSearch f a | line search method |
-> Beta f a | beta expression |
-> (f a -> f a) | gradient of function |
-> f a | starting point |
-> [f a] | iterates |

Conjugate gradient method with the given beta expression and line search method.

The conjugate gradient method avoids the trouble encountered by the
steepest descent method on poorly conditioned problems (e.g. those whose
Hessian `A` has a wide range of eigenvalues). It does this by choosing
directions which satisfy a condition of `A`-orthogonality (conjugacy):
successive search directions `p_i` and `p_j` satisfy `p_i^T A p_j = 0`,
ensuring that steps in the "unstretched" search space (the space rescaled
by the eigenvalues of `A`) are mutually orthogonal.
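To make the iteration concrete, here is a minimal, self-contained sketch of the conjugate gradient method on a quadratic objective `f(x) = ½ xᵀAx − bᵀx`, using plain Haskell lists rather than the library's `LineSearch`, `Beta`, `Additive`, and `Metric` abstractions. The names `Vec`, `cgQuadratic`, and the helpers are illustrative stand-ins, not part of this module; the exact step size `α = rᵀr / pᵀAp` plays the role of the line search and the Fletcher–Reeves ratio plays the role of the beta expression.

```haskell
module Main where

type Vec = [Double]

dot :: Vec -> Vec -> Double
dot u v = sum (zipWith (*) u v)

add, sub :: Vec -> Vec -> Vec
add = zipWith (+)
sub = zipWith (-)

scale :: Double -> Vec -> Vec
scale a = map (a *)

-- Matrix-vector product for a dense matrix stored row-wise.
mulMV :: [Vec] -> Vec -> Vec
mulMV m v = map (`dot` v) m

-- Conjugate gradient for minimizing ½ xᵀAx − bᵀx (i.e. solving A x = b),
-- with the exact quadratic step α = rᵀr / pᵀAp and the
-- Fletcher–Reeves beta β = r₁ᵀr₁ / r₀ᵀr₀.
cgQuadratic :: [Vec] -> Vec -> Vec -> Int -> Vec
cgQuadratic a b x0 n = go x0 r0 r0 n
  where
    r0 = b `sub` mulMV a x0          -- initial residual (negative gradient)
    go x _ _ 0 = x
    go x r p k
      | dot r r < 1e-20 = x          -- converged
      | otherwise =
          let ap    = mulMV a p
              alpha = dot r r / dot p ap
              x'    = x `add` scale alpha p
              r'    = r `sub` scale alpha ap
              beta  = dot r' r' / dot r r
              p'    = r' `add` scale beta p  -- A-orthogonal to previous p
          in go x' r' p' (k - 1)

main :: IO ()
main = print (cgQuadratic [[4,1],[1,3]] [1,2] [2,1] 2)  -- ≈ [1/11, 7/11]
```

In exact arithmetic CG on an n×n problem terminates in at most n steps, which is why two iterations suffice for this 2×2 example.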

# Step size methods

module Optimization.LineSearch
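The `LineSearch f a` argument to `conjGrad` comes from `Optimization.LineSearch`. As a rough sketch of the shape such a search takes, assuming it maps a gradient, a search direction, and a point to a step size (the type alias `LineSearchL` and both functions below are illustrative stand-ins over lists, not the library's definitions):

```haskell
type Vec = [Double]

-- Assumed shape of a list-based line search: given the gradient function,
-- the search direction, and the current point, return a step size.
type LineSearchL = (Vec -> Vec) -> Vec -> Vec -> Double

-- The simplest possible search: always take a fixed step.
constantStep :: Double -> LineSearchL
constantStep a _ _ _ = a

-- A naive backtracking sketch: halve the trial step until the objective
-- value actually decreases. (A full Armijo test would also demand a
-- sufficient-decrease margin proportional to the directional derivative.)
backtrack :: (Vec -> Double) -> Double -> Vec -> Vec -> Double
backtrack f t0 p x =
  head [t | t <- iterate (/ 2) t0, f (walk t) < f x]
  where walk t = zipWith (+) x (map (t *) p)
```

`backtrack` terminates whenever `p` is a descent direction for a smooth `f`, since some sufficiently small step must decrease the objective.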

# Beta expressions

`type Beta f a = f a -> f a -> f a -> a`

A beta expression `beta df0 df1 p` computes the conjugate direction
contribution given the derivative `df0` and direction `p` for iteration
`k`, and the derivative `df1` for iteration `k+1`.
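The three betas exported by this module correspond to standard formulas from the literature. As a sketch, here are list-based versions with the same `beta df0 df1 p` argument order (the primed names and the `Vec` alias are illustrative, not the library's definitions, and sign conventions for Hestenes–Stiefel vary between references):

```haskell
type Vec = [Double]

dot :: Vec -> Vec -> Double
dot u v = sum (zipWith (*) u v)

-- Fletcher–Reeves: β = ⟨df1, df1⟩ / ⟨df0, df0⟩
fletcherReeves' :: Vec -> Vec -> Vec -> Double
fletcherReeves' df0 df1 _ = dot df1 df1 / dot df0 df0

-- Polak–Ribière: β = ⟨df1, df1 − df0⟩ / ⟨df0, df0⟩
polakRibiere' :: Vec -> Vec -> Vec -> Double
polakRibiere' df0 df1 _ = dot df1 (zipWith (-) df1 df0) / dot df0 df0

-- Hestenes–Stiefel: β = ⟨df1, df1 − df0⟩ / ⟨p, df1 − df0⟩
hestenesStiefel' :: Vec -> Vec -> Vec -> Double
hestenesStiefel' df0 df1 p =
  let y = zipWith (-) df1 df0 in dot df1 y / dot p y
```

Note that the direction `p` is only used by Hestenes–Stiefel; the other two depend solely on the successive gradients.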