ad-1.5.0.2: Automatic Differentiation

Portability: GHC only
Stability: experimental
Maintainer: ekmett@gmail.com
Safe Haskell: Safe-Inferred

Numeric.AD.Newton


Newton's Method (Forward AD)

findZero :: (Fractional a, Eq a) => (forall s. Mode s => AD s a -> AD s a) -> a -> [a]

The findZero function finds a zero of a scalar function using Newton's method; its output is a stream of increasingly accurate results. (Modulo the usual caveats.)

Examples:

 take 10 $ findZero (\x -> x^2 - 4) 1  -- converges to 2.0
 import Data.Complex
 take 10 $ findZero ((+1).(^2)) (1 :+ 1)  -- converges to (0 :+ 1)

inverse :: (Fractional a, Eq a) => (forall s. Mode s => AD s a -> AD s a) -> a -> a -> [a]

The inverse function inverts a scalar function using Newton's method; its output is a stream of increasingly accurate results. (Modulo the usual caveats.)

Example:

 take 10 $ inverse sqrt 1 (sqrt 10)  -- converges to 10

fixedPoint :: (Fractional a, Eq a) => (forall s. Mode s => AD s a -> AD s a) -> a -> [a]

The fixedPoint function finds a fixed point of a scalar function using Newton's method; its output is a stream of increasingly accurate results. (Modulo the usual caveats.)

 take 10 $ fixedPoint cos 1 -- converges to 0.7390851332151607

extremum :: (Fractional a, Eq a) => (forall s. Mode s => AD s a -> AD s a) -> a -> [a]

The extremum function finds an extremum of a scalar function using Newton's method; its output is a stream of increasingly accurate results. (Modulo the usual caveats.)

 take 10 $ extremum cos 1 -- converges to 0

Gradient Ascent/Descent (Reverse AD)

gradientDescent :: (Traversable f, Fractional a, Ord a) => (forall s. Mode s => f (AD s a) -> AD s a) -> f a -> [f a]

The gradientDescent function performs a multivariate optimization, based on the naive-gradient-descent in the file stalingrad/examples/flow-tests/pre-saddle-1a.vlad from the VLAD compiler Stalingrad sources. Its output is a stream of increasingly accurate results. (Modulo the usual caveats.)

It uses reverse mode automatic differentiation to compute the gradient.
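
This page gives no worked example for gradientDescent; the following sketch (not part of the original documentation) minimizes a hypothetical two-variable quadratic, using an ordinary list as the Traversable container. The objective, starting point, and 1000-step cutoff are illustrative choices only.

 import Numeric.AD.Newton (gradientDescent)

 -- Minimize f(x, y) = (x - 1)^2 + (y - 2)^2 starting from the origin.
 -- The result stream should approach [1.0, 2.0].
 main :: IO ()
 main = print . last . take 1000 $
   gradientDescent (\[x, y] -> (x - 1)^2 + (y - 2)^2) [0, 0 :: Double]

Since the result stream may be finite, last . take 1000 simply picks the most refined iterate produced within the first thousand steps rather than indexing a fixed position.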

gradientAscent :: (Traversable f, Fractional a, Ord a) => (forall s. Mode s => f (AD s a) -> AD s a) -> f a -> [f a]
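
No description accompanies gradientAscent on this page; it is presumably the maximization counterpart of gradientDescent, again using reverse mode automatic differentiation to compute the gradient. A minimal usage sketch under that assumption, with a hypothetical objective:

 import Numeric.AD.Newton (gradientAscent)

 -- Maximize f(x, y) = -((x - 1)^2 + (y + 3)^2) starting from the origin.
 -- The result stream should approach [1.0, -3.0].
 main :: IO ()
 main = print . last . take 1000 $
   gradientAscent (\[x, y] -> negate ((x - 1)^2 + (y + 3)^2)) [0, 0 :: Double]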