ad-3.1.2: Automatic Differentiation

Portability: GHC only
Stability: experimental
Maintainer: ekmett@gmail.com
Safe Haskell: None

Numeric.AD.Newton


Newton's Method (Forward AD)

findZero :: (Fractional a, Eq a) => (forall s. Mode s => AD s a -> AD s a) -> a -> [a]

The findZero function finds a zero of a scalar function using Newton's method; its output is a stream of increasingly accurate results. (Modulo the usual caveats.)

Examples:

>>> take 10 $ findZero (\x->x^2-4) 1
[1.0,2.5,2.05,2.000609756097561,2.0000000929222947,2.000000000000002,2.0]
>>> import Data.Complex
>>> last $ take 10 $ findZero ((+1).(^2)) (1 :+ 1)
0.0 :+ 1.0
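
Each step of the iteration subtracts f x / f' x, with the derivative obtained by forward-mode AD; the stream ends once successive iterates coincide, which is why take 10 above returns only seven values. Below is a minimal sketch of that iteration, assuming the diff' combinator exported by Numeric.AD; the name newtonSketch is hypothetical and, unlike findZero, this version never terminates the stream:

{-# LANGUAGE RankNTypes #-}
import Numeric.AD (AD, Mode, diff')

-- Hypothetical sketch of the Newton iteration behind findZero.
-- diff' evaluates f and its derivative together in forward mode.
newtonSketch :: Fractional a => (forall s. Mode s => AD s a -> AD s a) -> a -> [a]
newtonSketch f = iterate step
  where step x = let (y, y') = diff' f x in x - y / y'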

inverse :: (Fractional a, Eq a) => (forall s. Mode s => AD s a -> AD s a) -> a -> a -> [a]

The inverse function inverts a scalar function using Newton's method; its output is a stream of increasingly accurate results. (Modulo the usual caveats.)

Example:

>>> last $ take 10 $ inverse sqrt 1 (sqrt 10)
10.0
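
Inverting f at y amounts to finding a zero of \x -> f x - y, so the example above can be restated directly in terms of findZero. Here the constant enters as the numeric literal 10, which lifts into AD automatically; this is a hypothetical restatement, not necessarily the library's definition:

-- Should converge to 10.0, as in the inverse example above.
last $ take 10 $ findZero (\x -> sqrt x - sqrt 10) 1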

fixedPoint :: (Fractional a, Eq a) => (forall s. Mode s => AD s a -> AD s a) -> a -> [a]

The fixedPoint function finds a fixed point of a scalar function using Newton's method; its output is a stream of increasingly accurate results. (Modulo the usual caveats.)

>>> last $ take 10 $ fixedPoint cos 1
0.7390851332151607
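
A fixed point of f is precisely a zero of \x -> f x - x, so the computation above can be restated with findZero (again a hypothetical restatement, not necessarily the library's definition):

-- Should converge to the same fixed point of cos, roughly 0.7390851332151607.
last $ take 10 $ findZero (\x -> cos x - x) 1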

extremum :: (Fractional a, Eq a) => (forall s. Mode s => AD s a -> AD s a) -> a -> [a]

The extremum function finds an extremum of a scalar function using Newton's method; its output is a stream of increasingly accurate results. (Modulo the usual caveats.)

>>> last $ take 10 $ extremum cos 1
0.0
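
An extremum of f is a zero of f', so each Newton step is x - f'(x) / f''(x). Below is a hypothetical sketch, assuming the diffs combinator (Tower mode) exported by Numeric.AD to obtain f, f', and f'' in one pass; the name extremumSketch is illustrative, and the library's extremum additionally stops the stream on convergence:

{-# LANGUAGE RankNTypes #-}
import Numeric.AD (AD, Mode, diffs)

-- Hypothetical sketch: Newton's method applied to the derivative.
-- diffs f x yields the stream [f x, f' x, f'' x, ...].
extremumSketch :: Fractional a => (forall s. Mode s => AD s a -> AD s a) -> a -> [a]
extremumSketch f = iterate step
  where
    step x = case diffs f x of
      _ : y' : y'' : _ -> x - y' / y''
      _                -> x  -- fewer than two derivatives available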

Gradient Ascent/Descent (Reverse AD)

gradientDescent :: (Traversable f, Fractional a, Ord a) => (forall s. Mode s => f (AD s a) -> AD s a) -> f a -> [f a]

The gradientDescent function performs a multivariate optimization, based on the naive-gradient-descent in the file stalingrad/examples/flow-tests/pre-saddle-1a.vlad from the VLAD compiler Stalingrad sources. Its output is a stream of increasingly accurate results. (Modulo the usual caveats.)

It uses reverse mode automatic differentiation to compute the gradient.
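
A hypothetical usage sketch, minimizing the Rosenbrock function with a plain list as the Traversable container; the function, names, and iteration count are illustrative only:

import Numeric.AD.Newton (gradientDescent)

-- Rosenbrock's banana function; its global minimum is at (1,1).
rosenbrock :: Num a => [a] -> a
rosenbrock [x, y] = (1 - x) ^ 2 + 100 * (y - x ^ 2) ^ 2
rosenbrock _      = error "rosenbrock: expected two parameters"

-- The iterates drift toward [1,1]; naive gradient descent is slow here.
main :: IO ()
main = print $ last $ take 1000 $ gradientDescent rosenbrock [0, 0]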

gradientAscent :: (Traversable f, Fractional a, Ord a) => (forall s. Mode s => f (AD s a) -> AD s a) -> f a -> [f a]
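
The gradientAscent function performs a multivariate maximization, which is gradient descent applied to the negated function; its output is a stream of increasingly accurate results. (Modulo the usual caveats.)

It uses reverse mode automatic differentiation to compute the gradient.

A hypothetical usage sketch, with an illustrative objective and names:

import Numeric.AD.Newton (gradientAscent)

-- A concave quadratic whose unique maximum sits at (1,2).
peak :: Num a => [a] -> a
peak [x, y] = negate ((x - 1) ^ 2 + (y - 2) ^ 2)
peak _      = error "peak: expected two parameters"

-- The iterates approach [1,2].
main :: IO ()
main = print $ last $ take 100 $ gradientAscent peak [0, 0]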