| Copyright | (c) Edward Kmett 2010-2015 |
|---|---|
| License | BSD3 |
| Maintainer | ekmett@gmail.com |
| Stability | experimental |
| Portability | GHC only |
| Safe Haskell | None |
| Language | Haskell2010 |
Numeric.AD.Rank1.Newton
Description
- findZero :: (Fractional a, Eq a) => (Forward a -> Forward a) -> a -> [a]
- findZeroNoEq :: Fractional a => (Forward a -> Forward a) -> a -> [a]
- inverse :: (Fractional a, Eq a) => (Forward a -> Forward a) -> a -> a -> [a]
- inverseNoEq :: Fractional a => (Forward a -> Forward a) -> a -> a -> [a]
- fixedPoint :: (Fractional a, Eq a) => (Forward a -> Forward a) -> a -> [a]
- fixedPointNoEq :: Fractional a => (Forward a -> Forward a) -> a -> [a]
- extremum :: (Fractional a, Eq a) => (On (Forward (Forward a)) -> On (Forward (Forward a))) -> a -> [a]
- extremumNoEq :: Fractional a => (On (Forward (Forward a)) -> On (Forward (Forward a))) -> a -> [a]
- gradientDescent :: (Traversable f, Fractional a, Ord a) => (f (Kahn a) -> Kahn a) -> f a -> [f a]
- gradientAscent :: (Traversable f, Fractional a, Ord a) => (f (Kahn a) -> Kahn a) -> f a -> [f a]
Newton's Method (Forward)
findZero :: (Fractional a, Eq a) => (Forward a -> Forward a) -> a -> [a]
The findZero function finds a zero of a scalar function using
Newton's method; its output is a stream of increasingly accurate
results. (Modulo the usual caveats.) If the stream becomes constant
("it converges"), no further elements are returned.
Examples:
>>> take 10 $ findZero (\x -> x^2 - 4) 1
[1.0,2.5,2.05,2.000609756097561,2.0000000929222947,2.000000000000002,2.0]
>>> last $ take 10 $ findZero ((+1).(^2)) (1 :+ 1)
0.0 :+ 1.0
findZeroNoEq :: Fractional a => (Forward a -> Forward a) -> a -> [a]
The findZeroNoEq function behaves the same as findZero except that it
doesn't truncate the list once the results become constant. This means it
can be used with types without an Eq instance.
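A minimal usage sketch (not part of the original documentation): because the stream is never truncated, the converged value simply repeats, so the findZero example above would be expected to behave as follows:
>>> take 10 $ findZeroNoEq (\x -> x^2 - 4) 1
[1.0,2.5,2.05,2.000609756097561,2.0000000929222947,2.000000000000002,2.0,2.0,2.0,2.0]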
inverse :: (Fractional a, Eq a) => (Forward a -> Forward a) -> a -> a -> [a]
The inverse function inverts a scalar function using
Newton's method; its output is a stream of increasingly accurate
results. (Modulo the usual caveats.) If the stream becomes
constant ("it converges"), no further elements are returned.
Example:
>>> last $ take 10 $ inverse sqrt 1 (sqrt 10)
10.0
inverseNoEq :: Fractional a => (Forward a -> Forward a) -> a -> a -> [a]
The inverseNoEq function behaves the same as inverse except that it
doesn't truncate the list once the results become constant. This means it
can be used with types without an Eq instance.
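A minimal usage sketch (not part of the original documentation), mirroring the inverse example above; the un-truncated stream keeps repeating the converged value, so taking the tenth element still yields it:
>>> last $ take 10 $ inverseNoEq sqrt 1 (sqrt 10)
10.0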
fixedPoint :: (Fractional a, Eq a) => (Forward a -> Forward a) -> a -> [a]
The fixedPoint function finds a fixed point of a scalar
function using Newton's method; its output is a stream of
increasingly accurate results. (Modulo the usual caveats.)
If the stream becomes constant ("it converges"), no further elements are returned.
>>> last $ take 10 $ fixedPoint cos 1
0.7390851332151607
fixedPointNoEq :: Fractional a => (Forward a -> Forward a) -> a -> [a]
The fixedPointNoEq function behaves the same as fixedPoint except that
it doesn't truncate the list once the results become constant. This means it
can be used with types without an Eq instance.
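A minimal usage sketch (not part of the original documentation): the iteration stabilizes well before the tenth element, so the fixedPoint example above would be expected to carry over unchanged:
>>> last $ take 10 $ fixedPointNoEq cos 1
0.7390851332151607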
extremum :: (Fractional a, Eq a) => (On (Forward (Forward a)) -> On (Forward (Forward a))) -> a -> [a]
The extremum function finds an extremum of a scalar
function using Newton's method; its output is a stream of increasingly
accurate results. (Modulo the usual caveats.) If the stream
becomes constant ("it converges"), no further elements are returned.
>>> last $ take 10 $ extremum cos 1
0.0
extremumNoEq :: Fractional a => (On (Forward (Forward a)) -> On (Forward (Forward a))) -> a -> [a]
The extremumNoEq function behaves the same as extremum except that it
doesn't truncate the list once the results become constant. This means it
can be used with types without an Eq instance.
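A minimal usage sketch (not part of the original documentation), mirroring the extremum example above on the un-truncated stream:
>>> last $ take 10 $ extremumNoEq cos 1
0.0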
Gradient Ascent/Descent (Kahn)
gradientDescent :: (Traversable f, Fractional a, Ord a) => (f (Kahn a) -> Kahn a) -> f a -> [f a]
The gradientDescent function performs a multivariate
optimization, based on the naive-gradient-descent in the file
stalingrad/examples/flow-tests/pre-saddle-1a.vlad from the
VLAD compiler Stalingrad sources. Its output is a stream of
increasingly accurate results. (Modulo the usual caveats.)
It uses reverse mode automatic differentiation to compute the gradient.
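A minimal usage sketch (not part of the original documentation), minimizing the hypothetical quadratic f(x, y) = (x - 1)^2 + (y - 2)^2 over a two-element list starting from [0, 0]. The iterates approach [1.0, 2.0]; the exact intermediate values depend on the internal step-size schedule, so no output is shown:
>>> last $ take 1000 $ gradientDescent (\[x, y] -> (x - 1)^2 + (y - 2)^2) [0, 0]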
gradientAscent :: (Traversable f, Fractional a, Ord a) => (f (Kahn a) -> Kahn a) -> f a -> [f a]
Perform a gradient ascent using reverse mode automatic differentiation to compute the gradient.
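A minimal usage sketch (not part of the original documentation), maximizing the negation of the same hypothetical quadratic used in the gradientDescent sketch above; the iterates again approach [1.0, 2.0]:
>>> last $ take 1000 $ gradientAscent (\[x, y] -> negate ((x - 1)^2 + (y - 2)^2)) [0, 0]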