Changelog for ad-4.4
4.4 [2020.02.03]
- Generalize the type of `stochasticGradientDescent`:

  ```diff
  -stochasticGradientDescent :: (Traversable f, Fractional a, Ord a) => (forall s. Reifies s Tape => f (Scalar a) -> f (Reverse s a) -> Reverse s a) -> [f (Scalar a)] -> f a -> [f a]
  +stochasticGradientDescent :: (Traversable f, Fractional a, Ord a) => (forall s. Reifies s Tape => e -> f (Reverse s a) -> Reverse s a) -> [e] -> f a -> [f a]
  ```
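As a sketch of what the generalization buys (assuming the `ad` package is installed and that `stochasticGradientDescent` and `auto` are in scope from `Numeric.AD`): the environment type `e` no longer has to be a container of scalars, so each training example can be an ordinary tuple.

```haskell
import Numeric.AD (auto, stochasticGradientDescent)

-- Hypothetical example: fit y ≈ m*x + b to (x, y) samples.
-- Each sample is a plain pair (Double, Double); the old type, which
-- required the environment to be `f (Scalar a)`, could not express this.
fitLine :: [(Double, Double)] -> [[Double]]
fitLine samples = stochasticGradientDescent err samples [0, 0]
  where
    -- squared error of one sample against the current parameters [m, b];
    -- `auto` lifts the constant sample values into the Reverse mode type
    err (x, y) [m, b] = (auto y - (m * auto x + b)) ^ 2
    err _      _      = error "expected exactly [m, b]"
```

Each element of the returned list is a successively refined `[m, b]` estimate, one per training example consumed.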
4.3.6 [2019.02.28]
- Make the test suite pass when built against musl libc.
4.3.5 [2018.01.18]
- Add a `Semigroup` instance for `Id`.
4.3.4
- Support `doctest-0.12`
4.3.3
- Revamp `Setup.hs` to use `cabal-doctest`. This makes it build with `Cabal-2.0`, and makes the doctests work with `cabal new-build` and sandboxes.
4.3.2.1
- GHC 8 support
- Fix Kahn mode's `**` implementation
- Fix multiple problems in `Erf` and `InvErf` methods
4.3.2
- Added `NoEq` versions of several combinators that can be used when `Eq` isn't available on the numeric type involved.
4.3.1
- Further improvements have been made in the performance of `Sparse` mode, at least asymptotically, when used on functions with many variables. Since this is the target use case for `Sparse` in the first place, this seems like a good trade-off. Note: this results in an API change, but only in the API of an `Internal` module, so it is treated as a minor version bump.
4.3
- Made drastic improvements in the performance of `Tower` and `Sparse` modes thanks to the help of Björn von Sydow.
- Added constrained convex optimization.
- Incorporated some suggestions from Herbie for improving floating-point accuracy.
4.2.4
- Added `Newton.Double` modules for performance.
4.2.3
- `reflection` 2 support
4.2.2
- Major bug fix for `grads`, `jacobians`, and anything that uses `Sparse` mode in `Numeric.AD`. Derivatives after the first two were previously incorrect.
4.2.1.1
- Support `nats` version 1
4.2.1
- Added `stochasticGradientDescent`.
4.2
- Removed broken `Directed` mode.
- Added `Numeric.AD.Rank1` combinators and moved most infinitesimal handling back out of the modes and into an `AD` wrapper.
4.1
- Fixed a bug in the type of `conjugateGradientAscent` and `conjugateGradientDescent` that prevented users from ever being able to call them.
4.0.0.1
- Added the missing `instances.h` header file to `extra-source-files`.
4.0
- An overhaul permitting monomorphic modes was completed by @alang9.
- Add a `ForwardDouble` monomorphic mode
3.4
- Added support for `erf` and `inverf`, etc. from `Data.Number.Erf`.
- Split the infinitesimal and the mode into two separate parameters to facilitate inlining and easier extension of the API.
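For illustration, a minimal sketch (assuming the `ad` package's `Erf` instances for its AD types, plus `diff` from `Numeric.AD` and the `erf` package providing `Data.Number.Erf`): expressions built from `erf` become differentiable like any other numeric code.

```haskell
import Data.Number.Erf (erf)
import Numeric.AD (diff)

-- d/dx erf x = 2/sqrt pi * exp (-x^2), so the derivative at 0
-- should come out to 2/sqrt pi ≈ 1.128.
main :: IO ()
main = print (diff erf (0 :: Double))
```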
3.3.1
- Build system improvements
- Removed unused LANGUAGE pragmas
- Added HLint configuration
- We now use exactly the same versions of the packages used to build `ad` when running the doctests.
3.3
- Renamed `Reverse` to `Kahn` and `Wengert` to `Reverse`. We use Arthur Kahn's topological sorting algorithm to sort the tape after the fact in `Kahn` mode, while the stock `Reverse` mode builds a Wengert list as it goes, which is more efficient in practice.
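Concretely, the two strategies live side by side after the rename and compute the same gradients. A minimal comparison (a sketch assuming the module names `Numeric.AD.Mode.Kahn` and `Numeric.AD.Mode.Reverse` from a post-3.3 `ad`):

```haskell
import qualified Numeric.AD.Mode.Kahn    as Kahn
import qualified Numeric.AD.Mode.Reverse as Reverse

-- The Rosenbrock function, written rank-1 polymorphically so that
-- either mode can differentiate it.
rosenbrock :: Num a => [a] -> a
rosenbrock [x, y] = (1 - x) ^ 2 + 100 * (y - x * x) ^ 2
rosenbrock _      = error "expected exactly [x, y]"

main :: IO ()
main = do
  print (Reverse.grad rosenbrock [0, 0 :: Double])  -- Wengert list, built as it goes
  print (Kahn.grad    rosenbrock [0, 0 :: Double])  -- graph sorted after the fact
```

Both calls should yield the gradient `[-2.0, 0.0]`; `Reverse` is the faster default, `Kahn` keeps the older sort-the-tape strategy.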
3.2.2
- Export `conjugateGradientDescent` and `gradientDescent` from `Numeric.AD`.
3.2.1
- `conjugateGradientDescent` now stops before it starts returning NaN results.
3.2
- Renamed `Chain` to `Wengert` to reflect its use of Wengert lists for reverse mode.
- Renamed `lift` to `auto` to avoid conflict with the more prevalent `transformers` library.
- Fixed a bug in `Numeric.AD.Forward.gradWith'`, which caused it to return the wrong value for the primal.
3.1.4
- Added a better "convergence" test for `findZero`.
- Compute `tan` and `tanh` derivatives directly.
3.1.3
- Added `conjugateGradientDescent` and `conjugateGradientAscent` to `Numeric.AD.Newton`.
3.1.2
- Dependency bump
3.1
- Added `Chain` mode, which is `Reverse` using a linear tape that doesn't need to be sorted.
- Added a suite of doctests.
- Bug fix in `Forward` mode. It was previously yielding incorrect results for anything that used `bind` or `bind'` internally.
3.0
- Moved the contents of `Numeric.AD.Mode.Mixed` into `Numeric.AD`.
- Split off `Numeric.AD.Variadic` for the variadic combinators.
- Removed the `UU`, `FU`, `UF`, and `FF` type aliases.
- Stopped exporting the types for `Mode` and `AD` from almost every module. Import `Numeric.AD.Types` if necessary.
- Renamed `Tensors` to `Jet`.
- Dependency bump to be compatible with GHC 7.4.1 and mtl 2.1.
- More aggressive zero tracking. `diff (**n) 0` for constant `n` and `diff (0**)` both now yield the correct answer for all modes.
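A small sketch of the behavior described (assuming `diff` from `Numeric.AD` in a current version of the package):

```haskell
import Numeric.AD (diff)

-- Without zero tracking, the chain rule for (**) can introduce a
-- log 0 or a division by the zero primal and return NaN instead of 0.
main :: IO ()
main = do
  print (diff (** 2) (0 :: Double))  -- derivative of x**2 at 0, i.e. 0.0
  print (diff (0 **) (1 :: Double))  -- derivative of 0**y at 1, i.e. 0.0
```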