sgd-0.8.0.3: Stochastic gradient descent library

Safe Haskell: None
Language: Haskell2010

Numeric.SGD.Adam

Description

Provides the adam function which implements the Adam algorithm based on the paper "Adam: A Method for Stochastic Optimization" (Kingma and Ba, 2014).
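For reference, the parameter update rule from the Adam paper can be written as follows (this is the standard formulation from the paper, not a transcription of this module's internals; the symbols α, β₁, β₂, ε correspond to the step size, the two moment-decay rates, and the numerical-stability constant):

```latex
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t \\
v_t &= \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2 \\
\hat{m}_t &= m_t / (1 - \beta_1^t) \\
\hat{v}_t &= v_t / (1 - \beta_2^t) \\
\theta_t &= \theta_{t-1} - \alpha\, \hat{m}_t / \bigl(\sqrt{\hat{v}_t} + \epsilon\bigr)
\end{aligned}
```

Here g_t is the gradient at step t, m_t and v_t are exponential moving averages of the gradient and its elementwise square, and the hat terms correct the bias these averages have early in training.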

Synopsis

Documentation

data Config Source #

Adam configuration

Constructors

Config 

Fields

Instances
Eq Config Source # 
Instance details

Defined in Numeric.SGD.Adam

Methods

(==) :: Config -> Config -> Bool #

(/=) :: Config -> Config -> Bool #

Ord Config Source # 
Instance details

Defined in Numeric.SGD.Adam

Show Config Source # 
Instance details

Defined in Numeric.SGD.Adam

Generic Config Source # 
Instance details

Defined in Numeric.SGD.Adam

Associated Types

type Rep Config :: Type -> Type #

Methods

from :: Config -> Rep Config x #

to :: Rep Config x -> Config #

Default Config Source # 
Instance details

Defined in Numeric.SGD.Adam

Methods

def :: Config #

type Rep Config Source # 
Instance details

Defined in Numeric.SGD.Adam

scaleTau :: Double -> Config -> Config Source #

Scale the tau parameter. Useful e.g. to account for the size of the training dataset.
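For instance, if the tau decay parameter is meant to be expressed in epochs rather than in individual training elements, it can be scaled by the dataset size. A minimal sketch (the dataset size here is made up for illustration; def comes from the Default instance listed above):

```haskell
import Data.Default (def)
import Numeric.SGD.Adam (Config, scaleTau)

-- Scale tau by the (hypothetical) number of training elements,
-- so that the decay schedule is expressed per epoch.
cfg :: Config
cfg = scaleTau (fromIntegral trainSetSize) def
  where
    trainSetSize = 10000 :: Int  -- assumed dataset size
```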

adam Source #

Arguments

:: (Monad m, ParamSet p) 
=> Config

Adam configuration

-> (e -> p -> p)

Gradient on a training element

-> SGD m e p 

Perform gradient descent using the Adam algorithm described above.
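A minimal usage sketch. It assumes the run function exported by Numeric.SGD (with signature run :: ParamSet p => SGD Identity e p -> [e] -> p -> p) and the ParamSet instance for Double, both part of the sgd package; the objective and dataset are invented for illustration:

```haskell
import Data.Default (def)
import Numeric.SGD (run)
import Numeric.SGD.Adam (adam)

-- Gradient of the objective (x - 4)^2 with respect to the
-- parameter x; the training elements carry no information in
-- this toy example, so the gradient ignores them.
grad :: () -> Double -> Double
grad _ x = 2 * (x - 4)

main :: IO ()
main =
  -- Run Adam with default configuration over 10000 dummy
  -- elements, starting from 0.0; the result should approach 4.
  print $ run (adam def grad) (replicate 10000 ()) 0.0
```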