Copyright | (c) Lars Brünjes, 2016 |
---|---|
License | MIT |
Maintainer | brunjlar@gmail.com |
Stability | experimental |
Portability | portable |
Safe Haskell | None |
Language | Haskell2010 |
Extensions |
This module defines special "layer" components and convenience functions for creating such layers.
- type Layer i o = Component (Vector i Analytic) (Vector o Analytic)
- linearLayer :: forall i o. (KnownNat i, KnownNat o) => Layer i o
- layer :: (KnownNat i, KnownNat o) => (Analytic -> Analytic) -> Layer i o
- tanhLayer :: (KnownNat i, KnownNat o) => Layer i o
- logisticLayer :: (KnownNat i, KnownNat o) => Layer i o
- reLULayer :: (KnownNat i, KnownNat o) => Layer i o
- softmax :: (Floating a, Functor f, Foldable f) => f a -> f a
Documentation
type Layer i o = Component (Vector i Analytic) (Vector o Analytic) Source
A Layer i o is a component that maps a vector of length i to a vector of length o.
linearLayer :: forall i o. (KnownNat i, KnownNat o) => Layer i o Source
Creates a linear Layer, i.e. a layer that multiplies the input with a weight matrix and adds a bias to get the output.
Random initialization follows the recommendation from chapter 3 of the online book Neural Networks and Deep Learning by Michael Nielsen.
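As a sketch of what such a layer computes, here is the affine map written with plain Haskell lists instead of the package's length-indexed vectors; the function name and explicit weight arguments are illustrative only, not the package's API (the real component learns its weights):

```haskell
-- Illustrative sketch, not the package's API.
-- output_j = bias_j + sum_i (weights_j_i * input_i)
linearApply :: Num a => [[a]] -> [a] -> [a] -> [a]
linearApply weights bias input =
    zipWith (+) bias [sum (zipWith (*) row input) | row <- weights]
```

For example, `linearApply [[1,0],[0,2]] [1,1] [3,4]` evaluates to `[4,9]`.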
layer :: (KnownNat i, KnownNat o) => (Analytic -> Analytic) -> Layer i o Source
Creates a Layer as a combination of a linear layer and a non-linear activation function.
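The combination can be sketched as follows, again with plain lists and explicit weights rather than the package's Component machinery (a hypothetical helper for illustration):

```haskell
-- Hypothetical helper, not the package's API: an affine map
-- followed by the activation function applied elementwise.
applyLayer :: Num a => (a -> a) -> [[a]] -> [a] -> [a] -> [a]
applyLayer activation weights bias input =
    map activation
        (zipWith (+) bias [sum (zipWith (*) row input) | row <- weights])
```

With `activation = tanh` this corresponds to a tanh layer; with the logistic function, to a logistic layer.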
logisticLayer :: (KnownNat i, KnownNat o) => Layer i o Source
This is simply layer, specialized to the logistic function as activation. Output values are all in the interval [0,1].
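For reference, the logistic function itself can be written as follows (the standard definition, not code taken from the package):

```haskell
-- logistic x = 1 / (1 + e^(-x)); maps all reals into (0,1),
-- with logistic 0 = 0.5.
logistic :: Floating a => a -> a
logistic x = 1 / (1 + exp (negate x))
```

It is the smooth, monotone "S-curve" commonly used as a sigmoid activation.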