Portability | POSIX
---|---
Stability | experimental
Maintainer | andrew+cabal@tullo.ch
Safe Haskell | None
DeepLearning.ConvNet
Description
- type Vol sh = Array U sh Double
- type DVol sh = Array D sh Double
- type Label = Int
- class (Shape sh, Shape sh') => Layer a sh sh' | a -> sh, a -> sh'
- class (Layer a sh sh', Shape sh, Shape sh') => InnerLayer a sh sh' | a -> sh, a -> sh'
- class Layer a DIM1 DIM1 => TopLayer a
- data SoftMaxLayer = SoftMaxLayer
- data FullyConnectedLayer sh = FullyConnectedLayer {}
- (>->) :: (Monad m, Shape sh, Shape sh', Shape sh'') => Forward m sh sh' -> Forward m sh' sh'' -> Forward m sh sh''
- type Forward m sh sh' = Vol sh -> WriterT [Vector Double] m (DVol sh')
- withActivations :: Forward m sh sh' -> Vol sh -> m (DVol sh', [Vector Double])
- flowNetwork :: (Monad m, Shape sh) => sh -> Int -> Int -> Int -> Forward m sh DIM1
- net1 :: (Monad m, InnerLayer a sh DIM1, TopLayer a1) => a -> a1 -> Forward m sh DIM1
- net2 :: (Monad m, InnerLayer a sh sh', InnerLayer a1 sh' DIM1, TopLayer a2) => a -> a1 -> a2 -> Forward m sh DIM1
- newFC :: Shape sh => sh -> Int -> FullyConnectedLayer sh
Main Types
Layers
class (Shape sh, Shape sh') => Layer a sh sh' | a -> sh, a -> sh'
Layer represents a layer that can pass activations forward. TopLayer and InnerLayer are derived layers that can be backpropagated through.
Instances
Layer SoftMaxLayer DIM1 DIM1 |
Shape sh => Layer (FullyConnectedLayer sh) sh DIM1 |
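The functional dependencies mean the layer type alone determines both shapes. A minimal sketch (using newFC from below; treating its Int argument as the output width is an assumption):

```haskell
import Data.Array.Repa (Z (..), (:.) (..), DIM2)
import DeepLearning.ConvNet

-- GHC can infer both shape parameters from the layer type: a
-- FullyConnectedLayer DIM2 is only a Layer with input DIM2, output DIM1.
fc :: FullyConnectedLayer DIM2
fc = newFC (Z :. 3 :. 4) 10  -- 3x4 input; 10 is assumed to be the output width
```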
class (Layer a sh sh', Shape sh, Shape sh') => InnerLayer a sh sh' | a -> sh, a -> sh'
InnerLayer represents an inner layer of a neural network that can accept backpropagation input from higher layers.
Instances
Shape sh => InnerLayer (FullyConnectedLayer sh) sh DIM1 |
class Layer a DIM1 DIM1 => TopLayer a
TopLayer is a top-level layer that can initialize a backpropagation pass.
Instances
TopLayer SoftMaxLayer |
data SoftMaxLayer
SoftMaxLayer computes the softmax activation function.
Constructors
SoftMaxLayer |
Instances
Layer SoftMaxLayer DIM1 DIM1 |
TopLayer SoftMaxLayer |
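For reference, softmax maps a vector x to exp(x_i) / sum_j exp(x_j). A standalone sketch of the function itself, not the package's repa-based implementation:

```haskell
-- Numerically stable softmax over a plain list.
softmax :: [Double] -> [Double]
softmax xs = map (/ total) exps
  where
    m     = maximum xs                 -- shift by the max for stability
    exps  = map (\x -> exp (x - m)) xs
    total = sum exps
```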
data FullyConnectedLayer sh
FullyConnectedLayer represents a fully-connected input layer.
Instances
Shape sh => InnerLayer (FullyConnectedLayer sh) sh DIM1 |
Shape sh => Layer (FullyConnectedLayer sh) sh DIM1 |
Composing layers
(>->) :: (Monad m, Shape sh, Shape sh', Shape sh'') => Forward m sh sh' -> Forward m sh' sh'' -> Forward m sh sh''
(>->) composes two forward activation functions.
type Forward m sh sh' = Vol sh -> WriterT [Vector Double] m (DVol sh')
The Forward function represents a single forward pass through a layer.
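A minimal sketch of a hand-written Forward pass and its composition with >->, assuming Vector here is Data.Vector.Unboxed.Vector and that logging a layer's input activations is a reasonable use of the Writer log:

```haskell
import qualified Data.Array.Repa as R
import Data.Array.Repa (DIM1)
import Control.Monad.Writer (tell)
import DeepLearning.ConvNet

-- A pointwise ReLU pass: logs its input activations and returns a
-- delayed array, matching the DVol result type.
relu :: Monad m => Forward m DIM1 DIM1
relu v = do
  tell [R.toUnboxed v]       -- record this layer's input activations
  return (R.map (max 0) v)

-- Composition just has to make the shapes line up.
twoRelus :: Monad m => Forward m DIM1 DIM1
twoRelus = relu >-> relu
```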
withActivations :: Forward m sh sh' -> Vol sh -> m (DVol sh', [Vector Double])
withActivations computes the output activation, along with the intermediate activations.
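A sketch of running a small network under Identity and collecting the per-layer activations (that SoftMaxLayer is a TopLayer is assumed from the functions above):

```haskell
import Control.Monad.Identity (runIdentity)
import Data.Array.Repa (Z (..), (:.) (..), DIM1, fromListUnboxed)
import DeepLearning.ConvNet

-- demo :: (DVol DIM1, [Vector Double])
demo = runIdentity (withActivations net input)
  where
    net   = net1 (newFC (Z :. 4 :: DIM1) 3) SoftMaxLayer
    input = fromListUnboxed (Z :. 4) [0.1, 0.2, 0.3, 0.4]
```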
Network building helpers
flowNetwork :: (Monad m, Shape sh) => sh -> Int -> Int -> Int -> Forward m sh DIM1
flowNetwork builds a network of the form:
Input Layer -> Inner Layers -> Output Layer -> Softmax
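A hypothetical call; the roles of the three Int arguments (here read as the number of inner layers, their width, and the number of output classes) are assumptions, since only the types are documented:

```haskell
import Data.Array.Repa (Z (..), (:.) (..), DIM1, DIM2)
import DeepLearning.ConvNet

-- Hypothetical: a 28x28 input flowing through inner layers down to a
-- DIM1 softmax output. The meaning of 2, 16 and 10 is an assumption.
digitsNet :: Monad m => Forward m DIM2 DIM1
digitsNet = flowNetwork (Z :. 28 :. 28) 2 16 10
```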
net1 :: (Monad m, InnerLayer a sh DIM1, TopLayer a1) => a -> a1 -> Forward m sh DIM1
net1 constructs a single-layer fully connected perceptron with softmax output.
net2 :: (Monad m, InnerLayer a sh sh', InnerLayer a1 sh' DIM1, TopLayer a2) => a -> a1 -> a2 -> Forward m sh DIM1
net2 constructs a two-layer fully connected MLP with softmax output.
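A sketch of wiring net2 by hand, again treating newFC's shape argument as the layer's input shape and its Int argument as the output width (assumptions):

```haskell
import Data.Array.Repa (Z (..), (:.) (..), DIM1)
import DeepLearning.ConvNet

-- Two fully connected layers (sizes 4 -> 8 -> 3, assumed) under softmax.
mlp :: Monad m => Forward m DIM1 DIM1
mlp = net2 (newFC (Z :. 4 :: DIM1) 8) (newFC (Z :. 8 :: DIM1) 3) SoftMaxLayer
```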