deeplearning-hs-0.1.0.2: Deep Learning in Haskell

Portability: POSIX
Stability: experimental
Maintainer: andrew+cabal@tullo.ch
Safe Haskell: None

DeepLearning.ConvNet


Main Types

type Vol sh = Array U sh Double

Activation matrix

type DVol sh = Array D sh Double

Delayed activation matrix

type Label = Int

Label for supervised learning
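
As a minimal sketch of how these aliases are used (Vol and DVol are ordinary repa arrays, so they are built with the usual repa constructors):

  import Data.Array.Repa (DIM1, fromListUnboxed, ix1)

  -- A concrete, unboxed activation vector of length 4.
  input :: Vol DIM1
  input = fromListUnboxed (ix1 4) [0.1, 0.9, 0.3, 0.7]

  -- A Label is a plain Int, e.g. the index of the correct class.
  target :: Label
  target = 1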

Layers

class (Shape sh, Shape sh') => Layer a sh sh' | a -> sh, a -> sh'

Layer represents a layer that can pass activations forward. TopLayer and InnerLayer are derived classes for layers that can be backpropagated through.

class (Layer a sh sh', Shape sh, Shape sh') => InnerLayer a sh sh' | a -> sh, a -> sh'

InnerLayer represents an inner layer of a neural network that can accept backpropagation input from higher layers.


class Layer a DIM1 DIM1 => TopLayer a

TopLayer is a top-level layer that can initialize a backpropagation pass.

data SoftMaxLayer

SoftMaxLayer computes the softmax activation function.

Constructors

SoftMaxLayer 
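
For reference, a standalone sketch of the softmax function this layer computes (an illustration, not the library's implementation):

  -- softmax xs !! i == exp (xs !! i) / sum (map exp xs)
  softmax :: [Double] -> [Double]
  softmax xs = map (/ total) es
    where
      es    = map exp xs
      total = sum es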

data FullyConnectedLayer sh

FullyConnectedLayer represents a fully-connected layer.

Constructors

FullyConnectedLayer 

Fields

_weights :: Vol (sh :. Int)
 
_bias :: Vol DIM1
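
For a layer over DIM1 inputs these fields specialize to Vol DIM2 weights and a Vol DIM1 bias. A hand-built zero layer might look like the following sketch; the 3-inputs-by-2-outputs weight layout is an assumption, and newFC below is the intended constructor:

  import Data.Array.Repa (DIM1, computeUnboxedS, fromFunction, ix1, ix2)

  zeroFC :: FullyConnectedLayer DIM1
  zeroFC = FullyConnectedLayer
    { _weights = computeUnboxedS (fromFunction (ix2 3 2) (const 0))  -- assumed layout
    , _bias    = computeUnboxedS (fromFunction (ix1 2) (const 0))
    }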
 

Composing layers

(>->) :: (Monad m, Shape sh, Shape sh', Shape sh'') => Forward m sh sh' -> Forward m sh' sh'' -> Forward m sh sh''

>-> composes two forward activation functions.
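
A shape-level illustration, using net1, newFC, and SoftMaxLayer documented below (the intermediate softmax is only there to make the shapes line up):

  import Data.Array.Repa (DIM1, ix1)

  stacked :: Monad m => Forward m DIM1 DIM1
  stacked = net1 (newFC (ix1 784) 32) SoftMaxLayer
        >-> net1 (newFC (ix1 32) 10) SoftMaxLayer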

type Forward m sh sh' = Vol sh -> WriterT [Vector Double] m (DVol sh')

The Forward type represents a single forward pass through a layer.
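
A Forward is thus an ordinary monadic function that logs each layer's activations through the WriterT. A minimal hand-written example, assuming the logged Vector Double values are the unboxed contents of each activation (this page does not pin that down):

  import Control.Monad.Writer (tell)
  import Data.Array.Repa (DIM1, computeUnboxedS, toUnboxed)
  import qualified Data.Array.Repa as R

  -- Scales its input by two and logs the resulting activation.
  scaleLayer :: Monad m => Forward m DIM1 DIM1
  scaleLayer v = do
    let out = R.map (* 2) v
    tell [toUnboxed (computeUnboxedS out)]  -- assumed logging convention
    return out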

withActivations :: Forward m sh sh' -> Vol sh -> m (DVol sh', [Vector Double])

withActivations computes the output activation, along with the intermediate activations.
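
A usage sketch, running the stacked network from the >-> example above; Identity is used only to discharge the Monad constraint, and the Vector import is assumed to match the library's Vector:

  import Data.Functor.Identity (runIdentity)
  import Data.Vector.Unboxed (Vector)  -- assumed

  runStacked :: Vol DIM1 -> (DVol DIM1, [Vector Double])
  runStacked input = runIdentity (withActivations stacked input)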

Network building helpers

flowNetwork :: (Monad m, Shape sh) => sh -> Int -> Int -> Int -> Forward m sh DIM1

flowNetwork builds a network of the form:

  Input Layer              Output Softmax
     +--+
     |  |   Inner Layers    +--+   +--+
     |  |                   |  |   |  |
     |  |   +-+   +-+  +-+  |  |   |  |
     |  +---+ +---+ +--+ +--+  +--->  |
     |  |   +-+   +-+  +-+  |  |   |  |
     |  |                   |  |   |  |
     |  |                   +--+   +--+
     +--+
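
The meanings of the three Int parameters are not documented here; assuming they are the number of inner layers, the inner layer width, and the number of output classes, a call might look like:

  import Data.Array.Repa (DIM1, ix1)

  digitNet :: Monad m => Forward m DIM1 DIM1
  digitNet = flowNetwork (ix1 784) 3 50 10  -- assumed parameter meanings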

net1 :: (Monad m, InnerLayer a sh DIM1, TopLayer a1) => a -> a1 -> Forward m sh DIM1

net1 constructs a single-layer fully connected perceptron with softmax output.
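
For example, a softmax-output perceptron over 784-dimensional inputs (sizes are illustrative):

  import Data.Array.Repa (DIM1, ix1)

  perceptron :: Monad m => Forward m DIM1 DIM1
  perceptron = net1 (newFC (ix1 784) 10) SoftMaxLayer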

net2 :: (Monad m, InnerLayer a sh sh', InnerLayer a1 sh' DIM1, TopLayer a2) => a -> a1 -> a2 -> Forward m sh DIM1

net2 constructs a two-layer fully connected MLP with softmax output.
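
For example, with an illustrative 30-unit hidden layer:

  import Data.Array.Repa (DIM1, ix1)

  mlp :: Monad m => Forward m DIM1 DIM1
  mlp = net2 (newFC (ix1 784) 30) (newFC (ix1 30) 10) SoftMaxLayer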

newFC :: Shape sh => sh -> Int -> FullyConnectedLayer sh

newFC constructs a new fully connected layer.
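
For example, assuming the Int argument is the number of output units:

  import Data.Array.Repa (DIM1, ix1)

  hidden :: FullyConnectedLayer DIM1
  hidden = newFC (ix1 784) 30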