LambdaNet-0.2.0.0: A configurable and extensible neural network library

Safe Haskell: None
Language: Haskell98

Network.Network

Synopsis

Documentation

data Network a Source

Networks are constructed front to back. Start by adding an input layer, then each hidden layer, and finally an output layer.

Constructors

Network 

Fields

layers :: [Layer a]
 

Instances

(Product a, Container Vector a, Floating (Vector a)) => Monoid (Network a)

Abstracting a network as a monoid gives us the ability to combine two networks of the same proportions. This is useful for batch training in backpropagation.
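The shape of the instance can be illustrated with a self-contained sketch. The toy types below (plain lists of Doubles) are stand-ins for LambdaNet's actual Layer and Network types, and pointwise addition is an assumed combination rule for same-shaped networks, not necessarily the library's exact one:

```haskell
-- Toy stand-ins for LambdaNet's types: a layer is a flat list of
-- weights, and a network is a list of such layers.
newtype ToyNetwork = ToyNetwork { toyLayers :: [[Double]] }
  deriving (Eq, Show)

instance Semigroup ToyNetwork where
  -- Combine two networks of the same proportions by adding their
  -- weights pointwise, as batch backpropagation accumulates
  -- per-example updates.
  ToyNetwork xs <> ToyNetwork ys
    | null xs   = ToyNetwork ys   -- the empty network is the identity
    | null ys   = ToyNetwork xs
    | otherwise = ToyNetwork (zipWith (zipWith (+)) xs ys)

instance Monoid ToyNetwork where
  -- The unit: an empty network (cf. emptyNetwork below).
  mempty = ToyNetwork []
```

The explicit null guards keep the empty network a true identity; without them, zipWith would truncate against the empty list.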

createNetwork :: (RandomGen g, Random a, Floating a, Floating (Vector a), Container Vector a) => RandomTransform a -> g -> [LayerDefinition a] -> Network a Source

The createNetwork function takes a random transform for weight initialization, a source of entropy, and a list of layer definitions, and returns a network whose weights are initialized per that transform.
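To illustrate how a transform and an entropy source thread through initialization, here is a toy sketch. The inline LCG and the rescaling transform are hypothetical stand-ins for the RandomGen and RandomTransform arguments; LambdaNet's actual initialization works over hmatrix Vectors:

```haskell
-- Toy linear congruential generator standing in for a RandomGen.
lcg :: Int -> Int
lcg s = (1103515245 * s + 12345) `mod` 2147483648

-- Draw n pseudo-random Doubles in [0,1), threading the seed through.
randoms01 :: Int -> Int -> ([Double], Int)
randoms01 0 s = ([], s)
randoms01 n s =
  let s'        = lcg s
      x         = fromIntegral s' / 2147483648
      (xs, s'') = randoms01 (n - 1) s'
  in (x : xs, s'')

-- Initialize one layer's n weights by mapping a "random transform"
-- over the raw uniform draws, returning the advanced seed so the
-- next layer draws fresh values.
initLayer :: (Double -> Double) -> Int -> Int -> ([Double], Int)
initLayer transform n seed =
  let (xs, seed') = randoms01 n seed
  in (map transform xs, seed')
```

The key point mirrored from the real API: the generator state is threaded explicitly, so initialization is deterministic given the same seed.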

loadNetwork :: (Binary (ShowableLayer a), Floating a, Floating (Vector a), Container Vector a) => FilePath -> [LayerDefinition a] -> IO (Network a) Source

Given a filename and a list of layer definitions, read the saved weights and biases from the file and rebuild the network.

emptyNetwork :: Network a Source

The unit of the monoid: an empty network with no layers.

isEmptyNetwork :: Network a -> Bool Source

Checks whether a network is the unit (empty) network.

addNetworks :: (Floating (Vector a), Container Vector a, Product a) => Network a -> Network a -> Network a Source

Combines two networks of the same proportions layer by layer.

predict :: (Floating (Vector a), Container Vector a, Product a) => Vector a -> Network a -> Vector a Source

Predict folds over the layers of the network, seeding the fold's accumulator with the input vector and feeding each layer's output to the next.

apply :: (Floating (Vector a), Container Vector a, Product a) => Vector a -> Layer a -> Vector a Source

The step function of predict's fold: pushes an input vector through one layer of the network and applies its activation function.
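Together, predict and apply amount to a left fold over the layers. A self-contained sketch with toy list-based layers (LambdaNet itself uses hmatrix Vectors and its own Layer type, so the field names here are illustrative):

```haskell
-- Toy stand-in for a Layer: a weight matrix (one row per output
-- neuron), a bias vector, and an elementwise activation function.
data ToyLayer = ToyLayer
  { toyWeights    :: [[Double]]
  , toyBiases     :: [Double]
  , toyActivation :: Double -> Double
  }

-- predict: fold the input through the layers front to back,
-- feeding each layer's output to the next.
toyPredict :: [Double] -> [ToyLayer] -> [Double]
toyPredict input = foldl toyApply input

-- apply: weighted sums plus biases, then the activation.
toyApply :: [Double] -> ToyLayer -> [Double]
toyApply xs (ToyLayer ws bs f) =
  map f (zipWith (+) (map (dot xs) ws) bs)
  where dot us vs = sum (zipWith (*) us vs)
```

Because the fold's accumulator starts at the input vector, an empty network simply returns its input unchanged.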

saveNetwork :: (Binary (ShowableLayer a), Floating a, Floating (Vector a), Container Vector a) => FilePath -> Network a -> IO () Source

Given a filename and a network, save the weights and biases of the network to the file for later use.
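saveNetwork and loadNetwork form a roundtrip. A toy sketch of that roundtrip, using show/read in place of LambdaNet's Binary encoding of ShowableLayer, with a toy network represented as lists of weights:

```haskell
-- Persist a toy network's weights to disk. show stands in for the
-- Binary serialization the real saveNetwork performs.
saveToyNetwork :: FilePath -> [[Double]] -> IO ()
saveToyNetwork path layers = writeFile path (show layers)

-- Rebuild the toy network from the file. read stands in for Binary
-- decoding; the real loadNetwork also needs the layer definitions
-- to reattach activation functions, which show/read cannot capture.
loadToyNetwork :: FilePath -> IO [[Double]]
loadToyNetwork path = fmap read (readFile path)
```

This also illustrates why loadNetwork takes layer definitions: only the numeric weights and biases are on disk, so the structure and activations must be supplied again at load time.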