LambdaNet-0.1.0.1: A configurable and extensible neural network library

Safe Haskell: None
Language: Haskell98

Network.Network

Documentation

data Network a

Networks are constructed front to back. Start by adding an input layer, then each hidden layer, and finally an output layer.

Constructors

Network 

Fields

layers :: [Layer a]
 

type TrainingData a = (Vector a, Vector a)

A tuple of (input, expected output).
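For example, one XOR training pair might look like this (a sketch; `fromList` is assumed to come from hmatrix's `Numeric.LinearAlgebra`, on which LambdaNet's vectors are built):

```haskell
import Numeric.LinearAlgebra (fromList)

-- One XOR training pair: input [1, 0] should produce output [1].
-- TrainingData Double = (Vector Double, Vector Double)
sample :: TrainingData Double
sample = (fromList [1, 0], fromList [1])
```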

createNetwork :: (RandomGen g, Random a, Floating a, Floating (Vector a), Container Vector a) => RandomTransform a -> g -> [LayerDefinition a] -> Network a

The createNetwork function takes a random transform used for weight initialization, a source of entropy, and a list of layer definitions, and returns a network with its weights initialized according to the random transform.
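A minimal construction sketch. The names `LayerDefinition`, `sigmoidNeuron`, `connectFully` (from Network.Layer and Network.Neuron), and the `normals` random transform follow the LambdaNet README; treat them as assumptions if your version differs:

```haskell
import Network.Network
import Network.Layer
import Network.Neuron
import System.Random (newStdGen)

main :: IO ()
main = do
  -- Two inputs, one hidden layer of two sigmoid neurons, one output.
  let l   = LayerDefinition sigmoidNeuron 2 connectFully
      l'  = LayerDefinition sigmoidNeuron 2 connectFully
      l'' = LayerDefinition sigmoidNeuron 1 connectFully
  g <- newStdGen
  -- Weights are drawn from a normal distribution via the normals transform.
  let n = createNetwork normals g [l, l', l''] :: Network Double
  return ()
```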

loadNetwork :: (Binary (ShowableLayer a), Floating a, Floating (Vector a), Container Vector a) => FilePath -> [LayerDefinition a] -> IO (Network a)

Given a filename and a list of layer definitions, reads the saved weights and biases from the file and re-expands them into a network.

predict :: (Floating (Vector a), Container Vector a, Product a) => Vector a -> Network a -> Vector a

predict folds over each layer of the network, seeding the accumulator with the input vector, and returns the output of the final layer. It operates on whatever network you pass in.
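A usage sketch, assuming a network `n :: Network Double` with a two-neuron input layer built via createNetwork, and `fromList` from hmatrix's `Numeric.LinearAlgebra`:

```haskell
-- Feed a two-element input vector through every layer of n;
-- the result is the activation vector of the output layer.
output :: Vector Double
output = predict (fromList [0, 1]) n
```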

apply :: (Floating (Vector a), Container Vector a, Product a) => Vector a -> Layer a -> Vector a

The helper used in predict's fold: it pushes an input vector through a single layer of the network and applies that layer's activation function.

saveNetwork :: (Binary (ShowableLayer a), Floating a, Floating (Vector a), Container Vector a) => FilePath -> Network a -> IO ()

Given a filename and a network, saves the network's weights and biases to the file for later use.
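A save/load round trip might be sketched as follows. Note that only weights and biases are persisted, so the layer definitions passed to loadNetwork must describe the same architecture the saved network was built with (the filename "lambdanet.ann" is an arbitrary example):

```haskell
-- Persist a trained network to disk, then restore it.
roundTrip :: Network Double -> [LayerDefinition Double] -> IO (Network Double)
roundTrip n layerDefs = do
  saveNetwork "lambdanet.ann" n          -- weights and biases only
  loadNetwork "lambdanet.ann" layerDefs  -- re-expand using the definitions
```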