LambdaNet - A configurable and extensible neural network library

Safe Haskell: None




data LayerDefinition a Source

The LayerDefinition type is an intermediate type initialized by the library user to define the different layers of the network.



data Layer a Source

The Layer type, which stores the weight matrix, the bias vector, and a neuron type.



data ShowableLayer a Source

We define a new type so that networks can be serialized and stored.




weights :: Matrix a
biases :: Vector a


(Show a, Element a) => Show (ShowableLayer a) 
(Element a, Binary a) => Binary (ShowableLayer a)

We want ShowableLayer to be packable in the binary format, so we define it as an instance of Binary.

type Connectivity a = Int -> Int -> Matrix a Source

Connectivity is the type alias for a function that defines the connectivity matrix between two layers (fully connected, convolutionally connected, etc.).

type RandomTransform a = [a] -> [a] Source

A random transformation type alias: a function that takes an infinite list of uniformly distributed random numbers and returns a list distributed according to the target distribution.
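As an illustration of the alias, here is a minimal hypothetical transform (not part of the library) that recenters uniforms from (0, 1] onto (-0.5, 0.5]:

```haskell
-- The alias: a function from an infinite uniform stream to a
-- stream on some other distribution.
type RandomTransform a = [a] -> [a]

-- Hypothetical example: shift each uniform down by 0.5.
-- map is lazy, so this works on infinite lists.
centered :: Floating a => RandomTransform a
centered = map (subtract 0.5)
```

Because the transform is just a lazy list function, `take 3 (centered [0.1, 0.5, 0.9])` yields values near [-0.4, 0.0, 0.4].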

layerToShowable :: (Floating (Vector a), Container Vector a, Floating a) => Layer a -> ShowableLayer a Source

We want to be able to convert between layers and showable layers, and vice versa.

showableToLayer :: (Floating (Vector a), Container Vector a, Floating a) => (ShowableLayer a, LayerDefinition a) -> Layer a Source

To go from a showable to a layer, we also need a neuron type, an unfortunate restriction owing to Haskell's inability to serialize functions.

createLayer :: (RandomGen g, Random a, Floating (Vector a), Container Vector a, Floating a) => RandomTransform a -> g -> LayerDefinition a -> LayerDefinition a -> Layer a Source

The createLayer function takes in a random transformation on an infinite stream of uniformly generated numbers, a source of entropy, and two layer definitions, one for the previous layer and one for the next layer. It returns a layer defined by the Layer type -- a weight matrix, a bias vector, and a neuron type.

connectFully :: Int -> Int -> Matrix Float Source

The connectFully function takes the number of input neurons for a layer, i, and the number of output neurons of a layer, j, and returns an i x j connectivity matrix for a fully connected network.
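The library's version returns an hmatrix Matrix; the same idea can be sketched with plain lists of lists (an illustration only, not the library's implementation):

```haskell
-- Sketch of a fully connected connectivity "matrix" as a list of lists.
-- Every entry is 1, meaning each of the i inputs connects to each
-- of the j outputs.
connectFullyLists :: Int -> Int -> [[Float]]
connectFullyLists i j = replicate i (replicate j 1)
```

A sparser Connectivity function would instead place zeros wherever two neurons should not be connected.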

randomList :: (RandomGen g, Random a, Floating a) => RandomTransform a -> g -> [a] Source

Initialize an infinite random list given a random transform and a source of entropy.

boxMuller :: Floating a => a -> a -> (a, a) Source

Defines a transformation on the uniform distribution to generate normally distributed numbers (the Box-Muller transform).
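A textbook Box-Muller implementation matching this signature might look like the following (a sketch; the library's definition may differ in detail). Given two independent uniforms u1, u2 in (0, 1], it produces two independent standard normals:

```haskell
-- Box-Muller transform:
--   z0 = sqrt (-2 ln u1) * cos (2 pi u2)
--   z1 = sqrt (-2 ln u1) * sin (2 pi u2)
boxMuller :: Floating a => a -> a -> (a, a)
boxMuller u1 u2 = (r * cos theta, r * sin theta)
  where
    r     = sqrt (-2 * log u1)
    theta = 2 * pi * u2
```

Note that u1 must be strictly positive so that `log u1` is defined, which is why uniforms are drawn from (0, 1] rather than [0, 1).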

normals :: Floating a => [a] -> [a] Source

This is a function of type RandomTransform that transforms a list of uniformly distributed numbers to a list of normally distributed numbers.
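Such a function can be sketched by pairing up the input uniforms and applying the Box-Muller step to each pair (the Box-Muller math is inlined here so the snippet is self-contained; the library's definition may differ):

```haskell
-- Sketch of a RandomTransform producing standard normals:
-- consume uniforms two at a time, emit two normals per pair.
-- Laziness means this works on an infinite input list.
normals :: Floating a => [a] -> [a]
normals (u1:u2:rest) = z0 : z1 : normals rest
  where
    r  = sqrt (-2 * log u1)
    z0 = r * cos (2 * pi * u2)
    z1 = r * sin (2 * pi * u2)
normals _ = []
```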

uniforms :: Floating a => [a] -> [a] Source

A non-transformation that returns its list of uniformly distributed numbers unchanged; it exists for naming consistency with the other transforms. The numbers lie in the range (0, 1].

boundedUniforms :: Floating a => (a, a) -> [a] -> [a] Source

An affine transformation that returns a list of uniformly distributed numbers on the range (a, b].
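The affine step is a sketchable one-liner (an illustration matching the signature above, not necessarily the library's exact code): each uniform u in (0, 1] is mapped to a + (b - a) * u, which lies in (a, b].

```haskell
-- Rescale uniforms on (0, 1] onto (a, b] with the affine map
-- u |-> a + (b - a) * u.
boundedUniforms :: Floating a => (a, a) -> [a] -> [a]
boundedUniforms (a, b) = map (\u -> a + (b - a) * u)
```

For example, `boundedUniforms (2, 4) [0.5, 1.0]` yields values near [3.0, 4.0].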