| Safe Haskell | None |
|---|---|
| Language | Haskell98 |
- data Network a = Network {}
- type TrainingData a = (Vector a, Vector a)
- createNetwork :: (RandomGen g, Random a, Floating a, Floating (Vector a), Container Vector a) => RandomTransform a -> g -> [LayerDefinition a] -> Network a
- loadNetwork :: (Binary (ShowableLayer a), Floating a, Floating (Vector a), Container Vector a) => FilePath -> [LayerDefinition a] -> IO (Network a)
- predict :: (Floating (Vector a), Container Vector a, Product a) => Vector a -> Network a -> Vector a
- apply :: (Floating (Vector a), Container Vector a, Product a) => Vector a -> Layer a -> Vector a
- saveNetwork :: (Binary (ShowableLayer a), Floating a, Floating (Vector a), Container Vector a) => FilePath -> Network a -> IO ()
Documentation
Networks are constructed front to back. Start by adding an input layer, then each hidden layer, and finally an output layer.
type TrainingData a = (Vector a, Vector a) Source
A tuple of (input vector, expected output vector) representing one training sample.
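As an illustration, here is a toy XOR training set. Plain lists stand in for hmatrix's `Vector` so the sketch runs on its own; the real `TrainingData` pairs two `Vector a` values.

```haskell
-- Toy XOR training set; plain lists stand in for hmatrix's Vector here,
-- so this sketch runs without the library. Each pair is (input, expected).
type ToyTrainingData = ([Double], [Double])

xorData :: [ToyTrainingData]
xorData =
  [ ([0, 0], [0])
  , ([0, 1], [1])
  , ([1, 0], [1])
  , ([1, 1], [0])
  ]

main :: IO ()
main = mapM_ print xorData
```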
createNetwork :: (RandomGen g, Random a, Floating a, Floating (Vector a), Container Vector a) => RandomTransform a -> g -> [LayerDefinition a] -> Network a Source
The createNetwork function takes in a random transform used for weight initialization, a source of entropy, and a list of layer definitions, and returns a network with the weights initialized per the random transform.
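The role of the random transform can be sketched with a self-contained toy. A hand-rolled linear congruential generator stands in for the `RandomGen` entropy source, and the names `uniformToSymmetric` and `initWeights` are illustrative assumptions, not the library's API:

```haskell
-- Toy sketch of transform-driven weight initialization. A hand-rolled
-- linear congruential generator stands in for the RandomGen entropy
-- source so the example runs without extra packages.
lcgDoubles :: Int -> [Double]
lcgDoubles seed =
  map (\x -> fromIntegral x / 2147483648) (tail (iterate step seed))
  where step x = (1103515245 * x + 12345) `mod` 2147483648

-- Stand-in for a RandomTransform a: reshape a uniform [0,1) sample.
uniformToSymmetric :: Double -> Double
uniformToSymmetric x = 2 * x - 1          -- map [0,1) onto [-1,1)

-- Draw n initial weights by mapping the transform over the raw samples,
-- mirroring how createNetwork applies its transform during initialization.
initWeights :: (Double -> Double) -> Int -> Int -> [Double]
initWeights transform seed n = map transform (take n (lcgDoubles seed))

main :: IO ()
main = print (initWeights uniformToSymmetric 42 3)
```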
loadNetwork :: (Binary (ShowableLayer a), Floating a, Floating (Vector a), Container Vector a) => FilePath -> [LayerDefinition a] -> IO (Network a) Source
Given a file path and a list of layer definitions, deserialize the stored layer data and expand it back into a full network, recombining the saved weights with the layer definitions.
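The reason loading takes the layer definitions again is that activation functions cannot be serialized, so only the numeric state lives on disk. A self-contained toy of that split (the real library serializes `ShowableLayer`s via `Binary`; `show`/`read` stands in here, and all `Toy*` names are illustrative):

```haskell
-- Toy of the save/load split: only numeric state goes to disk.
-- Activation functions are not serializable, so loading takes the
-- layer definitions again and recombines them with the stored weights.
data ToyDef = ToyDef { activation :: Double -> Double }
data ToyNet = ToyNet { defs :: [ToyDef], weightRows :: [[Double]] }

saveToy :: FilePath -> ToyNet -> IO ()
saveToy path net = writeFile path (show (weightRows net))

loadToy :: FilePath -> [ToyDef] -> IO ToyNet
loadToy path layerDefs = do
  contents <- readFile path
  return (ToyNet layerDefs (read contents))

main :: IO ()
main = do
  let net = ToyNet [ToyDef id] [[0.1, 0.2], [0.3, 0.4]]
  saveToy "toy-net.txt" net
  restored <- loadToy "toy-net.txt" [ToyDef id]
  print (weightRows restored)
```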
predict :: (Floating (Vector a), Container Vector a, Product a) => Vector a -> Network a -> Vector a Source
Predict folds over the layers of the network, using the input vector as the initial accumulator value; each step applies one layer to the running vector, and the final accumulator is the network's output. It works on any network, regardless of how its layers were defined.
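The fold can be sketched self-containedly. Lists stand in for hmatrix `Vector`s and a pair of weight rows plus an activation function stands in for `Layer` (the real `apply` does a matrix-vector product); the `Toy*` names are illustrative:

```haskell
import Data.List (foldl')

-- Lists stand in for hmatrix Vectors so this sketch runs on its own;
-- a ToyLayer is just weight rows plus an activation function.
data ToyLayer = ToyLayer { rows :: [[Double]], fire :: Double -> Double }

-- Stand-in for apply: per output row, weight the input, sum, activate.
applyToy :: [Double] -> ToyLayer -> [Double]
applyToy v (ToyLayer ws f) = map (f . sum . zipWith (*) v) ws

-- Stand-in for predict: fold the input vector through every layer,
-- the running vector being the accumulator.
predictToy :: [Double] -> [ToyLayer] -> [Double]
predictToy = foldl' applyToy

main :: IO ()
main = print (predictToy [1, 2] [ToyLayer [[0.5, 0.5], [1, 0]] id])
-- prints [1.5,1.0]
```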