| Portability | portable |
|---|---|
| Stability | experimental |
| Maintainer | olivier.boudry@gmail.com |

The Fast Artificial Neural Network Library (FANN) is a free open source neural network library written in C with support for both fully connected and sparsely connected networks (http://leenissen.dk/fann/).

HFANN is a Haskell interface to this library.

- `withStandardFann :: [Int] -> (FannPtr -> IO a) -> IO a`
- `withSparseFann :: Float -> [Int] -> (FannPtr -> IO a) -> IO a`
- `withShortcutFann :: [Int] -> (FannPtr -> IO a) -> IO a`
- `randomizeWeights :: FannPtr -> (FannType, FannType) -> IO ()`
- `initWeights :: FannPtr -> TrainDataPtr -> IO ()`
- `runFann :: FannPtr -> [FannType] -> IO [FannType]`
- `printConnections :: FannPtr -> IO ()`
- `printParameters :: FannPtr -> IO ()`
- `getOutputNodesCount :: FannPtr -> IO Int`
- `getInputNodesCount :: FannPtr -> IO Int`
- `getTotalNodesCount :: FannPtr -> IO Int`
- `getConnectionsCount :: FannPtr -> IO Int`

# ANN Creation

Create a new standard fully connected Neural Network and call the given function with the ANN as argument. The Neural Network is destroyed when the function returns.

The structure of the ANN is given by the first parameter: an `Int` list giving the number of nodes per layer, from the input layer to the output layer.

Example: `[2,3,1]` describes an ANN with 2 nodes in the input layer, one hidden layer of 3 nodes, and 1 node in the output layer.

The function provided as second argument will be called with the created ANN as parameter.
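As a minimal usage sketch (assuming the package's functions are exported from a module named `HFANN`; the module name is an assumption, not confirmed by this document):

```haskell
-- Hypothetical sketch: requires the hfann package and the FANN C library.
import HFANN

main :: IO ()
main =
  -- [2,3,1]: 2 input nodes, one hidden layer of 3 nodes, 1 output node.
  -- The ANN only exists inside this bracket; it is destroyed on exit.
  withStandardFann [2, 3, 1] $ \ann ->
    printParameters ann
```

The bracket style mirrors `withFile`: resource creation and destruction are paired automatically, so the `FannPtr` cannot leak even if the inner action throws.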

Create a new sparse (not fully connected) Neural Network and call the given function with the ANN as argument. The ANN is destroyed when the function returns.

| Argument | Description |
|---|---|
| `Float` | The ratio of connections |
| `[Int]` | The ANN structure |
| `(FannPtr -> IO a)` | A function using the ANN |
| `IO a` | The return value |

Create a new sparse (not fully connected) Neural Network with shortcut connections between layers and call the given function with the ANN as argument. The Neural Network is destroyed when the function returns.
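A hedged sketch of the sparse and shortcut variants (again assuming a module named `HFANN`; that name is a guess):

```haskell
-- Hypothetical sketch: requires the hfann package and the FANN C library.
import HFANN

main :: IO ()
main = do
  -- Sparse network keeping only 50% of the possible connections.
  withSparseFann 0.5 [2, 4, 1] $ \ann ->
    printConnections ann
  -- Shortcut network: layers may connect past their immediate neighbors.
  withShortcutFann [2, 4, 1] $ \ann ->
    printConnections ann
```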

# ANN Initialization

Randomize the weights to values in the given range.

Weights in a newly created ANN are already initialized to random values. Use this function if you want to customize the upper and lower bounds of the random weights.

Initialize the weights using the algorithm of Widrow and Nguyen.

| Argument | Description |
|---|---|
| `FannPtr` | The ANN |
| `TrainDataPtr` | The training data used to calibrate the weights |
| `IO ()` | |

This function behaves similarly to fann_randomize_weights. It uses the algorithm developed by Derrick Nguyen and Bernard Widrow to set the weights in such a way as to speed up training. This technique is not always successful, and in some cases can be less efficient than a purely random initialization.

The algorithm requires access to the range of the input data (i.e., the largest and smallest inputs), and therefore takes a second argument, the training data that will be used to train the network.
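A sketch of custom weight randomization (module name `HFANN` is an assumption; `initWeights` is not shown because obtaining a `TrainDataPtr` is not covered in this section):

```haskell
-- Hypothetical sketch: requires the hfann package and the FANN C library.
import HFANN

main :: IO ()
main =
  withStandardFann [2, 3, 1] $ \ann -> do
    -- Re-randomize the weights into a custom range instead of
    -- keeping the library's default random initialization.
    randomizeWeights ann (-0.1, 0.1)
    printConnections ann
```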

# ANN Use

Run the trained Neural Network on the provided input.
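A sketch of running a network (module name `HFANN` is an assumption). The input list must have one element per input node, and the result has one element per output node:

```haskell
-- Hypothetical sketch: requires the hfann package and the FANN C library.
import HFANN

main :: IO ()
main =
  withStandardFann [2, 3, 1] $ \ann -> do
    -- Two inputs for the two input nodes; the result list has a
    -- single element because the output layer has one node.
    out <- runFann ann [1.0, 0.0]
    print out
```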

`printConnections :: FannPtr -> IO ()`

Print the ANN connections.

`printParameters :: FannPtr -> IO ()`

Print the ANN parameters.

# ANN Information

Return the number of output nodes of the Neural Network.

Return the number of input nodes of the Neural Network.

Return the total number of nodes of the Neural Network.

Return the total number of connections of the Neural Network.
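The information functions can be sketched together (module name `HFANN` is an assumption; the exact node and connection totals depend on how FANN counts bias nodes, so no expected values are asserted):

```haskell
-- Hypothetical sketch: requires the hfann package and the FANN C library.
import HFANN

main :: IO ()
main =
  withStandardFann [2, 3, 1] $ \ann -> do
    inputs  <- getInputNodesCount ann   -- 2 for a [2,3,1] network
    outputs <- getOutputNodesCount ann  -- 1 for a [2,3,1] network
    total   <- getTotalNodesCount ann   -- may include bias nodes
    conns   <- getConnectionsCount ann
    print (inputs, outputs, total, conns)
```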