hfann-0.2: Haskell binding to the FANN library

Portability: portable
Stability: experimental
Maintainer: olivier.boudry@gmail.com

HFANN.Base

Description

The Fast Artificial Neural Network Library (FANN) is a free open source neural network library written in C with support for both fully connected and sparsely connected networks (http://leenissen.dk/fann/).

HFANN is a Haskell interface to this library.

ANN Creation

withStandardFann
  :: [Int]             -- ^ The ANN structure
  -> (FannPtr -> IO a) -- ^ A function using the ANN
  -> IO a              -- ^ The return value

Create a new standard, fully connected Neural Network, call the given function with the ANN as its argument, and destroy the Neural Network when the function returns.

The structure of the ANN is given by the first parameter. It is an Int list giving the number of nodes per layer, from input layer to output layer.

Example: [2,3,1] describes an ANN with 2 nodes in the input layer, one hidden layer of 3 nodes, and 1 node in the output layer.

The function provided as the second argument is called with the created ANN as its parameter.
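For example, a minimal sketch of the bracket pattern, using only functions documented on this page (the network is freshly created and untrained):

    import HFANN.Base

    main :: IO ()
    main =
      withStandardFann [2, 3, 1] $ \ann -> do
        printParameters ann   -- dump the ANN parameters
        printConnections ann  -- dump the (randomly initialized) connections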

withSparseFann
  :: Float             -- ^ The ratio of connections
  -> [Int]             -- ^ The ANN structure
  -> (FannPtr -> IO a) -- ^ A function using the ANN
  -> IO a              -- ^ The return value

Create a new sparse (not fully connected) Neural Network, call the given function with the ANN as its argument, and destroy the ANN when the function returns. The connection ratio controls how many of the possible connections are created; a ratio of 1.0 would give a fully connected network.
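A short sketch of the same bracket pattern; the 0.5 ratio is an arbitrary choice for illustration:

    import HFANN.Base

    sparseDemo :: IO ()
    sparseDemo =
      withSparseFann 0.5 [2, 3, 1] $ \ann ->
        printConnections ann  -- fewer connections than the standard network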

withShortcutFann
  :: [Int]             -- ^ The ANN structure
  -> (FannPtr -> IO a) -- ^ A function using the ANN
  -> IO a              -- ^ The return value

Create a new sparse (not fully connected) Neural Network with shortcut connections between layers, call the given function with the ANN as its argument, and destroy the Neural Network when the function returns.
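One way to see how the three constructors differ is to compare their connection counts, using getConnectionsCount from the ANN Information section below; a sketch:

    import HFANN.Base

    compareTopologies :: IO ()
    compareTopologies = do
      full   <- withStandardFann [2, 3, 1] getConnectionsCount
      sparse <- withSparseFann 0.5 [2, 3, 1] getConnectionsCount
      short  <- withShortcutFann [2, 3, 1] getConnectionsCount
      mapM_ print [full, sparse, short]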

ANN Initialization

randomizeWeights
  :: FannPtr              -- ^ The ANN
  -> (FannType, FannType) -- ^ Min and max bounds for weight initialization
  -> IO ()

Randomize the weights to values in the given range.

Weights in a newly created ANN are already initialized to random values. Use this function if you want to customize the lower and upper bounds of those random weights.
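For instance, a sketch that re-seeds the weights in a narrower range than the default:

    import HFANN.Base

    smallWeightsDemo :: IO ()
    smallWeightsDemo =
      withStandardFann [2, 3, 1] $ \ann -> do
        randomizeWeights ann (-0.1, 0.1)  -- re-seed all weights in [-0.1, 0.1]
        printConnections ann              -- the dump reflects the new weights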

initWeights
  :: FannPtr      -- ^ The ANN
  -> TrainDataPtr -- ^ The training data used to calibrate the weights
  -> IO ()

Initialize the weights using Widrow and Nguyen's algorithm.

This function behaves similarly to fann_randomize_weights. It uses the algorithm developed by Derrick Nguyen and Bernard Widrow to set the weights in a way that speeds up training. The technique is not always successful and in some cases can be less efficient than purely random initialization.

The algorithm needs the range of the input data (i.e., the largest and smallest input values), so it takes a second argument: the training data that will be used to train the network.
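A sketch of the call; obtaining a TrainDataPtr is not covered on this page, so the loader withTrainData and the file xor.data below are hypothetical placeholders, not part of the API shown here:

    import HFANN.Base

    initDemo :: IO ()
    initDemo =
      withStandardFann [2, 3, 1] $ \ann ->
        withTrainData "xor.data" $ \td ->  -- hypothetical loader yielding a TrainDataPtr
          initWeights ann td               -- Nguyen-Widrow init from td's input range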

ANN Use

runFann
  :: FannPtr       -- ^ The ANN
  -> [FannType]    -- ^ A list of inputs
  -> IO [FannType] -- ^ A list of outputs

Run the trained Neural Network on the provided input.
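A sketch (training omitted, so the weights are still the random initial ones and the output is meaningless beyond showing the plumbing):

    import HFANN.Base

    runDemo :: IO ()
    runDemo =
      withStandardFann [2, 3, 1] $ \ann -> do
        out <- runFann ann [1, 0]  -- one value per input node
        print out                  -- one value per output node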

printConnections :: FannPtr -> IO ()

Print the ANN connections

printParameters :: FannPtr -> IO ()

Print the ANN parameters

ANN Information

getOutputNodesCount
  :: FannPtr -- ^ The ANN
  -> IO Int  -- ^ The number of output nodes

Return the number of output nodes of the Neural Network

getInputNodesCount
  :: FannPtr -- ^ The ANN
  -> IO Int  -- ^ The number of input nodes

Return the number of input nodes of the Neural Network

getTotalNodesCount
  :: FannPtr -- ^ The ANN
  -> IO Int  -- ^ The number of nodes

Return the total number of nodes of the Neural Network

getConnectionsCount
  :: FannPtr -- ^ The ANN
  -> IO Int  -- ^ The number of connections

Return the total number of connections of the Neural Network
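A sketch tying the four queries together; note that the total node count typically also includes FANN's bias nodes, so it can exceed the sum of the layer sizes given at creation:

    import HFANN.Base

    describeFann :: IO ()
    describeFann =
      withStandardFann [2, 3, 1] $ \ann -> do
        inputs  <- getInputNodesCount ann   -- 2 for this structure
        outputs <- getOutputNodesCount ann  -- 1 for this structure
        total   <- getTotalNodesCount ann   -- all nodes, bias nodes included
        conns   <- getConnectionsCount ann
        putStrLn $ "in=" ++ show inputs ++ " out=" ++ show outputs
                ++ " total=" ++ show total ++ " conns=" ++ show conns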