AI.HNN.FF.Network

Copyright   : (c) 2012 Alp Mestanogullari
License     : BSD3
Maintainer  : alpmestan@gmail.com
Stability   : experimental
Portability : GHC
Safe Haskell: None

Samples a
    List of 'Sample's.

Sample a
    Input vector and expected output vector.

ActivationFunctionDerivative a
    The type of an activation function's derivative, mostly used for clarity in signatures.

ActivationFunction a
    The type of an activation function, mostly used for clarity in signatures.

Network a
    Our feed-forward neural network type. Note the 'Binary' instance, which means you can use 'encode' and 'decode' in case you need to serialize your neural nets somewhere other than a file (e.g. over the network).

matrices
    The weight matrices of the network.

createNetwork
    Creates a neural network with n inputs; if l is [n1, n2, ...], the net will have n1 neurons on the first layer, n2 neurons on the second, and so on, ending with k neurons on the output layer. The weight matrices are initialized randomly, courtesy of System.Random.MWC:

        createNetwork n l k

fromWeightMatrices
    Creates a neural network with exactly the weight matrices given as input. We don't check that the numbers of rows/columns are compatible, etc.

output
    Computes the output of the network on the given input vector with the given activation function.

outputs
    Computes and keeps the output of all the layers of the neural network with the given activation function.

(-->)
    Handy operator to describe your learning set, avoiding unnecessary parentheses. It's just a synonym for '(,)'. Generally you'll load your learning set from a file, a database or something like that, but it can be nice for quickly playing with hnn or for simple problems where you specify your learning set by hand. That is, instead of writing:

        samples :: Samples Double
        samples = [ (fromList [0, 0], fromList [0])
                  , (fromList [0, 1], fromList [1])
                  , (fromList [1, 0], fromList [1])
                  , (fromList [1, 1], fromList [0])
                  ]

    you can write:

        samples :: Samples Double
        samples = [ fromList [0, 0] --> fromList [0]
                  , fromList [0, 1] --> fromList [1]
                  , fromList [1, 0] --> fromList [1]
                  , fromList [1, 1] --> fromList [0]
                  ]

trainUntil
    Generic training function.

    The first argument is a predicate that tells the backpropagation algorithm when to stop. The first argument to the predicate is the epoch, i.e. the number of times backprop has been executed on the samples. The second argument is the current network, and the third is the list of samples. You can combine these arguments to create your own stopping criterion.

    For example, if you want to stop learning either when the network's quadratic error on the samples, using the tanh function, is below 0.01, or after 1000 epochs, whichever comes first, you could use the following predicate:

        pred epochs net samples = epochs == 1000 || quadError tanh net samples < 0.01

    You could even use 'trace' to print the error, to watch how it evolves while the network is learning, or redirect the output to a file from your shell in order to plot it later.

    The second argument (after the predicate) is the learning rate. Then come the activation function you want, its derivative, the initial neural network, and your training set. Note that we provide 'trainNTimes' and 'trainUntilErrorBelow' for the common use cases.

trainNTimes
    Trains the neural network with backpropagation the number of times specified by the 'Int' argument, using the given learning rate (second argument).
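Putting the functions above together, here is a minimal end-to-end sketch that learns XOR: it builds a 2-[2]-1 network with 'createNetwork', trains it with 'trainNTimes', and evaluates it with 'output'. The import of Numeric.LinearAlgebra (as the source of fromList and the vector type) is an assumption about hnn's hmatrix backend; adjust it to your setup.

    module Main where

    import AI.HNN.FF.Network
    import Numeric.LinearAlgebra  -- assumed backend providing fromList

    -- The XOR learning set from the (-->) example above.
    samples :: Samples Double
    samples = [ fromList [0, 0] --> fromList [0]
              , fromList [0, 1] --> fromList [1]
              , fromList [1, 0] --> fromList [1]
              , fromList [1, 1] --> fromList [0]
              ]

    main :: IO ()
    main = do
      -- 2 inputs, one hidden layer of 2 neurons, 1 output neuron.
      net <- createNetwork 2 [2] 1
      -- 1000 epochs of backpropagation with a learning rate of 0.8,
      -- using tanh and its derivative tanh'.
      let net' = trainNTimes 1000 0.8 tanh tanh' net samples
      -- Print the trained network's output for each input vector.
      mapM_ (print . output net' tanh . fst) samples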
quadError
    Quadratic error on the given training set using the given activation function. Useful for writing your own predicates for 'trainUntil'.

trainUntilErrorBelow
    Trains the neural network until the quadratic error ('quadError') comes below the given value (first argument), using the given learning rate (second argument).

    Note: this can loop pretty much forever when you're using a bad architecture for the problem, or inappropriate activation functions.

sigmoid
    The sigmoid function: 1 / (1 + exp (-x)).

sigmoid'
    Derivative of the sigmoid function: sigmoid x * (1 - sigmoid x).

tanh'
    Derivative of the 'tanh' function from the Prelude.

loadNetwork
    Loads a neural network from a file (uses zlib compression on top of serialization with the binary package). Will throw an exception if the file isn't there.

saveNetwork
    Saves a neural network to a file (uses zlib compression on top of serialization with the binary package).
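To tie the training and serialization helpers together, here is a sketch that trains with 'trainUntil' using the stopping predicate shown earlier, then saves and reloads the result. This is illustrative only: the FilePath-first argument order for 'saveNetwork' and 'loadNetwork' and the file name "xor.nn" are assumptions, and 'samples' is the XOR learning set from the first example.

    import AI.HNN.FF.Network
    import Numeric.LinearAlgebra  -- assumed backend providing fromList

    trainAndSave :: Samples Double -> IO ()
    trainAndSave samples = do
      net <- createNetwork 2 [2] 1
      -- Stop below a quadratic error of 0.01, or after 1000 epochs,
      -- whichever comes first (the predicate from the trainUntil docs).
      let stop epochs n ss = epochs == 1000 || quadError tanh n ss < 0.01
          trained = trainUntil stop 0.8 tanh tanh' net samples
      -- Round-trip through the zlib-compressed binary representation.
      -- (FilePath-first argument order is assumed here.)
      saveNetwork "xor.nn" trained
      trained' <- loadNetwork "xor.nn" :: IO (Network Double)
      print (output trained' tanh (fromList [0, 1]))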