instinct-0.1.0
Maintainer: Ertugrul Soeylemez <es@ertes.de>

AI.Instinct.ConnMatrix

  ConnVector (CV, getCV)
      A connection vector contains the incoming weights.

  ConnMatrix (CM, getCM)
      A connection matrix is essentially a two-dimensional array of
      synaptic weights.

  addLayer s1 n1 s2 n2
      Overwrite the n1 nodes starting from s1 to be fully connected
      with random weights to the n2 nodes starting from s2.

  buildLayered
      Build a layered connection matrix, where adjacent layers are
      fully connected.

  buildRandom
      Build a completely random connection matrix with the given edge
      length. The random values will be between -1 and 1 exclusive.

  buildZero
      Build a zero connection matrix. It will represent a completely
      disconnected network, where all nodes are isolated.

  cmAdd
      Add two connection matrices. Note that this function is
      left-biased in that it will adopt the connectivity of the first
      connection matrix. You may want to use the Monoid instance
      instead of this function.

  cmDests
      Strictly fold over the outputs, including zeroes.

  cmFold
      Strictly fold over the nonzero inputs of a node.

  cmMap
      Map over the inputs of a node.

  cmSize
      Edge length of a connection matrix.

  random1
      Returns a random number between -1 and 1 exclusive.

AI.Instinct.Activation

  Activation
      Activation functions.

  LogisticAct
      Logistic activation.

  actFunc
      Apply an activation function.

  actDeriv
      Apply the derivative of an activation function.

  logistic
      This is the logistic activation function.

AI.Instinct.Brain

  Pattern
      A signal pattern.

  NetInit
      Network builder configuration. See InitMLP.

  InitMLP
      Recipe for a multi-layer perceptron. This is a neural network,
      which is made up of neuron layers, where adjacent layers are
      (in this case fully) connected.

      mlpActFunc:  network's activation function.
      mlpLayers:   layer sizes from input to output.

  Brain
      A Brain value is an artificial neural network.

      brainAct:      activation function.
      brainConns:    connection matrix.
      brainInputs:   number of input neurons.
      brainOutputs:  number of output neurons.

  activation
      Feeds the given input vector into the network and calculates
      the activation vector.

  buildNet
      Build a random neural network from the given description.

  listPat
      Construct a pattern vector from a list.

  netInput
      Calculate the net input vector, i.e. the values just before
      applying the activation function.

  netInputFrom
      Calculate the net input vector from the given activation
      vector.

  patError
      The total discrepancy between the two given patterns. Can be
      used to calculate the total network error.

  runNet
      Pass the given input pattern through the given neural network
      and return its output.

  runNetList
      Convenience wrapper around runNet using lists instead of
      vectors. If you care for performance, use runNet.

AI.Instinct.Train.Delta

  TrainPat
      A training pattern is a tuple of an input pattern and an
      expected output pattern.

  learnPat
      Calculate the weight deltas and the total error for a single
      pattern. The second argument specifies the learning rate.

  totalError
      Calculate the total error of a neural network with respect to
      the given list of training patterns.

  tpList
      Convenience function: Construct a training pattern from an
      input and output vector.

  train
      Non-atomic version of trainAtomic. Will adjust the weights for
      each pattern instead of at the end of the epoch.

  trainAtomic
      Train a list of patterns with the specified learning rate. This
      will adjust the weights at the end of the epoch. Returns an
      updated neural network and the new total error.

  trainPat
      Train a single pattern. The second argument specifies the
      learning rate.
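The relationship between a connection vector, the net input, and the activation vector can be sketched in a few lines of plain Haskell. This is only an illustration of the computation described above (weighted sum, then activation function), using lists instead of the library's vector and matrix types; the names netInput1 and layerAct are illustrative and are not part of the library's API.

```haskell
-- The logistic activation function: 1 / (1 + e^(-x)).
logistic :: Double -> Double
logistic x = 1 / (1 + exp (negate x))

-- Net input of a single node: the weighted sum of the previous
-- layer's activations over the node's incoming weights (its
-- "connection vector").
netInput1 :: [Double] -> [Double] -> Double
netInput1 ws xs = sum (zipWith (*) ws xs)

-- Activation vector of a layer: apply the activation function to
-- each node's net input. Each row of conns holds one node's
-- incoming weights.
layerAct :: [[Double]] -> [Double] -> [Double]
layerAct conns xs = map (logistic . (`netInput1` xs)) conns

main :: IO ()
main = print (layerAct [[1, -1], [0, 0]] [2, 2])
-- Both nodes have net input 0, so both activations are 0.5.
```

A multi-layer perceptron then simply chains such layer steps, feeding each layer's activation vector into the next.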