LambdaNet-0.1.0.1

Network.Neuron
--------------

Neuron (fields: activation, activation')
  Using this structure allows users of the library to create their own
  neurons by writing two functions - an activation function and its
  derivative - and packaging them up into a neuron type. (A sketch of this
  pattern appears after the Network.Layer section below.)

  Our provided neuron types: sigmoidNeuron, tanhNeuron, recluNeuron.

sigmoid
  The sigmoid activation function, a standard activation function defined
  on the range (0, 1).

sigmoid'
  The derivative of the sigmoid function can conveniently be computed in
  terms of the sigmoid function itself.

tanh, tanh'
  The hyperbolic tangent activation function is provided by the Prelude,
  so we provide only the derivative here. As with the sigmoid function,
  the derivative of tanh can be computed in terms of tanh.

reclu
  The rectified linear activation function. This is a more "biologically
  accurate" activation function that still retains differentiability.

reclu'
  The derivative of the rectified linear activation function is just the
  sigmoid: the differentiable rectifier used here is the softplus
  ln(1 + e^t), whose derivative e^t / (1 + e^t) is exactly sigmoid t.

Network.Layer
-------------

RandomTransform
  A random transformation type alias: a transformation defined on an
  infinite list of uniformly distributed random numbers, returning a list
  distributed according to the target distribution.

Connectivity
  The type alias for a function that defines the connective matrix for two
  layers (fully connected, convolutionally connected, etc.).

ShowableLayer (fields: weights, biases)
  We have to define a new type to be able to serialize and store networks.

Layer (fields: weightMatrix, biasVector, neuron)
  The Layer type, which stores the weight matrix, the bias vector, and a
  neuron type.

LayerDefinition (fields: neuronDef, neuronCount, connect)
  An intermediate type initialized by the library user to define the
  different layers of the network.

createLayer
  Takes a random transformation on an infinite stream of uniformly
  generated numbers, a source of entropy, and two layer definitions, one
  for the previous layer and one for the next layer. It returns a layer
  defined by the Layer type: a weight matrix, a bias vector, and a neuron
  type.

connectFully
  Takes the number of input neurons for a layer, i, and the number of
  output neurons of a layer, j, and returns an i x j connectivity matrix
  for a fully connected network. (Sketched after this section.)

layerToShowable, showableToLayer
  We want to be able to convert between layers and showable layers in both
  directions. To go from a showable to a layer, we also need a neuron
  type; this is an unfortunate restriction owed to Haskell's inability to
  serialize functions. (Sketched after this section.)

randomList
  Initialize an infinite random list given a random transform and a source
  of entropy.

boxMuller, normals
  Define a transformation on the uniform distribution to generate normally
  distributed numbers in Haskell (the Box-Muller transform). normals is a
  function of type RandomTransform that transforms a list of uniformly
  distributed numbers into a list of normally distributed numbers.
  (Sketched after this section.)

uniforms
  A non-transformation that returns a list of uniformly distributed
  numbers from a list of uniformly distributed numbers; it exists for
  naming consistency. It generates numbers on the range (0, 1].

boundedUniforms
  An affine transformation that returns a list of uniforms on the range
  (a, b].

Binary ShowableLayer
  We want ShowableLayer to be packable in the binary format, so we define
  it as an instance of Binary.
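The neuron pattern described in Network.Neuron can be made concrete with a
small, self-contained sketch. The field names activation and activation'
match the names in the interface; the exact field types and class
constraints are assumptions for illustration, not the library's
definitions.

    -- A minimal sketch of the user-defined-neuron pattern. Types here are
    -- illustrative stand-ins, not LambdaNet's exact definitions.
    type ActivationFunction  a = a -> a
    type ActivationFunction' a = a -> a

    data Neuron a = Neuron
      { activation  :: ActivationFunction a   -- the activation function
      , activation' :: ActivationFunction' a  -- ... and its derivative
      }

    sigmoid :: Floating a => a -> a
    sigmoid t = 1 / (1 + exp (-t))

    -- The derivative, computed in terms of sigmoid itself.
    sigmoid' :: Floating a => a -> a
    sigmoid' t = sigmoid t * (1 - sigmoid t)

    -- Packaging the two functions into a neuron type:
    sigmoidNeuron :: Floating a => Neuron a
    sigmoidNeuron = Neuron sigmoid sigmoid'

A tanh or reclu neuron is built the same way from its own
function/derivative pair.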
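For connectFully, "fully connected" means every input neuron feeds every
output neuron, so the connectivity matrix is simply all ones. A minimal
sketch, using lists of lists in place of the library's matrix type:

    -- Hypothetical list-based stand-in for a Connectivity function: an
    -- i x j matrix of ones, connecting every input to every output.
    connectFully :: Int -> Int -> [[Double]]
    connectFully i j = replicate i (replicate j 1)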
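The layer/showable conversions can be sketched the same way. Note how
showableToLayer needs a neuron supplied from outside: functions cannot be
serialized, so only the weights and biases survive the round trip. All
types here are assumed stand-ins.

    -- Stand-in types; the field names mirror the interface, the field
    -- types do not. The neuron is reduced to a bare function for brevity.
    data Layer a = Layer
      { weightMatrix :: [[a]]
      , biasVector   :: [a]
      , neuron       :: a -> a
      }

    data ShowableLayer a = ShowableLayer
      { weights :: [[a]]
      , biases  :: [a]
      }

    -- Dropping the neuron leaves only serializable numeric data.
    layerToShowable :: Layer a -> ShowableLayer a
    layerToShowable (Layer w b _) = ShowableLayer w b

    -- Going back requires the caller to re-supply the neuron.
    showableToLayer :: ShowableLayer a -> (a -> a) -> Layer a
    showableToLayer (ShowableLayer w b) n = Layer w b n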
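The Box-Muller transform consumes uniform numbers two at a time and emits
two independent standard normals, which makes it a natural RandomTransform
over an infinite list. This sketch assumes inputs on (0, 1], as uniforms
produces, so the logarithm is always defined:

    type RandomTransform a = [a] -> [a]

    -- Box-Muller: two uniforms on (0, 1] become two standard normals.
    boxMuller :: Floating a => a -> a -> (a, a)
    boxMuller u1 u2 = (r * cos t, r * sin t)
      where
        r = sqrt (-2 * log u1)
        t = 2 * pi * u2

    -- 'normals' applies Box-Muller pairwise down the list.
    normals :: Floating a => RandomTransform a
    normals (u1:u2:us) = z1 : z2 : normals us
      where (z1, z2) = boxMuller u1 u2
    normals _ = []

    -- 'boundedUniforms' is the affine map from (0, 1] onto (a, b].
    boundedUniforms :: Num a => (a, a) -> RandomTransform a
    boundedUniforms (a, b) = map (\u -> a + (b - a) * u)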
Network.Network
---------------

TrainingData
  A tuple of (input, expected output).

Network (field: layers)
  Networks are constructed front to back: start by adding an input layer,
  then each hidden layer, and finally an output layer.

createNetwork
  Takes a random transform used for weight initialization, a source of
  entropy, and a list of layer definitions, and returns a network with the
  weights initialized per the random transform.

predict
  Folds over each layer of the network, using the input vector as the
  first value of the accumulator. It operates on whatever network you pass
  in. (Sketched at the end of this document.)

apply
  The function used in the fold in predict: it applies the activation
  function and pushes the input through a layer of the network.

saveNetwork
  Given a filename and a network, saves the weights and biases of the
  network to the file for later use.

loadNetwork
  Given a filename and a list of layer definitions, re-expands the stored
  data back into a network.

Network.Trainer
---------------

Selection
  A selection function for performing gradient descent; it determines how
  the training data is grouped for each update.

CostFunction'
  A CostFunction' (derivative) is used in backpropagation.

CostFunction
  A CostFunction is used for evaluating a network's performance on a given
  input.

Trainer
  A typeclass for all trainer types: a trainer takes an instance of
  itself, a network, and a list of training data, and returns a new
  network trained on the data.

    class Trainer a where
      fit :: (Floating b) => a -> Network b -> [TrainingData b] -> Network b

BackpropTrainer (fields: eta, cost, cost')
  Performs simple backpropagation on a neural network, carrying a learning
  rate (eta) along with a cost function and its derivative. It can be used
  as the basis for more complex trainers, and is declared an instance of
  Trainer:

    instance (Floating a) => Trainer (BackpropTrainer a) where ...

quadraticCost
  The quadratic cost function: (1/2) * sum ((y - a)^2).

quadraticCost'
  The derivative of the quadratic cost function with respect to the
  activations: a - y, componentwise.

minibatch
  Becomes a Selection when partially applied with the minibatch size.
  (Sketched at the end of this document.)

online
  For training the network online, updating after each individual training
  example.

backprop
  Perform backpropagation on a single training data instance.

updateNetwork
  Update the weights and biases of a network given a list of deltas.

updateLayer
  The mapped function that updates the weights and biases in a single
  layer.

outputs
  Scans over each layer of the network and stores the activated results.

inputs
  Performs a similar task to outputs, but returns a list of vectors of
  unactivated inputs.

deltas
  Returns a list of layer deltas.

hiddenDeltas
  Compute the hidden layer deltas.

evaluate
  Use the cost function to determine the error of a network.
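The predict/apply fold from Network.Network can also be sketched with
lists standing in for the library's vector and matrix types. Only the fold
structure comes from the documentation above; every type here is a
hypothetical stand-in.

    -- Illustrative stand-ins for the library's types.
    data Layer a = Layer
      { weightMatrix :: [[a]]
      , biasVector   :: [a]
      , activation   :: a -> a
      }

    newtype Network a = Network { layers :: [Layer a] }

    -- 'apply': weight the inputs, add the bias, then activate.
    apply :: Num a => [a] -> Layer a -> [a]
    apply x (Layer w b f) =
      map f (zipWith (+) [sum (zipWith (*) row x) | row <- w] b)

    -- 'predict': fold 'apply' over the layers, seeding the accumulator
    -- with the input vector.
    predict :: Num a => [a] -> Network a -> [a]
    predict x = foldl apply x . layers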
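The quadratic cost and its derivative are short enough to state directly;
this list-based version assumes the y-then-a argument order used in the
descriptions above:

    -- Quadratic cost: (1/2) * sum ((y - a)^2) over all components.
    quadraticCost :: Floating a => [a] -> [a] -> a
    quadraticCost y a = 0.5 * sum (map (^ 2) (zipWith (-) y a))

    -- Its derivative with respect to the activations: a - y.
    quadraticCost' :: Floating a => [a] -> [a] -> [a]
    quadraticCost' y a = zipWith (-) a y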
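Finally, the remark that minibatch "becomes a Selection when partially
applied with the minibatch size" is plain currying. Assuming a Selection
groups the training set into the batches used for successive updates (the
library's concrete type may differ), the sketch is:

    type TrainingData a = ([a], [a])  -- (input, expected output)
    type Selection a = [TrainingData a] -> [[TrainingData a]]

    -- Partially applying 'minibatch' to a batch size yields a Selection.
    minibatch :: Int -> [TrainingData a] -> [[TrainingData a]]
    minibatch _ [] = []
    minibatch n xs = take n xs : minibatch n (drop n xs)

    -- Online training is then the one-example-at-a-time special case.
    online :: Selection a
    online = minibatch 1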