tensor-safe-0.1.0.1

TensorSafe.Commands.Utils
  errorString: Transforms an InterpreterError into a string.
  say: Lifts putStrLn to the Interpreter.

TensorSafe.Commands.Compile
  compile: Compilation interface for the compile command. Given a path and a
    module name, it invokes the generator to produce the CNetwork for the file
    at the specified path; depending on the out parameter, the output is
    written to stdout or to the given out path.

TensorSafe.Commands.Check
  check: Checks whether the file at the specified path compiles successfully.

TensorSafe.Compile.Expr
  Generator: Class that defines which languages are supported for generating
    CNetworks to text.
  generate: Adds support for a language; generates a CNetwork to Text.
  generateFile: Similar to generate, but also adds the necessary header and
    module lines of text so that the CNetwork can be compiled as a separate
    file.
  LayerGenerator: Defines how the layers are translated to the target
    language; it translates a DLayer to a String for each supported language.
  Python: Support for Python compilation.
  JavaScript: Support for JavaScript compilation.
  CNetwork: Intermediate representation that a network is compiled to before
    code generation (constructors: CNSequence, CNAdd, CNCons, CNLayer,
    CNReturn, CNNil).
  DLayer: Auxiliary data representation of layers (constructors: DActivation,
    DAdd, DBatchNormalization, DConv2D, DDense, DDropout, DFlatten,
    DGlobalAvgPooling2D, DInput, DLSTM, DMaxPooling, DRelu, DZeroPadding2D).
    IMPORTANT: if you add new layer definitions to TensorSafe.Layers, you
    should add the corresponding data structure here for the same layer.
  paramsToJS: Converts a map to a parameter object in JavaScript.
  paramsToPython: Converts a map to keyword arguments in Python.

TensorSafe.Core
  L, R: Wrappers for Nat values, one for a tuple of two Nat values and one for
    a single Nat value.
  TypeEquals', TypeEquals: Compare two types at the kind level; the primed
    variant raises a type error if they do not match.
  ShapeProduct: Multiplies all numbers in a list of natural numbers.

TensorSafe.Layer
  Layer: Class stating that a type is a layer. Each layer can be compiled into
    a specific CNetwork expression, which can later be used to generate code
    for a specific backend. Its methods give the layer type and, given the
    layer and an optional input shape, generate a CNetwork structure.
  InputShape: Auxiliary type for the input-shape parameter.

TensorSafe.Layers.*
  Add: Adds the dimensions of the shapes to a list of values with shape D1.
  BatchNormalization: A classic BatchNormalization layer with axis, momentum
    and epsilon parameters.
  Conv2D: A 2D convolutional layer.
  Dense: A classic Dense, or fully connected, layer with input and output
    parameters.
  Dropout: A Dropout layer with rate and seed arguments.
  Flatten: Flattens the dimensions of the shapes to a list of values with
    shape D1.
  GlobalAvgPooling2D: A GlobalAvgPooling2D function.
  Input: Inputs the dimensions of the shapes to a list of values with shape D1.
  LSTM: An LSTM layer with a number of units and an option to return the
    original sequences.
  MaxPooling: A 2D MaxPooling layer that works for D2 and D3 shapes.
  Relu: A ReLu activation function.
  Sigmoid: A Sigmoid activation function.
  ZeroPadding2D: A ZeroPadding2D layer with padding_rows and padding_cols
    arguments.
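Each typed layer above is eventually lowered to a DLayer plus a map of
parameters, which the chosen backend renders as text; this is what paramsToJS
and paramsToPython are documented to do in TensorSafe.Compile.Expr. The
following is a standalone sketch of that rendering step only, under the
assumption that parameters arrive as a Map from names to already-printed
values; it is not the library's actual representation or implementation.

```
-- Standalone sketch: the Map-based representation and the function names
-- below are assumptions for illustration, not tensor-safe's actual API.
module RenderParamsSketch where

import           Data.List       (intercalate)
import qualified Data.Map.Strict as M

type Params = M.Map String String

-- JavaScript-style parameter object, e.g. {activation: "relu", units: 10}
renderParamsJS :: Params -> String
renderParamsJS ps =
  "{" ++ intercalate ", " [ k ++ ": " ++ v | (k, v) <- M.toList ps ] ++ "}"

-- Python-style keyword arguments, e.g. activation="relu", units=10
renderParamsPython :: Params -> String
renderParamsPython ps =
  intercalate ", " [ k ++ "=" ++ v | (k, v) <- M.toList ps ]

-- >>> renderParamsJS (M.fromList [("units", "10"), ("activation", "\"relu\"")])
-- "{activation: \"relu\", units: 10}"
```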
TensorSafe.Shape
  ShapeEquals': Same as ShapeEquals, which compares two Shapes at the kind
    level, but raises a TypeError if the Shapes are not equal.
  ShapeEquals: Compares two Shapes at the kind level and returns a Bool kind.
  S: Concrete data structures for a Shape (constructors: S1D, S2D, S3D). All
    shapes are held in contiguous memory; a 3D shape is held in a matrix
    (usually row oriented) whose height is depth * rows.
  Shape: The shapes currently accepted: one, two and three dimensional
    vectors/matrices. These are only used with DataKinds, as kind Shape, with
    types 'D1, 'D2 and 'D3.
  D1: One-dimensional vector.
  D2: Two-dimensional matrix (rows, columns).
  D3: Three-dimensional matrix (rows, columns, channels).

TensorSafe.Network
  Network: A network that defines a specific sequence of layers
    (constructors: NNil, (:~~)).
  INetwork: A network that defines a specific sequence of layers together with
    the corresponding shape transformations along the network (constructors:
    INNil, (:~>)). It is an instance of a Network: given a Network and an
    initial Shape, this type structure can be generated automatically using
    the type functions defined in this module, such as MkINetwork.
  ValidNetwork: Makes a valid instance of INetwork.
  MkINetwork: Creates an INetwork type, validating that the expected output
    and the computed one match.
  MkINetworkUnconstrained: Creates an INetwork type; "unconstrained" means
    that no expected output is checked.
  mkINetwork: Instantiates a network after its type has been defined, using
    MkINetworkUnconstrained or MkINetwork, for example. After defining a
    variable with an INetwork type, you can instantiate it like this:
      myNet :: MNIST
      myNet = mkINetwork
  Out: Defines the expected output of a layer. This type function should be
    instantiated for each of the defined layers.
  ComputeOut: Returns the result of applying all the layer transformations to
    a specific shape. Given a list of layers, it returns the expected output
    of the computation of each layer, starting with the first layer applied to
    the initial Shape. For example, if the initial Shape is [28, 28] and the
    layers are [Relu, Flatten], the result is [784].
  ComposeOut': Returns a list of shapes describing ALL the transformations
    applied to a specific shape. Given a list of layers, it returns a type
    with all the Shapes from the initial Shape up to the last one. The last
    Shape should be the same as ComputeOut applied to the same parameters.
  ComposeOut: Same as ComposeOut', but the Shape list also includes the
    initial Shape.
  ValidateOutput: Compares the layers' computed shape with the expected
    output.
  MaybeType: If the second type argument is 'True it returns the type t,
    otherwise it returns a default type. Note that, with this behaviour,
    ValidateOutput raises a type error if the expected output and the actual
    one do not match.
  toCNetwork: Compilation: gets the initial shape using Singleton instances.
    Since this is the function run to transform an INetwork into a CNetwork,
    the nested argument of toCNetwork' is set to False.
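The ComputeOut example above (an initial [28, 28] shape passed through
[Relu, Flatten] yields [784]) can be reproduced with a small, self-contained
type-family sketch. The kind and family names below only mirror the library's;
the definitions are illustrative assumptions, not tensor-safe's actual code.

```
{-# LANGUAGE DataKinds            #-}
{-# LANGUAGE NoStarIsType         #-}
{-# LANGUAGE TypeFamilies         #-}
{-# LANGUAGE TypeOperators        #-}
{-# LANGUAGE UndecidableInstances #-}

-- Standalone sketch: a ComputeOut-style type family threading a Shape through
-- a list of layers. Names mirror TensorSafe.Network / TensorSafe.Shape, but
-- these definitions are illustrative, not the library's code.
module ComputeOutSketch where

import Data.Kind    (Type)
import Data.Proxy   (Proxy (..))
import GHC.TypeLits (Nat, type (*))

-- A simplified Shape kind in the spirit of D1/D2/D3.
data Shape = D1 Nat | D2 Nat Nat | D3 Nat Nat Nat

-- Two toy layer tags; the real library has many more (Conv2D, Dense, ...).
data Relu
data Flatten

-- Per-layer output shape, in the spirit of the Out type function.
type family Out (l :: Type) (s :: Shape) :: Shape where
  Out Relu    s           = s                  -- activations keep the shape
  Out Flatten ('D1 n)     = 'D1 n
  Out Flatten ('D2 h w)   = 'D1 (h * w)        -- 2D collapses to 1D
  Out Flatten ('D3 h w c) = 'D1 (h * w * c)

-- Fold a list of layers over an initial shape, like ComputeOut.
type family ComputeOut (ls :: [Type]) (s :: Shape) :: Shape where
  ComputeOut '[]       s = s
  ComputeOut (l ': ls) s = ComputeOut ls (Out l s)

-- The documented example: [28, 28] through [Relu, Flatten] gives [784].
type Example = ComputeOut '[Relu, Flatten] ('D2 28 28)

-- Forces the type checker to verify that Example reduces to 'D1 784.
exampleOk :: Proxy Example
exampleOk = Proxy :: Proxy ('D1 784)
```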
  toCNetwork': Helper function for toCNetwork.
  Layer instance for INetwork: this instance of INetwork as a Layer makes it
    possible to nest INetworks.

TensorSafe.Examples.SimpleExample
  myNet, myNet2, myNet3: Simple network examples.
  lstm: Simple LSTM network example.

TensorSafe.Examples.ResNet50Example
  ResNet50, ConvBlockShortcut, IdentityBlock, resnet50.

TensorSafe.Examples.MnistExample
  mnist: MNIST implementation using convolutional layers.
  mnistDense: MNIST implementation using just Dense layers.

TensorSafe.Examples.Examples
  simpleExample: Prints the simple examples' results to stdout.
  mnistExample: Prints the MNIST examples' results to stdout.
  mnistExampleDense: Prints the MNIST Dense examples' results to stdout.

TensorSafe.Commands.Examples
  examples: Outputs the results of the examples to stdout.

Paths_tensor_safe
  version, getBinDir, getLibDir, getDynLibDir, getDataDir, getLibexecDir,
  getSysconfDir, getDataFileName.

Package modules: TensorSafe.Commands.Utils, TensorSafe.Commands.Compile,
TensorSafe.Commands.Check, TensorSafe.Commands.Examples,
TensorSafe.Compile.Expr, TensorSafe.Core, TensorSafe.Layer, TensorSafe.Layers,
TensorSafe.Layers.Add, TensorSafe.Layers.BatchNormalization,
TensorSafe.Layers.Conv2D, TensorSafe.Layers.Dense, TensorSafe.Layers.Dropout,
TensorSafe.Layers.Flatten, TensorSafe.Layers.GlobalAvgPooling2D,
TensorSafe.Layers.Input, TensorSafe.Layers.LSTM, TensorSafe.Layers.MaxPooling,
TensorSafe.Layers.Relu, TensorSafe.Layers.Sigmoid,
TensorSafe.Layers.ZeroPadding2D, TensorSafe.Shape, TensorSafe.Network,
TensorSafe.Examples.SimpleExample, TensorSafe.Examples.ResNet50Example,
TensorSafe.Examples.MnistExample, TensorSafe.Examples.Examples,
Paths_tensor_safe.

External references: Language.Haskell.Interpreter, Data.Singletons.Internal
(singletons-2.5.1).
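A minimal driver in the spirit of the examples command above might look as
follows. It assumes TensorSafe.Commands.Examples exports examples as an IO
action; the documentation only says that it writes the results to stdout, so
the type is an assumption.

```
-- Minimal sketch of a driver that prints the bundled example outputs.
-- Assumption: TensorSafe.Commands.Examples exports `examples :: IO ()`.
module Main where

import TensorSafe.Commands.Examples (examples)

main :: IO ()
main = examples
```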