tensor-safe: Create valid deep neural network architectures

[ ai, bsd3, dependent-types, language, library, program ]
Versions [RSS] 0.1.0.0, 0.1.0.1
Dependencies base (>=4.7 && <5), casing (>=0.1.4.0 && <0.1.5), cmdargs (>=0.10.20 && <0.11), containers (>=0.6.0.1 && <0.7), extra (>=1.6 && <1.7), formatting (>=6.3.6 && <6.4), ghc-typelits-extra (>=0.3 && <0.4), hint (>=0.9.0 && <1.0), singletons (>=2.5.1 && <2.6), tensor-safe, text (>=1.2.3.1 && <1.3), vector (>=0.12 && <0.13), vector-sized (>1.2 && <1.3) [details]
License BSD-3-Clause
Copyright 2019 Leonardo Pineyro
Author Leonardo Pineyro
Maintainer leopiney@gmail.com
Category AI, Dependent Types, Language, Library, Program
Home page https://github.com/leopiney/tensor-safe#readme
Bug tracker https://github.com/leopiney/tensor-safe/issues
Source repo head: git clone https://github.com/leopiney/tensor-safe
Uploaded by leopiney at 2019-05-03T00:15:08Z
Executables tensor-safe
Downloads 921 total (3 in the last 30 days)
Rating 2.0 (votes: 1) [estimated by Bayesian average]
Status Docs available [build log]
Last success reported on 2019-05-03 [all 1 reports]

Readme for tensor-safe-0.1.0.1


Tensor Safe

tensor-safe is a dependently typed framework for defining deep learning models whose structure is verified at compile time. If a model is valid, it can be compiled to Keras code for Python or JavaScript.

Building instructions and development tools

  1. Install ghc-mod, hpack and stylish-haskell with stack install

    cd ~
    stack install ghc-mod hpack stylish-haskell
    
  2. Run stack build in the project folder

  3. Install Intero

    Run stack build intero in the project folder

    Ref: https://gitlab.com/vannnns/haskero/blob/master/client/doc/installation.md

Generate .cabal file

Run hpack in the root of the project; the file tensor-safe.cabal will be generated.

Model definition

Models can be defined as a type using the MkINetwork type function. MkINetwork defines a valid instance of a Network model given a list of Layers and the expected input and output Shapes.

Here's an example of how to define a simple model for the MNIST dataset, using Dense layers:

type MNIST = MkINetwork
    '[
        Flatten,
        Dense 784 42,
        Relu,
        Dense 42 10,
        Sigmoid
    ]
    ('D3 28 28 1)    -- Input
    ('D1 10)         -- Output

After that, a variable with the model type can be verified with the function mkINetwork like this:

mnist :: MNIST
mnist = mkINetwork
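
Since the model's structure is checked at compile time, a structurally invalid model is rejected by the type checker. For instance, the following hypothetical variant (a sketch, not taken from the library's test suite) wires a Dense 50 10 layer after a layer that produces 42 outputs, so instantiating it with mkINetwork should fail to compile:

type BrokenMNIST = MkINetwork
    '[
        Flatten,
        Dense 784 42,
        Relu,
        Dense 50 10,     -- mismatch: the previous layer outputs 42 values, not 50
        Sigmoid
    ]
    ('D3 28 28 1)    -- Input
    ('D1 10)         -- Output

-- broken :: BrokenMNIST
-- broken = mkINetwork   -- expected to be rejected at compile time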

Nesting networks definitions

You can nest network definitions easily by adding the networks as layers. For example, in the case of the MNIST model defined above, we can abstract the use of Dense followed by an activation function like this:

type DenseRelu i o =
    MkINetwork '[ Dense i o, Relu ] ('D1 i) ('D1 o)

type DenseSigmoid i o =
    MkINetwork '[ Dense i o, Sigmoid ] ('D1 i) ('D1 o)

type MNIST = MkINetwork
    '[
        Flatten,
        DenseRelu 784 42,
        DenseSigmoid 42 10
    ]
    ('D3 28 28 1)    -- Input
    ('D1 10)         -- Output
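
These nested definitions compose like ordinary layers, so larger models can be assembled from them. As a sketch (the intermediate layer sizes here are arbitrary, not taken from the library's examples), a two-hidden-layer variant could be written as:

type MNISTDeep = MkINetwork
    '[
        Flatten,
        DenseRelu 784 128,
        DenseRelu 128 42,
        DenseSigmoid 42 10
    ]
    ('D3 28 28 1)    -- Input
    ('D1 10)         -- Output

mnistDeep :: MNISTDeep
mnistDeep = mkINetwork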

How to extend layers definitions

Since this library only implements a subset of the features that Keras implements, it's likely that new projects will need additional layers. Because the library is modular, this can be done by adding the layer definitions in specific locations of the project (a rough sketch follows the list below):

  1. First, add a new auxiliary layer entry to the DLayer data type in TensorSafe.Compile.Expr. This makes it possible to compile the layer with every instance of Generator. Also, add a LayerGenerator entry for the newly added layer.
  2. Secondly, add the layer definition to the TensorSafe/Layers folder. You can copy the definitions of the currently defined layers as a starting point.
  3. Then, import and expose your layer definition in the TensorSafe.Layers module.
  4. Finally, declare how your layer transforms a specific Shape in the Out type function.
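
As a rough sketch of what these steps involve, the fragment below adds a hypothetical Swish activation layer. Only the data type declaration is concrete Haskell; the DLayer constructor and the Out equation are left as comments because their exact signatures should be copied from an existing layer such as Relu, not from this sketch:

-- Step 1 (TensorSafe.Compile.Expr): add a constructor such as DSwish to DLayer,
-- plus a matching LayerGenerator entry, so every Generator instance can emit it.

-- Step 2 (TensorSafe/Layers/Swish.hs): the layer's data type, mirroring Relu.
data Swish = Swish deriving Show

-- Step 4: Swish is an activation, so its Out equation should map any input
-- Shape to the same Shape (copy the exact form of Out from Relu's definition).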

Command line interface

This interface will change in the near future

You can install the tensor-safe command line tool by running stack build. Then use it with stack exec tensor-safe -- check --path ./path-to-model.hs or stack exec tensor-safe -- compile --path ./path-to-model.hs --module-name SomeModule.

Tools for JavaScript environment

Add the packages babel-plugin-tensor-safe and eslint-plugin-tensor-safe as development dependencies. These can be found in the extra/javascript folder of this project.

You can add them directly from this project like this:

yarn add --dev file:<path-to-tensor-safe>/extra/javascript/babel-plugin-tensor-safe

yarn add --dev file:<path-to-tensor-safe>/extra/javascript/eslint-plugin-tensor-safe

Then, in your JavaScript project's .eslintrc.js file, add the tensor-safe plugin and the tensor-safe/invalid-model rule, like this:

module.exports = {
  plugins: [
     ...
     "tensor-safe"
   ],
  ...
  rules: {
    ...
    "tensor-safe/invalid-model": 1
    ...
  }
};

And for the Babel plugin add "@babel/plugin-tensor-safe" to the plugins list in the .babelrc file inside your JavaScript project.

Then, you can write your deep learning model inside your JS files as in the following example:

function createConvModel() {
  const model = safeModel`
    '[
        Conv2D 1 16 3 3 1 1,
        Relu,
        MaxPooling 2 2 2 2,
        Conv2D 16 32 3 3 1 1,
        Relu,
        MaxPooling 2 2 2 2,
        Conv2D 32 32 3 3 1 1,
        Relu,
        Flatten,
        Dense 288 64,
        Sigmoid,
        Dense 64 10,
        Sigmoid
    ]
    ('D3 28 28 1)  -- Input
    ('D1 10)       -- Output
`;

  return model;
}

This project was highly influenced by Grenade 💣. Grenade is a really cool library for defining deep neural networks that are validated using dependent types. What differentiates TensorSafe from Grenade the most is that TensorSafe doesn't run or train the models; instead, it compiles them to external languages that are capable of performing all computations, like Keras for Python or JavaScript. Also, TensorSafe doesn't need all Shape transformations to be declared explicitly for every layer of the model; it only needs the input and output Shapes to validate the model.

Another library worth looking at is TensorFlow for Haskell, which provides bindings to TensorFlow's C API. The issue is that it doesn't perform many type checks at compile time. However, there's an open branch that uses dependent types to address many of these issues: https://github.com/helq/tensorflow-haskell-deptyped, but that solution still seems rather complicated for real-world use.