neural: Neural Networks in native Haskell

[ library, machine-learning, mit, program ]

The goal of neural is to provide a modular and flexible neural network library written in native Haskell.

Features include

  • composability via Arrow instances and pipes,

  • automatic differentiation for gradient descent/backpropagation training (using Edward Kmett's fabulous ad library).

The idea is to make it easy to define new components and to wire them up in flexible, possibly complicated ways (deep convolutional networks, etc.).
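For a flavor of the arrow-based style, here is a minimal sketch that uses plain functions as arrows; in neural itself, the pieces would be values of the library's own component type rather than `(->)`:

```haskell
import Control.Arrow (arr, (&&&), (>>>))

-- Toy stand-ins for components; the library's components would carry
-- trainable parameters instead of being plain functions.
double, increment :: Double -> Double
double    = (* 2)
increment = (+ 1)

-- (>>>) chains components in sequence; (&&&) fans the input out to two.
network :: Double -> (Double, Double)
network = arr double >>> (arr increment &&& arr negate)

main :: IO ()
main = print (network 3)  -- (7.0,-6.0)
```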

Two examples are included as proof of concept:

  • A simple neural network that approximates the sqrt function on [0,4].

  • A slightly more complicated neural network that solves the famous Iris flower problem.

The library is still very much experimental at this point.



Versions 0.1.0.0, 0.1.0.1, 0.1.1.0, 0.2.0.0, 0.3.0.0, 0.3.0.1
Dependencies ad (>=4.3.2 && <4.4), array (>=0.5.1.0 && <0.6), attoparsec (>=0.13.0.1 && <0.14), base (>=4.7 && <5), deepseq (>=1.4.1.1 && <1.5), directory (>=1.2.2.0 && <1.3), filepath (>=1.4.0.0 && <1.5), ghc-typelits-natnormalise (>=0.4.1 && <0.5), hspec (>=2.2.2 && <2.3), lens (>=4.13 && <4.14), MonadRandom (>=0.4.2.2 && <0.5), mtl (>=2.2.1 && <2.3), neural (>=0.1.1.0 && <0.2), parallel (>=3.2.1.0 && <3.3), pipes (>=4.1.8 && <4.2), profunctors (>=5.2 && <5.3), STMonadTrans (>=0.3.3 && <0.4), text (>=1.2.2.1 && <1.3), transformers (>=0.4.2.0 && <0.5), typelits-witnesses (>=0.2.0.0 && <0.3), vector (>=0.11.0.0 && <0.12)
License MIT
Copyright (c) 2016 Lars Bruenjes
Author Lars Bruenjes
Maintainer brunjlar@gmail.com
Revised Revision 1 made by lbrunjes at 2016-06-13T09:34:40Z
Category Machine Learning
Home page https://github.com/brunjlar/neural
Bug tracker https://github.com/brunjlar/neural/issues
Source repo head: git clone https://github.com/brunjlar/neural.git
this: git clone https://github.com/brunjlar/neural.git (tag 0.1.1.0)
Uploaded by lbrunjes at 2016-06-08T20:44:14Z
Executables sqrt, iris
Downloads 4554 total (17 in the last 30 days)
Rating 2.0 (votes: 1) [estimated by Bayesian average]
Status Docs uploaded by user
Build status unknown [no reports yet]

Readme for neural-0.1.1.0


neural - Neural Nets in native Haskell


The goal of this project is to provide a flexible framework for neural networks (and similar parameterized models) in Haskell.

There are already a couple of neural network libraries out there on Hackage, but as far as I can tell, they either

  • are wrappers for an engine written in another language or
  • offer a limited choice of network architectures, training algorithms, or error functions, or are not easily extensible.

The goal of this library is a reasonably efficient implementation in native Haskell that offers maximal flexibility.

Furthermore, gradient descent/backpropagation should work automatically, using automatic differentiation. This means that new and complicated activation functions and/or network architectures can be used without the need to first calculate derivatives by hand.
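As a minimal illustration of what automatic differentiation buys (using the ad library directly, not this library's API):

```haskell
import Numeric.AD (grad)

-- The gradient of f [x, y] = x^2 + 3y is [2x, 3]; ad computes it from the
-- plain Haskell definition, with no hand-written derivative.
-- (The lambda assumes exactly two inputs.)
main :: IO ()
main = print $ grad (\[x, y] -> x ^ 2 + 3 * y) [1, 2 :: Double]
-- prints [2.0,3.0]
```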

In order to provide a powerful and flexible API, models are constructed from components that implement the Arrow and ArrowChoice typeclasses. They can therefore easily be combined and transformed using a multitude of available combinators or arrow notation.
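For instance, with GHC's arrow notation, components can be wired up like let-bound intermediate values. The sketch below uses plain function arrows as stand-ins for the library's components:

```haskell
{-# LANGUAGE Arrows #-}

import Control.Arrow (Arrow, arr, returnA)

-- Stand-ins for components; in the library these would carry parameters.
scale, shift :: Arrow a => a Double Double
scale = arr (* 2)
shift = arr (+ 1)

-- Arrow notation makes the wiring explicit and readable.
combined :: (Double, Double) -> Double
combined = proc (x, y) -> do
    x' <- scale -< x
    y' <- shift -< y
    returnA -< x' + y'

main :: IO ()
main = print (combined (3, 4))  -- 2*3 + (4+1) = 11.0
```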

Even though neural networks are the primary motivation for this project, any other kind of model can be defined in the same framework, whenever the model depends on a collection of numerical parameters in a differentiable way. One simple example is linear regression.
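As a sketch of that idea (again using the ad library directly, with made-up sample data), linear regression is just gradient descent on the mean squared error of a two-parameter model:

```haskell
import Numeric.AD (auto, gradientDescent)

-- Made-up sample points lying roughly on the line y = 2x + 1.
samples :: [(Double, Double)]
samples = [(0, 1.1), (1, 2.9), (2, 5.2), (3, 6.8)]

-- Mean squared error of the model y = a*x + b in its parameters [a, b]
-- (the pattern assumes exactly two parameters).
mse :: Fractional a => [(a, a)] -> [a] -> a
mse pts [a, b] =
    sum [(a * x + b - y) ^ 2 | (x, y) <- pts] / fromIntegral (length pts)

-- Follow the gradient for (at most) 1000 steps; the result is close to
-- the true parameters [2, 1].
main :: IO ()
main = print $ last $ take 1000 $
    gradientDescent (\ps -> mse [(auto x, auto y) | (x, y) <- samples] ps) [0, 0]
```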