neural: Neural Networks in native Haskell


The goal of neural is to provide a modular and flexible neural network library written in native Haskell.

The idea is to make it easy to define new components and to wire them up in flexible, possibly complicated ways (convolutional deep networks etc.).

Three examples are included as proof of concept.

The library is still very much experimental at this point.



Properties

Versions 0.1.0.0, 0.1.0.1, 0.1.1.0, 0.2.0.0, 0.3.0.0, 0.3.0.1
Change log None available
Dependencies ad (>=4.3.2 && <4.4), array (>=0.5.1.0 && <0.6), attoparsec (>=0.13.0.1 && <0.14), base (>=4.7 && <5), bytestring (>=0.10.6.0 && <0.11), deepseq (>=1.4.1.1 && <1.5), directory (>=1.2.2.0 && <1.3), filepath (>=1.4.0.0 && <1.5), ghc-typelits-natnormalise (>=0.4.1 && <0.5), hspec (>=2.2.2 && <2.3), JuicyPixels (>=3.2.7 && <3.3), kan-extensions (>=4.2.3 && <4.3), lens (>=4.13 && <4.14), monad-par (>=0.3.4.7 && <0.4), monad-par-extras (>=0.3.3 && <0.4), MonadRandom (>=0.4.2.2 && <0.5), mtl (>=2.2.1 && <2.3), neural (>=0.2.0.0 && <0.3), parallel (>=3.2.1.0 && <3.3), pipes (>=4.1.8 && <4.2), pipes-bytestring (>=2.1.1 && <2.2), pipes-safe (>=2.2.3 && <2.3), pipes-zlib (>=0.4.4 && <0.5), profunctors (>=5.2 && <5.3), reflection (>=2.1.2 && <2.2), STMonadTrans (>=0.3.3 && <0.4), text (>=1.2.2.1 && <1.3), transformers (>=0.4.2.0 && <0.5), typelits-witnesses (>=0.2.0.0 && <0.3), vector (>=0.11.0.0 && <0.12) [details]
License MIT
Copyright (c) 2016 Lars Bruenjes
Author Lars Bruenjes
Maintainer brunjlar@gmail.com
Category Machine Learning
Home page https://github.com/brunjlar/neural
Bug tracker https://github.com/brunjlar/neural/issues
Source repo head: git clone https://github.com/brunjlar/neural.git
this: git clone https://github.com/brunjlar/neural.git (tag 0.1.1.0)
Uploaded by lbrunjes at 2016-06-15T23:51:50Z

Modules

Downloads

Maintainer's Corner

Package maintainers

For package maintainers and hackage trustees


Readme for neural-0.2.0.0


neural - Neural Nets in native Haskell


Motivation

The goal of this project is to provide a flexible framework for neural networks (and similar parameterized models) in Haskell.

There are already a couple of neural network libraries on Hackage, but as far as I can tell, each of them falls short in one way or another.

The goal of this library is to provide a reasonably efficient implementation in native Haskell that offers maximal flexibility.

Furthermore, gradient descent/backpropagation should work automatically, using automatic differentiation. This means that new and complicated activation functions and/or network architectures can be used without the need to first calculate derivatives by hand.
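The principle can be illustrated with a tiny forward-mode sketch (for illustration only; the library itself depends on the ad package for its automatic differentiation): a dual number carries a value together with its derivative, so any function written against the Num interface is differentiated with no hand calculation.

```haskell
-- Minimal forward-mode AD sketch (illustration only, not the library's
-- implementation): a dual number pairs a value with its derivative.
data Dual = Dual { val :: Double, der :: Double } deriving Show

instance Num Dual where
  Dual a a' + Dual b b' = Dual (a + b) (a' + b')
  Dual a a' - Dual b b' = Dual (a - b) (a' - b')
  Dual a a' * Dual b b' = Dual (a * b) (a' * b + a * b')  -- product rule
  abs    (Dual a a')    = Dual (abs a) (a' * signum a)
  signum (Dual a _)     = Dual (signum a) 0
  fromInteger n         = Dual (fromInteger n) 0

-- A made-up activation function; no derivative is written anywhere.
activation :: Num a => a -> a
activation x = x * x * x + 2 * x

main :: IO ()
main = print (activation (Dual 2 1))  -- value 12, derivative 3*2^2 + 2 = 14
```

Seeding the derivative field with 1 makes `activation (Dual 2 1)` return both the value and the derivative at 2 in a single pass.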

In order to provide a powerful and flexible API, models are constructed using components which behave as if they implemented the Arrow and ArrowChoice typeclasses. They can therefore easily be combined and transformed.
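A hypothetical sketch of that style (the names below are illustrative, not the package's actual Component type): a parameterized function wrapped in a newtype can be given Category and Arrow instances, after which components compose with the usual arrow combinators.

```haskell
-- Hypothetical sketch, not the library's actual API: components carrying a
-- parameter p that compose like arrows.
import Control.Arrow
import Control.Category
import Prelude hiding ((.), id)

newtype Component p a b = Component { runComponent :: p -> a -> b }

instance Category (Component p) where
  id = Component (\_ x -> x)
  Component g . Component f = Component (\p x -> g p (f p x))

instance Arrow (Component p) where
  arr f = Component (\_ -> f)
  first (Component f) = Component (\p (x, y) -> (f p x, y))

-- Two toy components sharing the parameter, wired up with (>>>).
scale, shift :: Component Double Double Double
scale = Component (*)
shift = Component (+)

main :: IO ()
main = print (runComponent (scale >>> shift) 2 3)  -- (3 * 2) + 2 = 8.0
```

With instances like these, combinators such as `(>>>)`, `(***)` and `(&&&)` come for free, which is what makes wiring up complicated architectures convenient.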

Once a model has been constructed, it can be hooked into a customized training algorithm using pipes, so that various aspects of the algorithm (loading data, choosing random samples, reporting intermediate results, stopping criterion etc.) can be defined in a modular, decoupled way.
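A sketch of that decoupling using the pipes library (the pipeline shape is illustrative; `samples` and `step` are made up, not this package's training API): the sample source, the stop criterion and the training loop are separate, swappable pieces.

```haskell
-- Illustrative pipes pipeline (not the library's actual training API):
-- stochastic gradient descent for the one-parameter model y = w * x.
import Pipes
import qualified Pipes.Prelude as P

-- An endless stream of (input, target) samples drawn from y = 2 * x.
samples :: Monad m => Producer (Double, Double) m ()
samples = each (cycle [(1, 2), (2, 4), (3, 6)])

-- One gradient step on the squared error for a single sample.
step :: Double -> (Double, Double) -> Double
step w (x, y) = w - 0.1 * 2 * (w * x - y) * x

main :: IO ()
main = do
  -- `P.take` plays the role of the stop criterion; `P.fold` is the loop.
  w <- P.fold step 0 id (samples >-> P.take 100)
  print w  -- converges to w close to 2.0
```

Swapping the data source, the stopping rule or the reporting then only means replacing one stage of the pipeline.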

Even though neural networks are the primary motivation for this project, any other kind of model can be defined in the same framework, as long as the model depends on a collection of numerical parameters in a differentiable way; one simple example would be linear regression.
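For concreteness, here is what linear regression looks like as such a parameterized model (a standalone sketch with the gradient written out by hand, not using the library): two parameters, a differentiable loss, and gradient descent.

```haskell
-- Linear regression y = a * x + b as a two-parameter differentiable model,
-- trained by plain gradient descent (standalone sketch, not the library API).
trainingData :: [(Double, Double)]
trainingData = [(1, 3), (2, 5), (3, 7)]  -- generated from y = 2 * x + 1

-- Gradient of the squared-error loss with respect to (a, b).
gradient :: (Double, Double) -> (Double, Double)
gradient (a, b) =
  ( sum [ 2 * (a * x + b - y) * x | (x, y) <- trainingData ]
  , sum [ 2 * (a * x + b - y)     | (x, y) <- trainingData ] )

-- One gradient-descent step with a small fixed learning rate.
step :: (Double, Double) -> (Double, Double)
step (a, b) =
  let (ga, gb) = gradient (a, b)
  in  (a - 0.02 * ga, b - 0.02 * gb)

main :: IO ()
main = print (iterate step (0, 0) !! 10000)  -- approaches (2.0, 1.0)
```

In the library's setting the gradient function would not be written by hand but derived automatically, which is exactly what makes other differentiable models fit the same framework.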

Examples

At the moment, three examples are included: