neural-0.3.0.0: Neural Networks in native Haskell
The goal of neural is to provide a modular and flexible neural network library written in native Haskell.
Features include
- composability via arrow-like instances and pipes (sketched below),
- automatic differentiation for gradient descent/backpropagation training (using Edward Kmett's fabulous ad library).
The idea is to make it easy to define new components and wire them up in flexible, possibly complicated ways (convolutional deep networks etc.).
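To make the composability claim concrete, here is a toy sketch of the "arrow-like" idea. The Component type below is invented for this illustration and is much simpler than the library's actual Component: a parameterised computation that composes with Control.Category's (>>>), the composite automatically carrying the parameters of both pieces.

```haskell
{-# LANGUAGE ExistentialQuantification #-}

import Prelude hiding (id, (.))
import Control.Category

-- A computation from a to b bundled with its own (hidden) parameters;
-- the existential lets differently parameterised pieces compose freely.
data Component a b = forall p. Component p (p -> a -> b)

instance Category Component where
    id = Component () (const id)
    Component q g . Component p f = Component (p, q) (\(p', q') -> g q' . f p')

-- Composition pairs up the parameters of both pieces automatically:
affine :: Component Double Double   -- computes x -> 2 * x + 1
affine = Component 2.0 (*) >>> Component 1.0 (+)
```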
Three examples are included as proof of concept:
- A simple neural network that approximates the sqrt function on [0,4] (a stand-alone sketch of the idea follows this list).
- A slightly more complicated neural network that solves the famous Iris flower problem.
- A first (still simple) neural network for recognizing handwritten digits from the equally famous MNIST database.
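For a concrete sense of what the sqrt example involves, here is a minimal stand-alone sketch that does not use this library at all: it trains a small tanh network to approximate sqrt on [0,4] with plain gradient descent, where the ad package's grad supplies the backpropagated gradients. The architecture (three hidden units), sample points, learning rate, and step count are arbitrary choices for illustration, not those of the bundled example.

```haskell
import Numeric.AD (grad)

-- A tiny one-hidden-layer tanh network with three hidden units.
-- Its ten parameters are kept in a plain list so that grad can
-- differentiate the error with respect to all of them at once.
model :: Floating a => [a] -> a -> a
model [w1, b1, w2, b2, w3, b3, v1, v2, v3, c] x =
      v1 * tanh (w1 * x + b1)
    + v2 * tanh (w2 * x + b2)
    + v3 * tanh (w3 * x + b3)
    + c
model _ _ = error "model: expected exactly ten parameters"

-- Mean squared error against sqrt on sample points in [0, 4].
err :: Floating a => [a] -> a
err ps = mean [(model ps x - sqrt x) ^ (2 :: Int) | x <- xs]
  where
    xs      = map realToFrac [0, 0.25 .. 4 :: Double]
    mean ys = sum ys / fromIntegral (length ys)

-- One step of plain gradient descent with a fixed learning rate;
-- grad does the backpropagation.
step :: [Double] -> [Double]
step ps = zipWith (\p g -> p - 0.1 * g) ps (grad err ps)

main :: IO ()
main = print $ err (iterate step p0 !! 10000)
  where
    p0 = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
```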
The library is still very much experimental at this point.
Modules
- Data
- Data.MyPrelude commonly used standard types and functions
- Data.Utils various utilities
- Data.Utils.Analytic "analytic" values
- Data.Utils.Arrow arrow utilities
- Data.Utils.List list utilities
- Data.Utils.Matrix fixed-size matrices
- Data.Utils.Pipes pipes utilities
- Data.Utils.Random random number utilities
- Data.Utils.Stack a simple stack monad
- Data.Utils.Statistics statistical utilities
- Data.Utils.Traversable utilities for traversables
- Data.Utils.Vector fixed-length vectors
- Numeric
- Numeric.Neural neural networks
- Numeric.Neural.Layer layer components
- Numeric.Neural.Model "neural" components and models
- Numeric.Neural.Normalization normalizing data
- Numeric.Neural.Pipes a pipes API for models