som-7.3.0: Self-Organising Maps.

Copyright(c) Amy de Buitléir 2012-2014
Safe Haskell: Safe-Inferred




A Simplified Self-organising Map (SSOM). An SSOM maps input patterns onto a set, where each element in the set is a model of the input data. An SSOM is like a Kohonen Self-organising Map (SOM), except that instead of a grid, it uses a simple set of unconnected models. Since the models are unconnected, only the model that best matches the input is ever updated. This makes it faster; however, topological relationships within the input data are not preserved. This implementation supports the use of non-numeric patterns.

In layman's terms, an SSOM can be useful when you want to build a set of models of some data. A tutorial is available at


  • de Buitléir, Amy, Russell, Michael and Daly, Mark. (2012). Wains: A pattern-seeking artificial life species. Artificial Life, 18 (4), 399-423.
  • Kohonen, T. (1982). Self-organized formation of topologically correct feature maps. Biological Cybernetics, 43 (1), 59–69.
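The idea described above can be sketched as a toy, self-contained version (the names `difference` and `classify` here are hypothetical, and this is not the library's API): the models sit in an ordinary `Map`, and classification simply returns the key of the closest model, with no grid topology involved.

```haskell
-- Toy sketch of the SSOM idea; not the som library's API.
import qualified Data.Map as M
import Data.List (minimumBy)
import Data.Ord (comparing)

-- Squared Euclidean distance between two numeric patterns.
difference :: [Double] -> [Double] -> Double
difference xs ys = sum [ (x - y) ^ (2 :: Int) | (x, y) <- zip xs ys ]

-- The key of the model that best matches the input pattern (the BMU).
-- No neighbourhood is involved: models are an unconnected set.
classify :: M.Map Int [Double] -> [Double] -> Int
classify models p =
  fst (minimumBy (comparing (difference p . snd)) (M.toList models))
```

Because the models are unconnected, classification is a plain nearest-model lookup, which is what makes the SSOM cheaper than a grid-based SOM.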



data SSOM f t k p Source

A Simplified Self-Organising Map (SSOM).




sMap :: Map k p

Maps node labels to models (patterns).

learningFunction :: f

The function used to update the nodes.

counter :: t

A counter used as a "time" parameter. If you create the SSOM with a counter value of 0 and don't directly modify it, then the counter will equal the number of patterns that this SSOM has classified.


(Pattern p, Ord (Metric p), LearningFunction f, Metric p ~ LearningRate f, Num (LearningRate f), Ord k, Integral t) => Classifier (SSOM f t) k p 
(Eq f, Eq t, Eq k, Eq p) => Eq (SSOM f t k p) 
(Show f, Show t, Show k, Show p) => Show (SSOM f t k p) 
Generic (SSOM f t k p) 
type Rep (SSOM f t k p) 

data Gaussian a Source

A typical learning function for classifiers. Gaussian r0 rf tf returns a Gaussian function. At time zero, the learning rate is r0. Over time the learning rate tapers off, until at time tf, the learning rate is rf. Normally the parameters should be chosen such that:

  • 0 < rf << r0 < 1
  • 0 < tf

where << means "is much smaller than" (not the Haskell << operator!)


Gaussian a a a 


Eq a => Eq (Gaussian a) 
Show a => Show (Gaussian a) 
Generic (Gaussian a) 
(Floating a, Fractional a, Num a) => LearningFunction (Gaussian a) 
type Rep (Gaussian a) 
type LearningRate (Gaussian a) = a 
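The constraints above pin down the endpoints of the decay but not its exact shape, and the library's internal formula is not shown on this page. As an illustration only (an assumption, not the library's implementation), one schedule satisfying rate 0 == r0 and rate tf == rf is exponential interpolation between the two rates:

```haskell
-- Hypothetical decay schedule consistent with the stated endpoints:
-- rate r0 rf tf 0 == r0, and rate r0 rf tf tf == rf.
-- The som library's actual Gaussian formula may differ.
rate :: Double -> Double -> Double -> Double -> Double
rate r0 rf tf t = r0 * (rf / r0) ** (t / tf)
```

With r0 = 0.5, rf = 0.01 and tf = 100, the rate starts at 0.5 and tapers to 0.01 at time 100, matching the intended use of the parameters.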


toMap :: SSOM f t k p -> Map k p Source

Extracts the current models from the SSOM. A synonym for sMap.

Advanced control

trainNode :: (Pattern p, LearningFunction f, Metric p ~ LearningRate f, Num (LearningRate f), Ord k, Integral t) => SSOM f t k p -> k -> p -> SSOM f t k p Source

Trains the specified node to better match a target. Most users should use train, which automatically determines the BMU and trains it.
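As a toy illustration of the update that trainNode performs (hypothetical names and update rule; not the library's implementation): the chosen model is moved a fraction r of the way toward the target, and, since SSOM models are unconnected, every other model is left untouched.

```haskell
-- Toy sketch of a single-node update; not the som library's API.
import qualified Data.Map as M

-- Move ONLY the model at key k a fraction r of the way toward the
-- target pattern, leaving all other models unchanged.
trainNode :: Double -> Int -> [Double] -> M.Map Int [Double]
          -> M.Map Int [Double]
trainNode r k target = M.adjust (zipWith step target) k
  where step t m = m + r * (t - m)
```

Combined with the BMU lookup, a train-style step would be: find the key of the best-matching model, then apply this update to that key alone.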