som-7.2.2: Self-Organising Maps

Portability: portable
Stability: experimental
Maintainer: amy@nualeargais.ie
Safe Haskell: Safe-Inferred

Data.Datamining.Clustering.SOMInternal

Description

A module containing private `SOM` internals. Most developers should use `SOM` instead. This module is subject to change without notice.

Synopsis

# Documentation

class LearningFunction f where

A function used to adjust the models in a classifier.

Associated Types

type LearningRate f

Methods

rate :: f -> LearningRate f -> LearningRate f -> LearningRate f

`rate f t d` returns the learning rate for a node. The parameter `f` is the learning function. The parameter `t` indicates how many patterns (or pattern batches) have previously been presented to the classifier. Typically this is used to make the learning rate decay over time. The parameter `d` is the grid distance from the node being updated to the BMU (Best Matching Unit). The output is the learning rate for that node (the amount by which the node's model should be updated to match the target). The learning rate should be between zero and one.
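To make the shape of the class concrete, here is a minimal standalone sketch (not the package's actual code) of a `LearningFunction`-style class with an associated rate type, together with a hypothetical constant-rate instance. The names `LearningFunction'`, `rate'`, and `Constant` are illustrative only:

```haskell
{-# LANGUAGE TypeFamilies #-}

-- Sketch of the LearningFunction idea: a learning function maps the
-- time step t and the grid distance d from the BMU to a rate in [0,1].
class LearningFunction' f where
  type LearningRate' f
  rate' :: f -> LearningRate' f -> LearningRate' f -> LearningRate' f

-- Hypothetical instance that ignores both time and distance.
newtype Constant a = Constant a

instance Fractional a => LearningFunction' (Constant a) where
  type LearningRate' (Constant a) = a
  rate' (Constant r) _t _d = r

main :: IO ()
main = print (rate' (Constant 0.5) 10 2 :: Double)  -- prints 0.5
```

The associated type lets each learning function fix the numeric type of its own rate, which is why the class signature in the real module mentions `LearningRate f` rather than a plain type variable.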

Instances

Fractional a => LearningFunction (ConstantFunction a)
(Fractional a, Eq a) => LearningFunction (StepFunction a)
(Floating a, Fractional a, Num a) => LearningFunction (DecayingGaussian a)

data DecayingGaussian a

A typical learning function for classifiers. `DecayingGaussian r0 rf w0 wf tf` returns a bell-curve-shaped function. At time zero, the maximum learning rate (applied to the BMU) is `r0`, and the neighbourhood width is `w0`. Over time the bell curve shrinks and the learning rate tapers off, until at time `tf`, the maximum learning rate (applied to the BMU) is `rf`, and the neighbourhood width is `wf`. Normally the parameters should be chosen such that:

• 0 < rf << r0 < 1
• 0 < wf << w0
• 0 < tf

where `<<` means "is much smaller than" (it is not the Haskell `<<` operator!)
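As a rough illustration, here is a standalone sketch of a decaying-Gaussian rate. The exact formula the package uses may differ; this version interpolates the peak rate and the neighbourhood width exponentially from their initial to their final values over `tf` steps, with a Gaussian falloff in grid distance:

```haskell
-- Hedged sketch of a decaying-Gaussian learning rate (the package's
-- actual formula may differ). r0/rf are the initial/final peak rates,
-- w0/wf the initial/final neighbourhood widths, tf the decay horizon.
decayingGaussian :: Double -> Double -> Double -> Double -> Double
                 -> Double  -- ^ t, the time step
                 -> Double  -- ^ d, grid distance from the BMU
                 -> Double
decayingGaussian r0 rf w0 wf tf t d = r * exp (-(d * d) / (2 * w * w))
  where
    a = t / tf
    r = r0 * (rf / r0) ** a   -- peak rate (at the BMU) at time t
    w = w0 * (wf / w0) ** a   -- neighbourhood width at time t

main :: IO ()
main = do
  -- At t = 0, d = 0 the rate is r0; far from the BMU it is near zero.
  print (decayingGaussian 0.9 0.1 3 1 1000 0 0)  -- prints 0.9
  print (decayingGaussian 0.9 0.1 3 1 1000 0 10 < 0.01)
```

With parameters chosen per the guidelines above (`0 < rf << r0 < 1`, `0 < wf << w0`), early training makes large, wide adjustments and late training makes small, local ones.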

Constructors

 DecayingGaussian a a a a a

Instances

Eq a => Eq (DecayingGaussian a)
Show a => Show (DecayingGaussian a)
Generic (DecayingGaussian a)
(Floating a, Fractional a, Num a) => LearningFunction (DecayingGaussian a)

data StepFunction a

A learning function that only updates the BMU and has a constant learning rate.

Constructors

 StepFunction a

Instances

Eq a => Eq (StepFunction a)
Show a => Show (StepFunction a)
Generic (StepFunction a)
(Fractional a, Eq a) => LearningFunction (StepFunction a)
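The step-function semantics can be sketched as a plain function (this is an illustration of the behaviour described above, not the package's code): the BMU, at grid distance zero, is updated at the constant rate, and every other node is left unchanged.

```haskell
-- Standalone sketch of StepFunction semantics: only the BMU
-- (grid distance 0) is updated, at a constant rate r.
stepRate :: (Fractional a, Eq a)
         => a  -- ^ the constant rate r
         -> a  -- ^ t, the time step (ignored)
         -> a  -- ^ d, grid distance from the BMU
         -> a
stepRate r _t d = if d == 0 then r else 0

main :: IO ()
main = do
  print (stepRate 0.5 7 0 :: Double)  -- prints 0.5: the BMU is updated
  print (stepRate 0.5 7 2 :: Double)  -- prints 0.0: neighbours are untouched
```

This is why the real instance requires `Eq a`: the distance must be compared against zero to decide whether a node is the BMU.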

data ConstantFunction a

A learning function that updates all nodes with the same, constant learning rate. This can be useful for testing.

Constructors

 ConstantFunction a

Instances

Eq a => Eq (ConstantFunction a)
Show a => Show (ConstantFunction a)
Generic (ConstantFunction a)
Fractional a => LearningFunction (ConstantFunction a)

data SOM f t gm k p

A Self-Organising Map (SOM).

Although `SOM` implements `GridMap`, most users will only need the interface provided by `Data.Datamining.Clustering.Classifier`. If you choose to use the `GridMap` functions, please note:

1. The functions `adjust`, and `adjustWithKey` do not increment the counter. You can do so manually with `incrementCounter`.
2. The functions `map` and `mapWithKey` are not implemented (they just return an `error`). It would be problematic to implement them because the input SOM and the output SOM would have to have the same `Metric` type.

Constructors

SOM

Fields

gridMap :: gm p
Maps patterns to tiles in a regular grid. In the context of a SOM, the tiles are called "nodes".

learningFunction :: f
The function used to update the nodes.

counter :: t
A counter used as a "time" parameter. If you create the SOM with a counter value of `0`, and don't directly modify it, then the counter will represent the number of patterns that this SOM has classified.

Instances

(GridMap gm p, k ~ Index (BaseGrid gm p), Pattern p, Grid (gm p), GridMap gm (Metric p), k ~ Index (gm p), k ~ Index (BaseGrid gm (Metric p)), Ord (Metric p), LearningFunction f, Metric p ~ LearningRate f, Num (LearningRate f), Integral t) => Classifier (SOM f t gm) k p
Foldable gm => Foldable (SOM f t gm k)
(Foldable gm, GridMap gm p, Grid (BaseGrid gm p)) => GridMap (SOM f t gm k) p
(Eq f, Eq t, Eq (gm p)) => Eq (SOM f t gm k p)
(Show f, Show t, Show (gm p)) => Show (SOM f t gm k p)
Generic (SOM f t gm k p)
Grid (gm p) => Grid (SOM f t gm k p)

toGridMap :: GridMap gm p => SOM f t gm k p -> gm p

Extracts the grid and current models from the SOM. A synonym for `gridMap`.

adjustNode :: (Pattern p, Grid g, k ~ Index g, Num t) => g -> (t -> Metric p) -> p -> k -> k -> p -> p

trainNeighbourhood :: (Pattern p, Grid (gm p), GridMap gm p, Index (BaseGrid gm p) ~ Index (gm p), LearningFunction f, Metric p ~ LearningRate f, Num (LearningRate f), Integral t) => SOM f t gm k p -> Index (gm p) -> p -> SOM f t gm k p

Trains the specified node and the neighbourhood around it to better match a target. Most users should use `train`, which automatically determines the BMU and trains it and its neighbourhood.
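The core of neighbourhood training can be sketched in miniature on a hypothetical one-dimensional grid (the real function works over the package's grid-map types, so the names and representation here are illustrative only): every node's model is blended toward the target pattern, weighted by the learning rate for its distance from the BMU.

```haskell
import qualified Data.Map as Map

-- Illustrative 1-D "grid": node index -> scalar model.
type Grid1D = Map.Map Int Double

-- Sketch of neighbourhood training: each node is nudged toward the
-- target by the learning rate for its grid distance from the BMU.
trainNeighbourhood' :: (Int -> Double)  -- ^ rate as a function of distance
                    -> Int              -- ^ index of the BMU
                    -> Double           -- ^ target pattern
                    -> Grid1D -> Grid1D
trainNeighbourhood' rateAt bmu target =
  Map.mapWithKey (\k model ->
    let r = rateAt (abs (k - bmu))
    in model + r * (target - model))    -- blend model toward target

main :: IO ()
main = do
  let grid   = Map.fromList [(0, 0.0), (1, 0.0), (2, 0.0)]
      rateAt d = if d == 0 then 1.0 else 0.5
  -- The BMU (node 1) snaps to the target; its neighbours move halfway.
  print (Map.toList (trainNeighbourhood' rateAt 1 1.0 grid))
```

This is also where the `Metric p ~ LearningRate f` constraint in the real signature earns its keep: the learning rate must be expressed in the same numeric type as the pattern's metric so the blend is well-typed.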

incrementCounter :: Num t => SOM f t gm k p -> SOM f t gm k p

Increments the counter.

justTrain :: (Ord (Metric p), Pattern p, Grid (gm p), GridMap gm (Metric p), GridMap gm p, Index (BaseGrid gm (Metric p)) ~ Index (gm p), Index (BaseGrid gm p) ~ Index (gm p), LearningFunction f, Metric p ~ LearningRate f, Num (LearningRate f), Integral t) => SOM f t gm k p -> p -> SOM f t gm k p