Copyright  (c) Amy de Buitléir 2012-2014 
License  BSD-style 
Maintainer  amy@nualeargais.ie 
Stability  experimental 
Portability  portable 
Safe Haskell  Safe-Inferred 
Language  Haskell98 
A module containing private SSOM internals. Most developers should use SSOM instead. This module is subject to change without notice.
 class LearningFunction f where
 type LearningRate f
 rate :: f -> LearningRate f -> LearningRate f
 data Gaussian a = Gaussian a a a
 data SSOM f t k p = SSOM {
   sMap :: Map k p,
   learningFunction :: f,
   counter :: t
   }
 toMap :: SSOM f t k p -> Map k p
 trainNode :: (Pattern p, LearningFunction f, Metric p ~ LearningRate f, Num (LearningRate f), Ord k, Integral t) => SSOM f t k p -> k -> p -> SSOM f t k p
 incrementCounter :: Num t => SSOM f t k p -> SSOM f t k p
 justTrain :: (Ord (Metric p), Pattern p, LearningFunction f, Metric p ~ LearningRate f, Num (LearningRate f), Ord k, Integral t) => SSOM f t k p -> p -> SSOM f t k p
Documentation
class LearningFunction f where Source
A function used to adjust the models in a classifier.
type LearningRate f Source
rate :: f -> LearningRate f -> LearningRate f Source
rate f t returns the learning rate for a node.
The parameter f is the learning function.
The parameter t indicates how many patterns (or pattern
batches) have previously been presented to the classifier.
Typically this is used to make the learning rate decay over time.
The output is the learning rate for that node (the amount by
which the node's model should be updated to match the target).
The learning rate should be between zero and one.
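To make the class concrete, here is a self-contained toy instance. The class is redeclared so the sketch compiles on its own, and ConstantRate is a hypothetical example type, not part of the library:

```haskell
{-# LANGUAGE TypeFamilies #-}

-- The class as declared above, repeated so this sketch is self-contained.
class LearningFunction f where
  type LearningRate f
  rate :: f -> LearningRate f -> LearningRate f

-- A hypothetical learning function that ignores the time parameter
-- entirely and always returns the same rate.
newtype ConstantRate = ConstantRate Double

instance LearningFunction ConstantRate where
  type LearningRate ConstantRate = Double
  -- The second argument is the "time" t; a constant rate ignores it.
  rate (ConstantRate r) _t = r
```

A real learning function, such as Gaussian below, would instead use the time argument to decay the rate.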
(Floating a, Fractional a, Num a) => LearningFunction (Gaussian a) 
A typical learning function for classifiers.
Gaussian r0 rf tf returns a Gaussian learning function. At time zero,
the learning rate is r0. Over time the learning rate tapers off,
until at time tf, the learning rate is rf. Normally the
parameters should be chosen such that:

 0 < rf << r0 < 1
 0 < tf

where << means "is much smaller than" (not the Haskell <<
operator!)
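A decay with exactly these boundary conditions can be sketched as a standalone function. This mirrors the behaviour described above (rate r0 at time zero, rate rf at time tf), but it is an illustration, not the library's implementation, and gaussianRate is a hypothetical name:

```haskell
-- Gaussian-style decay: the rate is r0 at t = 0 and rf at t = tf.
-- Derived from r(t) = r0 * exp (-k * t^2), with k chosen so that
-- r(tf) = rf, i.e. k = log (r0 / rf) / tf^2.
gaussianRate :: Double -> Double -> Double -> Double -> Double
gaussianRate r0 rf tf t = r0 * exp (negate k * t * t)
  where k = log (r0 / rf) / (tf * tf)
```

For example, with r0 = 0.5, rf = 0.01 and tf = 100, the rate is 0.5 at t = 0 and tapers to approximately 0.01 at t = 100, staying strictly between the two in the interior.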
Gaussian a a a 
A Simplified Self-Organising Map (SSOM).
SSOM 
  sMap :: Map k p
  learningFunction :: f
  counter :: t
(Pattern p, Ord (Metric p), LearningFunction f, Metric p ~ LearningRate f, Num (LearningRate f), Ord k, Integral t) => Classifier (SSOM f t) k p  
(Eq f, Eq t, Eq k, Eq p) => Eq (SSOM f t k p)  
(Show f, Show t, Show k, Show p) => Show (SSOM f t k p)  
Generic (SSOM f t k p)  
type Rep (SSOM f t k p) 
toMap :: SSOM f t k p -> Map k p Source
Extracts the current models from the SSOM.
A synonym for sMap.
trainNode :: (Pattern p, LearningFunction f, Metric p ~ LearningRate f, Num (LearningRate f), Ord k, Integral t) => SSOM f t k p -> k -> p -> SSOM f t k p Source
Trains the specified node and the neighbourhood around it to better
match a target.
Most users should use train, which automatically determines
the BMU and trains it and its neighbourhood.
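The update that "training a node" performs can be sketched in isolation: move the chosen node's model a fraction of the way toward the target, where the fraction is the learning rate. This is a simplified stand-in using plain Doubles for models, and trainNodeSketch is a hypothetical name, not the package's trainNode:

```haskell
import qualified Data.Map as Map

-- Move node k's model a fraction r of the way toward the target,
-- leaving every other node in the map untouched.
trainNodeSketch :: Ord k
                => Double            -- learning rate r, between 0 and 1
                -> k                 -- key of the node to train
                -> Double            -- the target
                -> Map.Map k Double  -- current models
                -> Map.Map k Double
trainNodeSketch r k target = Map.adjust (\m -> m + r * (target - m)) k
```

For example, training node 'a' (model 0) toward a target of 10 with rate 0.5 moves its model to 5, while other nodes keep their models.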
incrementCounter :: Num t => SSOM f t k p -> SSOM f t k p Source
justTrain :: (Ord (Metric p), Pattern p, LearningFunction f, Metric p ~ LearningRate f, Num (LearningRate f), Ord k, Integral t) => SSOM f t k p -> p -> SSOM f t k p Source