Copyright	(c) Amy de Buitléir 2012-2015
License	BSD-style
Maintainer	amy@nualeargais.ie
Stability	experimental
Portability	portable
Safe Haskell	Safe
Language	Haskell98
A module containing private DSOM internals. Most developers should use DSOM instead. This module is subject to change without notice.
data DSOM gm x k p = DSOM
  { gridMap :: gm p
  , learningRate :: x -> x -> x -> x
  , difference :: p -> p -> x
  , makeSimilar :: p -> x -> p -> p
  }
withGridMap :: (gm p -> gm p) -> DSOM gm x k p -> DSOM gm x k p
toGridMap :: GridMap gm p => DSOM gm x k p -> gm p
adjustNode :: (FiniteGrid (gm p), GridMap gm p, k ~ Index (gm p), k ~ Index (BaseGrid gm p), Ord k, Num x, Fractional x) => gm p -> (p -> x -> p -> p) -> (p -> p -> x) -> (x -> x -> x) -> p -> k -> k -> p -> p
scaleDistance :: (Num a, Fractional a) => Int -> Int -> a
trainNeighbourhood :: (FiniteGrid (gm p), GridMap gm p, k ~ Index (gm p), k ~ Index (BaseGrid gm p), Ord k, Num x, Fractional x) => DSOM gm x t p -> k -> p -> DSOM gm x k p
justTrain :: (FiniteGrid (gm p), GridMap gm p, GridMap gm x, k ~ Index (gm p), k ~ Index (gm x), k ~ Index (BaseGrid gm p), k ~ Index (BaseGrid gm x), Ord k, Ord x, Num x, Fractional x) => DSOM gm x t p -> p -> DSOM gm x k p
rougierLearningFunction :: (Eq a, Ord a, Floating a) => a -> a -> a -> a -> a -> a
Documentation
A Dynamic Self-Organising Map (DSOM).
Although DSOM implements GridMap, most users will only need the interface provided by Data.Datamining.Clustering.Classifier. If you choose to use the GridMap functions, please note:

- The functions adjust and adjustWithKey do not increment the counter. You can do so manually with incrementCounter.
- The functions map and mapWithKey are not implemented (they just return an error). It would be problematic to implement them because the input DSOM and the output DSOM would have to have the same Metric type.
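To make the roles of the record's three function fields concrete, here is a toy re-statement using Double models. The names `diffD` and `makeSimilarD` are hypothetical stand-ins for the `difference` and `makeSimilar` fields, not part of the library:

```haskell
-- Toy illustrations (not the real library types) of the DSOM record's
-- function fields, specialised to Double models.

-- difference: how far apart two models are (here, absolute difference).
diffD :: Double -> Double -> Double
diffD a b = abs (a - b)

-- makeSimilar: nudge a model towards a target by a given amount,
-- where 0 means no change and 1 means an exact match.
makeSimilarD :: Double -> Double -> Double -> Double
makeSimilarD target amount model = model + amount * (target - model)
```

The `learningRate` field then supplies the `amount` passed to `makeSimilar` for each node during training.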
Constructors

DSOM

Instances

(GridMap gm p, (~) * k (Index (BaseGrid gm p)), FiniteGrid (gm p), GridMap gm x, (~) * k (Index (gm p)), (~) * k (Index (gm x)), (~) * k (Index (BaseGrid gm x)), Ord k, Ord x, Num x, Fractional x) => Classifier (DSOM gm) x k p Source
Foldable gm => Foldable (DSOM gm x k) Source
(Foldable gm, GridMap gm p, FiniteGrid (BaseGrid gm p)) => GridMap (DSOM gm x k) p Source
Generic (DSOM gm x k p) Source
NFData (gm p) => NFData (DSOM gm x k p) Source
Grid (gm p) => Grid (DSOM gm x k p) Source
type BaseGrid (DSOM gm x k) p = BaseGrid gm p Source
type Rep (DSOM gm x k p) Source
type Index (DSOM gm x k p) = Index (gm p) Source
type Direction (DSOM gm x k p) = Direction (gm p) Source
withGridMap :: (gm p -> gm p) -> DSOM gm x k p -> DSOM gm x k p Source
toGridMap :: GridMap gm p => DSOM gm x k p -> gm p Source
Extracts the grid and current models from the DSOM.
adjustNode :: (FiniteGrid (gm p), GridMap gm p, k ~ Index (gm p), k ~ Index (BaseGrid gm p), Ord k, Num x, Fractional x) => gm p -> (p -> x -> p -> p) -> (p -> p -> x) -> (x -> x -> x) -> p -> k -> k -> p -> p Source
scaleDistance :: (Num a, Fractional a) => Int -> Int -> a Source
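The signature suggests this helper maps an integer grid distance onto a fractional scale. A plausible sketch, assuming it normalises a distance `d` by the maximum possible distance `dMax` (the real implementation may differ; `scaleDistance'` is a hypothetical name):

```haskell
-- Hypothetical sketch: scale a grid distance d (0 <= d <= dMax) onto
-- the unit interval, so neighbourhood functions can work with values
-- in [0, 1] regardless of grid size.
scaleDistance' :: Fractional a => Int -> Int -> a
scaleDistance' d dMax
  | dMax == 0 = 0   -- degenerate grid with a single node
  | otherwise = fromIntegral d / fromIntegral dMax
```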
trainNeighbourhood :: (FiniteGrid (gm p), GridMap gm p, k ~ Index (gm p), k ~ Index (BaseGrid gm p), Ord k, Num x, Fractional x) => DSOM gm x t p -> k -> p -> DSOM gm x k p Source
Trains the specified node and the neighbourhood around it to better match a target. Most users should use train, which automatically determines the BMU and trains it and its neighbourhood.
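The overall shape of neighbourhood training can be sketched with a toy one-dimensional, list-based model (this is an illustration of the idea, not the library's grid-based code; all names are hypothetical). Each node's model is nudged towards the target by a rate computed from the BMU's error, the node's own error, and the node's scaled distance from the BMU:

```haskell
-- Toy neighbourhood training over a list of Double models.
trainToy
  :: (Double -> Double -> Double -> Double)  -- rate from (bmuDiff, diff, dist)
  -> Double                                  -- target
  -> Int                                     -- index of the BMU
  -> [Double]                                -- current models
  -> [Double]                                -- adjusted models
trainToy rateF target bmu models = zipWith adjust [0 ..] models
  where
    bmuDiff = abs (models !! bmu - target)
    maxDist = fromIntegral (length models - 1) :: Double
    adjust k m =
      let dist = if maxDist == 0
                   then 0
                   else abs (fromIntegral (k - bmu)) / maxDist
          r    = rateF bmuDiff (abs (m - target)) dist
      in m + r * (target - m)
```

With a constant rate of 1 every model jumps straight to the target; a function like rougierLearningFunction instead makes the rate fall off with distance from the BMU and with how good the BMU's model already is.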
justTrain :: (FiniteGrid (gm p), GridMap gm p, GridMap gm x, k ~ Index (gm p), k ~ Index (gm x), k ~ Index (BaseGrid gm p), k ~ Index (BaseGrid gm x), Ord k, Ord x, Num x, Fractional x) => DSOM gm x t p -> p -> DSOM gm x k p Source
rougierLearningFunction :: (Eq a, Ord a, Floating a) => a -> a -> a -> a -> a -> a Source
Configures a learning function that depends not on the time, but
on how good a model we already have for the target. If the
BMU is an exact match for the target, no learning occurs.
Usage is rougierLearningFunction r p, where r is the maximal learning rate (0 <= r <= 1), and p is the elasticity.
NOTE: When using this learning function, ensure that
abs . difference
is always between 0 and 1, inclusive. Otherwise
you may get invalid learning rates.
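A sketch of a Rougier-style learning function matching the description above, based on the DSOM rule from Rougier and Boniface's paper (the library's actual implementation may differ in detail; `rougier` is a hypothetical name):

```haskell
-- Rougier-style learning rate: no learning when the BMU already
-- matches the target exactly; otherwise a Gaussian-shaped falloff
-- whose width shrinks as the BMU's error shrinks.
rougier :: (Eq a, Ord a, Floating a)
        => a   -- r: maximal learning rate, 0 <= r <= 1
        -> a   -- p: elasticity
        -> a   -- bmuDiff: difference between the BMU's model and the target
        -> a   -- diff: difference between this node's model and the target
        -> a   -- dist: scaled grid distance of this node from the BMU
        -> a
rougier r p bmuDiff diff dist
  | bmuDiff == 0 = 0                        -- exact match: no learning
  | otherwise    = r * abs diff * exp (-k * k)
  where k = dist / (p * abs bmuDiff)
```

This also shows why abs . difference must stay in [0, 1]: the factor abs diff directly scales the learning rate, so differences larger than 1 could push the rate above r.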