goal-geometry-0.20: The basic geometric type system of Goal
Safe Haskell: None
Language: Haskell2010

Goal.Geometry.Differential

Description

Tools for modelling the differential and Riemannian geometry of a Manifold.


Riemannian Manifolds

class (Primal c, Manifold x) => Riemannian c x where Source #

Riemannian Manifolds are differentiable Manifolds equipped with a smoothly varying Tensor known as the Riemannian metric. flat applies this metric to elements of the Primal space, and sharp applies its inverse to elements of the Dual space.

Minimal complete definition

metric

Methods

metric :: (c # x) -> c #* Tensor x x Source #

flat :: (c # x) -> (c # x) -> c #* x Source #

sharp :: (c # x) -> (c #* x) -> c # x Source #
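
In coordinates, writing the metric at a Point p as a matrix G(p), these operations amount to lowering and raising indices; the following is a restatement of standard Riemannian geometry rather than library notation:

\[ \mathrm{flat}(p, v) = G(p)\, v, \qquad \mathrm{sharp}(p, w) = G(p)^{-1} w, \qquad \mathrm{sharp}(p, \mathrm{flat}(p, v)) = v. \]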

euclideanDistance :: Manifold x => (c # x) -> (c # x) -> Double Source #

Distance between two Points based on the Euclidean metric (l2 distance).
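
In coordinates this is simply (a restatement, not library notation):

\[ d(p, q) = \sqrt{\textstyle\sum_i (p_i - q_i)^2}, \]

where \(p_i\) and \(q_i\) are the coordinates of the two Points.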

Backpropagation

class Map c f y x => Propagate c f y x where Source #

A class of Maps which can propagate errors. That is, given an error derivative on the output, the input which caused the output, and a Map to differentiate, propagate returns the derivative of the error with respect to the parameters of the Map, as well as the output of the Map.

Methods

propagate Source #

Arguments

:: [c #* y]

The error differentials, one per input

-> [c #* x]

A list of inputs

-> (c # f y x)

The function to differentiate

-> (c #* f y x, [c # y])

The derivative, and the function outputs
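
Loosely, for a Map with parameters \(\theta\), outputs \(y_k = f_\theta(x_k)\), and an error \(E\) accumulated over the batch, propagate implements the chain rule; the following is a sketch in coordinate notation, not the library's notation:

\[ \frac{\partial E}{\partial \theta} \;=\; \sum_k \left( \frac{\partial y_k}{\partial \theta} \right)^{\!\top} \frac{\partial E}{\partial y_k}, \]

where the \(\partial E / \partial y_k\) are the supplied error differentials and the \(y_k\) are returned alongside the parameter derivative. Whether the batch contribution is summed or averaged is a library convention not asserted here.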

Instances

(Bilinear Tensor y x, Primal c) => Propagate c Tensor y x Source # 
Instance details

Defined in Goal.Geometry.Differential

Methods

propagate :: [c #* y] -> [c #* x] -> (c # Tensor y x) -> (c #* Tensor y x, [c # y]) Source #

(Translation z y, Map c (Affine f y) z x, Propagate c f y x) => Propagate c (Affine f y) z x Source # 
Instance details

Defined in Goal.Geometry.Differential

Methods

propagate :: [c #* z] -> [c #* x] -> (c # Affine f y z x) -> (c #* Affine f y z x, [c # z]) Source #

(Propagate c f z y, Propagate c (NeuralNetwork gys g) y x, Map c f y z, Transition c (Dual c) y, Legendre y, Riemannian c y, Bilinear f z y) => Propagate c (NeuralNetwork ('(g, y) ': gys) f) z x Source # 
Instance details

Defined in Goal.Geometry.Map.NeuralNetwork

Methods

propagate :: [c #* z] -> [c #* x] -> (c # NeuralNetwork ('(g, y) ': gys) f z x) -> (c #* NeuralNetwork ('(g, y) ': gys) f z x, [c # z]) Source #

Propagate c f z x => Propagate c (NeuralNetwork ('[] :: [(Type -> Type -> Type, Type)]) f) z x Source # 
Instance details

Defined in Goal.Geometry.Map.NeuralNetwork

Methods

propagate :: [c #* z] -> [c #* x] -> (c # NeuralNetwork '[] f z x) -> (c #* NeuralNetwork '[] f z x, [c # z]) Source #

KnownConvolutional rd r c z x => Propagate a (Convolutional rd r c) z x Source # 
Instance details

Defined in Goal.Geometry.Map.Linear.Convolutional

Methods

propagate :: [a #* z] -> [a #* x] -> (a # Convolutional rd r c z x) -> (a #* Convolutional rd r c z x, [a # z]) Source #

backpropagation :: Propagate c f y x => (a -> (c # y) -> c #* y) -> [(a, c #* x)] -> (c # f y x) -> c #* f y x Source #

An implementation of backpropagation using the Propagate class. The first argument is a function which takes a generalized target output and a function output and returns an error differential. The second argument is a list of target outputs and function inputs. The third argument is the parametric function to be optimized; its differential is what is returned.
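
As a minimal sketch of how these pieces fit together, the following wires backpropagation to a squared-error criterion. It is not verbatim library code: the coordinates accessor, the Point constructor, and S.zipWith are assumptions about the goal-core/goal-geometry API, and treating the output error differential as a coordinate-wise difference is only literally valid when the Dual chart shares the output's coordinates (e.g. Euclidean space in Cartesian coordinates).

{-# LANGUAGE ScopedTypeVariables, TypeOperators #-}

import Goal.Core
import qualified Goal.Core.Vector.Storable as S
import Goal.Geometry

-- Differential of the total squared error in the parameters of a Map,
-- given (target, input) pairs. A sketch under the assumptions stated above.
mseBackprop
    :: forall c f y x . Propagate c f y x
    => [(c # y, c #* x)] -- ^ (target output, input) pairs
    -> (c # f y x)       -- ^ The parametric function to optimize
    -> c #* f y x        -- ^ The differential of the squared error
mseBackprop = backpropagation mseDifferential
  where
    -- d/dy of 0.5 * ||y - t||^2 is (y - t), written on raw coordinates.
    mseDifferential :: (c # y) -> (c # y) -> c #* y
    mseDifferential tgt out =
        Point $ S.zipWith (-) (coordinates out) (coordinates tgt)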

Legendre Manifolds

type family PotentialCoordinates x :: Type Source #

The (natural) coordinates of the given Manifold, on which the potential is defined.

Instances

type PotentialCoordinates (x, y) Source # 
Instance details

Defined in Goal.Geometry.Differential

type PotentialCoordinates (Replicated k x) Source # 
Instance details

Defined in Goal.Geometry.Differential

class (Primal (PotentialCoordinates x), Manifold x) => Legendre x where Source #

Although convex analysis is usually developed separately from differential geometry, it arises naturally out of the theory of dually flat Manifolds (Amari and Nagaoka, 2000).

A Manifold is Legendre if it is associated with a particular convex function known as a potential.
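
In the standard development of dually flat geometry, the potential \(\psi\) generates the dual structure through its derivatives; the following is a restatement of the general theory (Amari and Nagaoka, 2000), not of this module's internals:

\[ \eta = \nabla \psi(\theta), \qquad G(\theta) = \nabla^2 \psi(\theta), \]

where \(\theta\) are the PotentialCoordinates of a Point, \(\eta\) its dual coordinates, and \(G\) the induced Riemannian metric.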

Instances

(Legendre x, Legendre y, PotentialCoordinates x ~ PotentialCoordinates y) => Legendre (x, y) Source # 
Instance details

Defined in Goal.Geometry.Differential

Methods

potential :: (PotentialCoordinates (x, y) # (x, y)) -> Double Source #

(Legendre x, KnownNat k) => Legendre (Replicated k x) Source # 
Instance details

Defined in Goal.Geometry.Differential

class Legendre x => DuallyFlat x where Source #

A Manifold is DuallyFlat when we can describe the dualPotential, which is the convex conjugate of potential.
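
The convex conjugate in question is the Legendre transform; in standard notation (not the library's):

\[ \phi(\eta) \;=\; \sup_{\theta} \big\{ \langle \theta, \eta \rangle - \psi(\theta) \big\}, \]

so that \(\eta = \nabla\psi(\theta)\) and \(\theta = \nabla\phi(\eta)\) are mutually inverse coordinate transformations.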

Instances

(DuallyFlat x, KnownNat k) => DuallyFlat (Replicated k x) Source # 
Instance details

Defined in Goal.Geometry.Differential

canonicalDivergence :: DuallyFlat x => (PotentialCoordinates x # x) -> (PotentialCoordinates x #* x) -> Double Source #

Computes the canonicalDivergence between two points. Note that relative to the typical definition of the KL-Divergence/relative entropy, the arguments of this function are flipped.
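
In terms of the potential \(\psi\), its conjugate \(\phi\), and the primal/dual coordinates of the two arguments, the canonical divergence of a dually flat Manifold takes the standard form (a restatement, not library notation):

\[ D(p \,\|\, q) \;=\; \psi(\theta_p) + \phi(\eta_q) - \langle \theta_p, \eta_q \rangle, \]

which also explains the argument types: the first Point is supplied in the PotentialCoordinates and the second in their Dual.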

Automatic Differentiation

differential :: Manifold x => (forall a. RealFloat a => Vector (Dimension x) a -> a) -> (c # x) -> c #* x Source #

Computes the differential of a function of the coordinates at a point using automatic differentiation.
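
A minimal sketch of its use. Cartesian and Euclidean are the usual goal-geometry chart and manifold, but the fromTuple point constructor is an assumption about the API, and the sketch assumes the coordinate vector handed to the function is a boxed, Foldable vector:

{-# LANGUAGE DataKinds, TypeOperators #-}

import Goal.Geometry

-- The differential of f(x) = sum_i x_i^2 at p should have coordinates 2 * p_i,
-- so gradExample should hold the coordinates (2, 4, 6).
gradExample :: Cartesian #* Euclidean 3
gradExample = differential (\xs -> sum $ (^ (2 :: Int)) <$> xs) p
  where
    p :: Cartesian # Euclidean 3
    p = fromTuple (1, 2, 3) -- 'fromTuple' is an assumed point constructor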

hessian Source #

Arguments

:: Manifold x 
=> (forall a. RealFloat a => Vector (Dimension x) a -> a) 
-> (c # x) 
-> c #* Tensor x x

The Hessian

Computes the Hessian of a function at a point with automatic differentiation.
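
Concretely (a restatement, not library notation), the returned Tensor holds the matrix of second partial derivatives of the function, evaluated at the coordinates of the given Point:

\[ H_{ij} \;=\; \frac{\partial^2 f}{\partial x_i \, \partial x_j}\bigg|_{x = p}. \]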