hasktorch-indef-0.0.1.0: Core Hasktorch abstractions wrapping FFI bindings

Copyright: (c) Sam Stites 2017
License: BSD3
Maintainer: sam@stites.io
Stability: experimental
Portability: non-portable
Safe Haskell: None
Language: Haskell2010

Torch.Indef.Dynamic.NN.Activation

Description

DYNAMIC-NN MODULE WARNING: this module is mostly unfinished and undocumented. In essence, it provides direct calls to the Torch neural network libraries, THNN and THCUNN. Because the dynamic tensor code requires many runtime checks, which in turn call for careful API design, the recommended route is to use static tensors, which have a much more natural API and are inherently safer.

Documentation

_threshold_updateOutput :: Dynamic -> Dynamic -> Double -> Double -> Bool -> IO () Source #

threshold forward pass (updates the output tensor)

_threshold_updateGradInput :: Dynamic -> Dynamic -> Dynamic -> Double -> Double -> Bool -> IO () Source #

threshold backward pass (updates the gradient of the input tensor)
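
The underscore-prefixed functions mutate one of their tensor arguments in place rather than returning a new tensor. Below is a minimal sketch of using the threshold pair to implement a plain ReLU (threshold 0, replacement value 0). The mkTensor allocator and the Torch.Indef.Types import path are hypothetical stand-ins for whatever constructors your build exposes, and the argument order is an assumption based on THNN's Threshold_updateOutput and Threshold_updateGradInput.

import Torch.Indef.Dynamic.NN.Activation
  (_threshold_updateOutput, _threshold_updateGradInput)
import Torch.Indef.Types (Dynamic)   -- import path assumed; adjust to your build

-- Hypothetical allocator standing in for a real Dynamic tensor constructor.
mkTensor :: IO Dynamic
mkTensor = undefined

-- ReLU as threshold(0, 0): forward writes into 'output',
-- backward writes into 'gradInput'.
reluStep :: Dynamic -> Dynamic -> IO (Dynamic, Dynamic)
reluStep input gradOutput = do
  output    <- mkTensor
  gradInput <- mkTensor
  -- assumed THNN order: input, output, threshold, replacement value, inplace
  _threshold_updateOutput input output 0 0 False
  -- assumed THNN order: input, gradOutput, gradInput, threshold, value, inplace
  _threshold_updateGradInput input gradOutput gradInput 0 0 False
  pure (output, gradInput)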

_pReLU_updateOutput :: Dynamic -> Dynamic -> Dynamic -> IO () Source #

pReLU forward pass (updates the output tensor)

_pReLU_updateGradInput :: Dynamic -> Dynamic -> Dynamic -> Dynamic -> IO () Source #

pReLU backward pass (updates the gradient of the input tensor)

_pReLU_accGradParameters :: Dynamic -> Dynamic -> Dynamic -> Dynamic -> Dynamic -> Double -> IO () Source #

pReLU parameter-gradient accumulation (updates the gradient of the weight tensor). Called accGradParameters in C to indicate that it accumulates the parameter gradients.
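
A sketch of a full pReLU step: forward pass, gradient of the input, and accumulation of the weight gradient with scale 1.0. As above, mkTensor and the Dynamic import path are hypothetical placeholders, and the argument orders are assumptions based on the corresponding THNN calls (PReLU_updateOutput, PReLU_updateGradInput, PReLU_accGradParameters).

import Torch.Indef.Dynamic.NN.Activation
  (_pReLU_updateOutput, _pReLU_updateGradInput, _pReLU_accGradParameters)
import Torch.Indef.Types (Dynamic)   -- import path assumed; adjust to your build

-- Hypothetical allocator standing in for a real Dynamic tensor constructor.
mkTensor :: IO Dynamic
mkTensor = undefined

preluStep :: Dynamic -> Dynamic -> Dynamic -> IO ()
preluStep input weight gradOutput = do
  output     <- mkTensor
  gradInput  <- mkTensor
  gradWeight <- mkTensor
  -- forward: assumed order (input, output, weight)
  _pReLU_updateOutput input output weight
  -- backward w.r.t. the input: assumed order (input, gradOutput, gradInput, weight)
  _pReLU_updateGradInput input gradOutput gradInput weight
  -- accumulate the weight gradient, scaled by 1.0:
  -- assumed order (input, gradOutput, gradInput, weight, gradWeight, scale)
  _pReLU_accGradParameters input gradOutput gradInput weight gradWeight 1.0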

_rReLU_updateOutput :: Dynamic -> Dynamic -> Dynamic -> Double -> Double -> Bool -> Bool -> Generator -> IO () Source #

rReLU forward pass (updates the output tensor)

_rReLU_updateGradInput :: Dynamic -> Dynamic -> Dynamic -> Dynamic -> Double -> Double -> Bool -> Bool -> IO () Source #

rReLU backward pass (updates the gradient of the input tensor)
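
A sketch of the randomized ReLU forward and backward passes in training mode, using the common slope range [1/8, 1/3]; the noise tensor is a buffer for the sampled negative slopes. The mkTensor allocator and the import path for Dynamic and Generator are hypothetical, and the argument orders are assumptions based on THNN's RReLU_updateOutput and RReLU_updateGradInput.

import Torch.Indef.Dynamic.NN.Activation
  (_rReLU_updateOutput, _rReLU_updateGradInput)
import Torch.Indef.Types (Dynamic, Generator)  -- import path assumed

-- Hypothetical allocator standing in for a real Dynamic tensor constructor.
mkTensor :: IO Dynamic
mkTensor = undefined

rreluStep :: Dynamic -> Dynamic -> Generator -> IO (Dynamic, Dynamic)
rreluStep input gradOutput gen = do
  output    <- mkTensor
  noise     <- mkTensor   -- holds the slopes sampled during the forward pass
  gradInput <- mkTensor
  -- assumed order: input, output, noise, lower, upper, train, inplace, generator
  _rReLU_updateOutput input output noise (1/8) (1/3) True False gen
  -- assumed order: input, gradOutput, gradInput, noise, lower, upper, train, inplace
  _rReLU_updateGradInput input gradOutput gradInput noise (1/8) (1/3) True False
  pure (output, gradInput)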

_leakyReLU_updateOutput :: Dynamic -> Dynamic -> Double -> Bool -> IO () Source #

leakyReLU forward pass (updates the output tensor)

_leakyReLU_updateGradInput :: Dynamic -> Dynamic -> Dynamic -> Double -> Bool -> IO () Source #

leakyReLU backward pass (updates the gradient of the input tensor)
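
A sketch of leaky ReLU with a negative slope of 0.01, not applied in place. Same caveats as above: mkTensor and the Dynamic import path are hypothetical, and the argument order is assumed from THNN's LeakyReLU_updateOutput and LeakyReLU_updateGradInput.

import Torch.Indef.Dynamic.NN.Activation
  (_leakyReLU_updateOutput, _leakyReLU_updateGradInput)
import Torch.Indef.Types (Dynamic)   -- import path assumed; adjust to your build

-- Hypothetical allocator standing in for a real Dynamic tensor constructor.
mkTensor :: IO Dynamic
mkTensor = undefined

leakyReluStep :: Dynamic -> Dynamic -> IO (Dynamic, Dynamic)
leakyReluStep input gradOutput = do
  output    <- mkTensor
  gradInput <- mkTensor
  -- assumed order: input, output, negative slope, inplace
  _leakyReLU_updateOutput input output 0.01 False
  -- assumed order: input, gradOutput, gradInput, negative slope, inplace
  _leakyReLU_updateGradInput input gradOutput gradInput 0.01 False
  pure (output, gradInput)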

_eLU_updateOutput :: Dynamic -> Dynamic -> Double -> Double -> Bool -> IO () Source #

eLU forward pass (updates the output tensor)

_eLU_updateGradInput :: Dynamic -> Dynamic -> Dynamic -> Double -> Double -> IO () Source #

eLU backward pass (updates the gradient of the input tensor)
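
A sketch of the eLU forward pass with alpha = 1.0 and output scale = 1.0, not in place. As with the other examples, mkTensor and the Dynamic import path are hypothetical, and the (input, output, alpha, scale, inplace) argument order is an assumption based on THNN's ELU_updateOutput.

import Torch.Indef.Dynamic.NN.Activation (_eLU_updateOutput)
import Torch.Indef.Types (Dynamic)   -- import path assumed; adjust to your build

-- Hypothetical allocator standing in for a real Dynamic tensor constructor.
mkTensor :: IO Dynamic
mkTensor = undefined

eluForward :: Dynamic -> IO Dynamic
eluForward input = do
  output <- mkTensor
  -- assumed order: input, output, alpha, scale, inplace
  _eLU_updateOutput input output 1.0 1.0 False
  pure output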