| Copyright | (c) Huw Campbell 2016-2017 |
|---|---|
| License | BSD2 |
| Stability | experimental |
| Safe Haskell | None |
| Language | Haskell98 |
Grenade.Layers.Relu

Documentation
A rectified linear unit. An activation layer which can act between any two shapes of the same dimension, applying `max 0 x` to every neuron individually (acting as a diode: values pass through unchanged when positive and are clamped to zero otherwise).
Constructors
| Relu |
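Because `Relu` has instances for `D1`, `D2`, and `D3` shapes, it can be dropped between any two identically-shaped layers in a network type. A minimal sketch of such a network, following the style of network definitions in the grenade README (the layer sizes here are illustrative, not from this module):

```haskell
{-# LANGUAGE DataKinds #-}

import Grenade

-- Relu sits between a fully connected layer producing a ('D1 10)
-- vector and the next layer consuming that same shape: the shape
-- before and after Relu is always identical.
type Net = Network '[ FullyConnected 2 10, Relu, FullyConnected 10 1, Logit ]
                   '[ 'D1 2, 'D1 10, 'D1 10, 'D1 1, 'D1 1 ]
```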
Instances
- Show Relu
- Serialize Relu
- UpdateLayer Relu
- KnownNat i => Layer Relu (D1 i) (D1 i)
- (KnownNat i, KnownNat j) => Layer Relu (D2 i j) (D2 i j)
- (KnownNat i, KnownNat j, KnownNat k) => Layer Relu (D3 i j k) (D3 i j k)
- type Gradient Relu
- type Tape Relu (D1 i) (D1 i)
- type Tape Relu (D2 i j) (D2 i j)
- type Tape Relu (D3 i j k) (D3 i j k)
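The `Layer` instances above implement the same element-wise function at every shape. A standalone sketch (no grenade dependency, plain `base` Haskell) of what the forward and backward passes compute, where the stored input plays the role of the `Tape`:

```haskell
-- Forward pass: max 0 applied element-wise.
relu :: (Ord a, Num a) => a -> a
relu = max 0

-- Backward pass: the incoming gradient dy flows through only where
-- the remembered input x (the Tape) was positive; elsewhere it is 0.
reluBackward :: (Ord a, Num a) => a -> a -> a
reluBackward x dy = if x > 0 then dy else 0

main :: IO ()
main = do
  print (map relu [-2, -0.5, 0, 3 :: Double])           -- [0.0,0.0,0.0,3.0]
  print (zipWith reluBackward [-2, 1] [5, 5 :: Double]) -- [0.0,5.0]
```

This also makes clear why `Gradient Relu` carries no information: the layer has no trainable parameters, so `UpdateLayer` has nothing to update.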