class Num::NN::EluLayer(T)
- Num::NN::EluLayer(T)
- Num::NN::Layer(T)
- Reference
- Object
Overview
The Exponential Linear Unit, widely known as ELU, is an activation function that tends to converge the cost toward zero faster and produce more accurate results. Unlike most other activation functions, ELU takes an extra alpha constant, which should be a positive number.
ELU is very similar to ReLU except for negative inputs: both act as the identity function for non-negative inputs. For negative inputs, however, ELU smoothly saturates until its output equals -α, whereas ReLU bends sharply to zero.
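For reference, the standard piecewise definition of ELU, sketched below as a scalar Crystal function. This is general background on the activation, not this layer's actual implementation, and the helper name elu_scalar is hypothetical:
# Reference sketch: elementwise ELU following the standard definition
# ELU(x) = x for x > 0, alpha * (exp(x) - 1) otherwise.
def elu_scalar(x : Float64, alpha : Float64 = 0.01) : Float64
  x > 0 ? x : alpha * (Math.exp(x) - 1)
end
elu_scalar(2.0)  # => 2.0 (identity for non-negative inputs)
elu_scalar(-3.0) # => about -0.0095, approaching -alpha as x becomes very negative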
Defined in:
nn/layers/elu.cr
Constructors
- .new(context : Num::Grad::Context(T), output_shape : Array(Int32), alpha : Float32 | Float64 = 0.01)
Initializes an ELU activation layer as part of a Num::NN::Network
Instance Method Summary
- #forward(input : Num::Grad::Variable(T)) : Num::Grad::Variable(T)
Computes a forward pass through an ELU layer.
- #output_shape : Array(Int32)
Instance methods inherited from class Num::NN::Layer(T)
forward(input : Num::Grad::Variable(T)), output_shape : Array(Int32), variables : Array(Num::Grad::Variable(T))
Constructor Detail
def self.new(context : Num::Grad::Context(T), output_shape : Array(Int32), alpha : Float32 | Float64 = 0.01)
Initializes an ELU activation layer as part of a Num::NN::Network
Arguments
- context : Num::Grad::Context(T) - Context of the Num::NN::Network, used only to determine the generic type of the Num::NN::Layer(T)
- output_shape : Array(Int32) - The shape of the output of the layer
- alpha : Float - Scale for the negative factor
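A minimal construction sketch. The concrete tensor type Tensor(Float32, CPU(Float32)) and the zero-argument Num::Grad::Context constructor are assumptions based on the library's autograd examples, not taken from this page:
# Assumed generic type: a CPU-backed Float32 tensor.
ctx = Num::Grad::Context(Tensor(Float32, CPU(Float32))).new
# Build the layer with the documented constructor:
# an output shape of [3] and the default alpha of 0.01.
elu = Num::NN::EluLayer(Tensor(Float32, CPU(Float32))).new(ctx, [3], 0.01)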
Instance Method Detail
def forward(input : Num::Grad::Variable(T)) : Num::Grad::Variable(T)
Computes a forward pass through an ELU layer.
Arguments
- input : Num::Grad::Variable(T) - Variable to activate
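Continuing the construction sketch above, a hedged usage example. The Array#to_tensor conversion, the Context#variable helper, and the Variable#value accessor are assumed from the library's autograd examples; treat the exact calls as assumptions:
# Wrap an input tensor in a gradient-tracked variable (helper name assumed).
x = ctx.variable([0.5_f32, -1.2_f32, 2.0_f32].to_tensor)
# Apply the ELU activation; negative entries are smoothly pushed toward -alpha.
y = elu.forward(x)
puts y.value # value accessor assumed to return the underlying tensor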