class Num::NN::EluLayer(T)

Overview

The Exponential Linear Unit, widely known as ELU, is an activation function that tends to converge the cost toward zero faster and to produce more accurate results. Unlike other activation functions, ELU takes an extra alpha constant, which should be a positive number.

ELU is very similar to ReLU except for negative inputs: both act as the identity function for non-negative inputs. For negative inputs, however, ELU saturates smoothly toward -α, whereas ReLU cuts off sharply at zero.
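
For reference, here is a minimal scalar sketch of the ELU function itself in plain Crystal. This is illustration only, not the layer's internal kernel; the alpha default of 1.0 below is the conventional textbook choice, while this layer defaults to 0.01.

    # ELU: identity for non-negative inputs, smooth exponential
    # saturation toward -alpha for negative inputs.
    def elu(x : Float64, alpha : Float64 = 1.0) : Float64
      x >= 0 ? x : alpha * (Math.exp(x) - 1.0)
    end

    puts elu(2.0)  # => 2.0  (identity branch)
    puts elu(-4.0) # => approximately -0.98, approaching -alpha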

Defined in:

nn/layers/elu.cr

Constructors

.new(context : Num::Grad::Context(T), output_shape : Array(Int32), alpha : Float32 | Float64 = 0.01)

Instance Method Summary

#forward(input : Num::Grad::Variable(T)) : Num::Grad::Variable(T)
#output_shape : Array(Int32)

Instance methods inherited from class Num::NN::Layer(T)

forward(input : Num::Grad::Variable(T)), output_shape : Array(Int32), variables : Array(Num::Grad::Variable(T))

Constructor Detail

def self.new(context : Num::Grad::Context(T), output_shape : Array(Int32), alpha : Float32 | Float64 = 0.01)

Initializes an ELU activation layer as part of a Num::NN::Network

Arguments

context : Num::Grad::Context(T) - the gradient context that will own this layer's operations
output_shape : Array(Int32) - shape of the layer's output, which for an element-wise activation matches its input shape
alpha : Float32 | Float64 - the positive alpha constant used to scale negative inputs (defaults to 0.01)
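
A minimal construction sketch, assuming a num.cr-style gradient context; the generic tensor parameter (here Tensor(Float64, CPU(Float64))) is an assumption and may differ between library versions.

    require "num"

    # Gradient context that tracks the layer's computation graph
    # (generic tensor type is an assumption, adjust to your setup).
    ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new

    # ELU layer with a 3-element output shape and an explicit alpha constant.
    layer = Num::NN::EluLayer(Tensor(Float64, CPU(Float64))).new(ctx, [3], 0.01)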

Instance Method Detail

def forward(input : Num::Grad::Variable(T)) : Num::Grad::Variable(T)

Computes a forward pass through an ELU layer.

Arguments

input : Num::Grad::Variable(T) - the variable to which the ELU activation is applied element-wise
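
A minimal forward-pass sketch under the same assumptions as the constructor example above (num.cr-style context, Array#to_tensor, and Variable#value; exact APIs may differ between versions).

    require "num"

    ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new

    # Wrap an input tensor in a gradient-tracking variable.
    x = ctx.variable([-2.0, 0.0, 3.0].to_tensor)

    layer = Num::NN::EluLayer(Tensor(Float64, CPU(Float64))).new(ctx, [3])

    # Non-negative entries pass through unchanged; negative entries are
    # mapped smoothly toward -alpha.
    y = layer.forward(x)
    puts y.value
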
def output_shape : Array(Int32)

Returns the shape of this layer's output.
