class Num::NN::DropoutLayer(T)

Overview

Dilution (also called Dropout) is a regularization technique for reducing overfitting in artificial neural networks by preventing complex co-adaptations on training data. It is an efficient way of performing model averaging with neural networks. The term dilution refers to the thinning of the weights. The term dropout refers to randomly "dropping out", or omitting, units (both hidden and visible) during the training process of a neural network. Both the thinning of weights and dropping out units trigger the same type of regularization, and often the term dropout is used when referring to the dilution of weights.
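As a concrete illustration, the sketch below applies inverted dropout to a plain Crystal array: each value is zeroed with probability prob, and survivors are scaled by 1 / (1 - prob) so the expected activation is unchanged. This is a standalone sketch of the technique, not the library's implementation:

    prob = 0.5
    input = [0.2, 0.7, 1.5, 0.9, 0.3]

    scale = 1.0 / (1.0 - prob)
    output = input.map do |value|
      # Drop each value with probability `prob`; rescale survivors
      # so the expected value of the output matches the input.
      Random.rand < prob ? 0.0 : value * scale
    end

    puts output # e.g. [0.4, 0.0, 3.0, 1.8, 0.0]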

Defined in:

nn/layers/dropout.cr

Constructors

  • .new(context : Num::Grad::Context(T), output_shape : Array(Int32), prob = 0.5_f32)

Instance Method Summary

  • #forward(input : Num::Grad::Variable(T)) : Num::Grad::Variable(T)
  • #output_shape : Array(Int32)

Instance methods inherited from class Num::NN::Layer(T)

  • forward(input : Num::Grad::Variable(T))
  • output_shape : Array(Int32)
  • variables : Array(Num::Grad::Variable(T))

Constructor Detail

def self.new(context : Num::Grad::Context(T), output_shape : Array(Int32), prob = 0.5_f32)

Initializes a dropout layer in a Num::NN::Network(T).

Arguments

  • context : Num::Grad::Context(T) - Context associated with the network, used only to determine the generic type
  • output_shape : Array(Int32) - The cached output shape of the layer
  • prob : Float32 - Probability that each value is dropped during a forward pass

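For illustration, a layer can be constructed directly with this signature. This is a minimal sketch, assuming num.cr's usual tensor type parameter (Tensor(Float32, CPU(Float32))); the exact generic may differ between versions:

    require "num"

    # Context tied to the network; determines the generic type T.
    ctx = Num::Grad::Context(Tensor(Float32, CPU(Float32))).new

    # Dropout layer that drops each value with probability 0.5
    # and caches [64] as its output shape.
    layer = Num::NN::DropoutLayer(Tensor(Float32, CPU(Float32))).new(ctx, [64], prob: 0.5_f32)

In typical use the layer is created for you when building a Num::NN::Network(T) rather than constructed by hand.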

Instance Method Detail

def forward(input : Num::Grad::Variable(T)) : Num::Grad::Variable(T)

Computes the forward pass of a Num::NN::Network. This randomly zeroes a fraction of the input variable's values, each with probability prob, and rescales the remaining values to account for the removal.

Arguments

  • input : Num::Grad::Variable(T) - Variable to pass through the dropout layer
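A usage sketch, continuing from the constructor example above. The Tensor.random helper and Context#variable reflect num.cr's typical API but are assumptions here and may differ between versions:

    # Wrap a random (2, 64) tensor in a gradient-tracked variable.
    x = ctx.variable(Tensor.random(0.0_f32..1.0_f32, [2, 64]))

    # Roughly half of the values in y are zeroed; the survivors
    # are rescaled so the expected activation is preserved.
    y = layer.forward(x)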
def output_shape : Array(Int32)

Returns the cached output shape of the layer.
