class Num::NN::AdamOptimizer(T)
Num::NN::AdamOptimizer(T) < Num::NN::Optimizer(T) < Reference < Object
Overview
Adam (short for Adaptive Moment Estimation) is an update to the RMSProp optimizer. In this optimization algorithm, running averages of both the gradients and the second moments of the gradients are used.
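For reference, the standard Adam update for a parameter θ with gradient g_t at step t is shown below. This is the textbook formulation from the Adam paper, given here for context; it is not copied from this library's source.

```latex
% Standard Adam update (alpha = learning_rate, beta_1 = beta1,
% beta_2 = beta2, epsilon = epsilon).
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t   &&\text{(running mean of gradients)}\\
v_t &= \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2 &&\text{(running mean of squared gradients)}\\
\hat{m}_t &= \frac{m_t}{1 - \beta_1^t}, \quad
\hat{v}_t = \frac{v_t}{1 - \beta_2^t}          &&\text{(bias correction)}\\
\theta_t &= \theta_{t-1} - \alpha\,\frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
\end{aligned}
```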
Defined in:
nn/optimizers/adam.cr

Constructors
- .new(learning_rate : Float64 = 0.001, beta1 : Float64 = 0.9, beta2 : Float64 = 0.999, epsilon : Float64 = 1e-8)
  Initializes an Adam optimizer, disconnected from a network.
Instance Method Summary
- #build_params(l : Array(Layer(T)))
  Adds variables from a Num::NN::Network to the optimizer, to be tracked and updated after each forward pass through a network.
- #update
  Updates all Num::Grad::Variables registered to the optimizer based on weights present in the network and the parameters of the optimizer.
Instance methods inherited from class Num::NN::Optimizer(T)
build_params(l : Array(Layer(T))), learning_rate : Float64, params : Array(Num::Grad::Variable(T)), update
Constructor methods inherited from class Num::NN::Optimizer(T)
new(learning_rate : Float64 = 0.01)
Constructor Detail
def self.new(learning_rate : Float64 = 0.001, beta1 : Float64 = 0.9, beta2 : Float64 = 0.999, epsilon : Float64 = 1e-8)

Initializes an Adam optimizer, disconnected from a network. To link this optimizer to a Num::NN::Network, call #build_params, which registers each variable in the network's computational graph with this optimizer (see the example after the argument list).
Arguments
- learning_rate : Float - Learning rate of the optimizer
- beta1 : Float - The exponential decay rate for the 1st moment estimates
- beta2 : Float - The exponential decay rate for the 2nd moment estimates
- epsilon : Float - A small constant for numerical stability
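A minimal construction sketch follows. The generic parameter T is assumed here to be instantiated with a CPU tensor type such as Tensor(Float64, CPU(Float64)); that concrete type and the `require` line are assumptions about typical usage rather than statements from this page.

```crystal
require "num"

# Assumed tensor type for T; swap in whatever tensor/storage type your
# network actually uses.
alias F64Tensor = Tensor(Float64, CPU(Float64))

# The defaults from the signature are spelled out explicitly for clarity.
optimizer = Num::NN::AdamOptimizer(F64Tensor).new(
  learning_rate: 0.001, # step size
  beta1: 0.9,           # decay rate for the 1st moment estimates
  beta2: 0.999,         # decay rate for the 2nd moment estimates
  epsilon: 1e-8         # small constant for numerical stability
)
```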
Instance Method Detail
def build_params(l : Array(Layer(T)))

Adds variables from a Num::NN::Network to the optimizer, to be tracked and updated after each forward pass through a network.

Arguments
- l : Array(Layer(T)) - Layers of the network whose trainable variables will be registered with the optimizer
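A call-shape sketch for #build_params. In practice the layer array comes from a Num::NN::Network; an empty array is used here only so the snippet stands alone, and the tensor type alias is the same assumption as above.

```crystal
require "num"

alias F64Tensor = Tensor(Float64, CPU(Float64)) # assumed instantiation of T

optimizer = Num::NN::AdamOptimizer(F64Tensor).new

# A real network would supply its layers here so that their trainable
# variables are tracked; the empty array only demonstrates the call shape.
layers = [] of Num::NN::Layer(F64Tensor)
optimizer.build_params(layers)

# The registered variables are afterwards available through optimizer.params
# and are the ones adjusted by #update.
```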
def update

Updates all Num::Grad::Variables registered to the optimizer based on weights present in the network and the parameters of the optimizer. Resets all gradients to 0.
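A hedged sketch of where #update sits in a training loop. The surrounding calls (net.forward, net.loss, loss.backprop) and the variable names are assumptions made for illustration; only optimizer.update itself is documented on this page.

```crystal
# Illustrative only: `net`, `x_train`, `y_train`, and the forward/loss/backprop
# calls are hypothetical stand-ins for the rest of a num.cr training setup.
100.times do
  y_pred = net.forward(x_train)     # forward pass through the network
  loss = net.loss(y_pred, y_train)  # loss as a Num::Grad::Variable
  loss.backprop                     # accumulate gradients in the graph
  optimizer.update                  # Adam step on every registered variable,
                                    # then all gradients are reset to 0
end
```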