class Num::Grad::Variable(T)
- Num::Grad::Variable(T)
- Reference
- Object
Overview
A variable is an abstraction of a Tensor that tracks the operations done to the Tensor. It also keeps track of the gradient of the operation if a Variable needs to backpropagate.
This is the fundamental object used in automatic differentiation, as well as the neural network aspects of Num.cr
Defined in:
grad/primitives/variable.cr
grad/variable.cr
nn/layers/elu.cr
nn/layers/leaky_relu.cr
nn/layers/relu.cr
Constructors
-
.new(context : Num::Grad::Context(T), value : T, requires_grad : Bool = false)
Initialization method for a Variable.
Instance Method Summary
-
#*(other : Num::Grad::Variable(T)) : Num::Grad::Variable(T)
Multiplies a variable by another variable and stores the derivative of the operation in the computational graph.
-
#**(other : Num::Grad::Variable(T)) : Num::Grad::Variable(T)
Raises a variable to the power of another variable and stores the derivative of the operation in the computational graph.
-
#+(other : Num::Grad::Variable(T)) : Num::Grad::Variable(T)
Adds a variable to another variable and stores the derivative of the operation in the computational graph.
-
#-(other : Num::Grad::Variable(T)) : Num::Grad::Variable(T)
Subtracts a variable from another variable and stores the derivative of the operation in the computational graph.
-
#-
Negates the variable
-
#/(other : Num::Grad::Variable(T)) : Num::Grad::Variable(T)
Divides a variable by another variable and stores the derivative of the operation in the computational graph.
-
#[](*args)
Slices a variable.
-
#acos : Num::Grad::Variable(T)
Computes the arccosine of a variable
-
#asin : Num::Grad::Variable(T)
Computes the arcsine of a variable
-
#atan : Num::Grad::Variable(T)
Computes the arctangent of a variable
-
#backprop(debug : Bool = false)
Backpropagates an operation along a computational graph.
-
#context : Num::Grad::Context(T)
The graph the variable is associated with.
-
#cos : Num::Grad::Variable(T)
Computes the cosine of a variable
-
#elu(alpha = 0.01)
Exponential Linear Unit activation function
-
#exp : Num::Grad::Variable(T)
Computes the exp of a variable
-
#grad : T
The gradient of the Variable.
-
#grad=(grad : T)
The gradient of the Variable.
-
#leaky_relu
Leaky Rectified Linear Unit activation function
-
#log : Num::Grad::Variable(T)
Computes the log of a variable
-
#matmul(b : Num::Grad::Variable(T)) : Num::Grad::Variable(T)
Matrix multiply operator for two variables.
-
#mean(axis : Int) : Num::Grad::Variable(T)
Reduces a Tensor along an axis, finding the average of each view into the Tensor
-
#relu
Rectified Linear Unit activation function
-
#requires_grad : Bool
If set to true, this variable will track its operations; otherwise it will act like a plain Tensor, only calculating forward operations
-
#requires_grad=(requires_grad : Bool)
If set to true, this variable will track its operations; otherwise it will act like a plain Tensor, only calculating forward operations
-
#sin : Num::Grad::Variable(T)
Computes the sine of a variable
-
#sum(axis : Int) : Num::Grad::Variable(T)
Reduces a Tensor along an axis, summing each view into the variable
-
#tan : Num::Grad::Variable(T)
Computes the tangent of a variable
-
#tanh : Num::Grad::Variable(T)
Computes the tanh of a variable
-
#value : T
The value of the Variable.
Constructor Detail
Initialization method for a Variable.
This method should only be called by a context, as it creates a Variable. Context provides a helper method that adds a Variable to the computational graph and handles ownership of the context and other related instance variables.
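As a minimal sketch, creating Variables through the Context helper (the `ctx.variable` calls used throughout the examples below) rather than calling `.new` directly might look like:

```crystal
ctx = Num::Grad::Context(Tensor(Float64)).new

# The context helper registers the variable in the
# computational graph and handles ownership for us
x = ctx.variable([1.0, 2.0])
x.value # => [1.0, 2.0]
```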
Instance Method Detail
Multiplies a variable by another variable and stores the derivative of the operation in the computational graph.
Arguments
- other :
Num::Grad::Variable
- right hand side of the operation
Examples
ctx = Num::Grad::Context(Tensor(Float64)).new
a = ctx.variable([2.0])
b = ctx.variable([3.0])
f = a * b # => [6.0]
f.backprop
Raises a variable to the power of another variable and stores the derivative of the operation in the computational graph.
Arguments
- other :
Num::Grad::Variable
- right hand side of the operation
Examples
ctx = Num::Grad::Context(Tensor(Float64)).new
a = ctx.variable([2.0])
b = ctx.variable([3.0])
f = a ** b # => [8.0]
f.backprop
Adds a variable to another variable and stores the derivative of the operation in the computational graph.
Arguments
- other :
Num::Grad::Variable
- right hand side of the operation
Examples
ctx = Num::Grad::Context(Tensor(Float64)).new
a = ctx.variable([2.0])
b = ctx.variable([3.0])
f = a + b # => [5.0]
f.backprop
Subtracts a variable from another variable and stores the derivative of the operation in the computational graph.
Arguments
- other :
Num::Grad::Variable
- right hand side of the operation
Examples
ctx = Num::Grad::Context(Tensor(Float64)).new
a = ctx.variable([2.0])
b = ctx.variable([3.0])
f = a - b # => [-1.0]
f.backprop
Negates the variable
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([1.0, 2.0])
-x # => [-1.0, -2.0]
Divides a variable by another variable and stores the derivative of the operation in the computational graph.
Arguments
- other :
Num::Grad::Variable
- right hand side of the operation
Examples
ctx = Num::Grad::Context(Tensor(Float64)).new
a = ctx.variable([2.0])
b = ctx.variable([3.0])
f = a / b # => [0.66667]
f.backprop
Slices a variable, and slices the gradient of the variable using the same arguments.
Arguments
- args - Slicing arguments, slicing behavior is the same as
it is for a standard
Tensor
Examples
ctx = Num::Grad::Context(Tensor(Float64)).new
a = ctx.variable([[2.0], [3.0]])
b = a[1]
b # => [3]
Computes the arccosine of a variable
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([1.0])
x.acos # => [0]
Computes the arcsine of a variable
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([1.0])
x.asin # => [1.5708]
Computes the arctangent of a variable
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([1.0])
x.atan # => [0.785398]
Backpropagates an operation along a computational graph. This operation will destroy the computational graph, populating the gradients for all variables that are predecessors of the Variable this is called on.
Even if this is called on the first node in a graph, it will destroy all descendants of this variable stored by the Context
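For example, a full forward and backward pass over a small graph. The gradient values in the comments assume standard reverse-mode differentiation for multiplication (df/da = b, df/db = a):

```crystal
ctx = Num::Grad::Context(Tensor(Float64)).new
a = ctx.variable([3.0])
b = ctx.variable([2.0])

f = a * b # => [6.0]
f.backprop

# df/da = b, df/db = a
a.grad # => [2.0]
b.grad # => [3.0]
```

Because the call consumes the graph, build a fresh graph before backpropagating again.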
The graph the variable is associated with. This is a reference, as a variable does not own its context
Computes the cosine of a variable
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([1.0])
x.cos # => [0.540302]
Computes the exp of a variable
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([1.0])
x.exp # => [2.71828]
The gradient of the Variable. This is set as a reference to the value of a Variable unless #backprop has been called, in which case all related Variables will have their gradient updated correctly
The gradient of the Variable. This is set as a reference to the value of a Variable unless #backprop has been called, in which case all related Variables will have their gradient updated correctly
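A sketch of reading #grad after #backprop, assuming the standard power-rule gradient d(x ** 2)/dx = 2x:

```crystal
ctx = Num::Grad::Context(Tensor(Float64)).new
x = ctx.variable([2.0])
two = ctx.variable([2.0])

y = x ** two
y.backprop

# d(x ** 2)/dx = 2 * x
x.grad # => [4.0]
```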
Computes the log of a variable
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([2.7182818285])
x.log # => [1.0]
Matrix multiply operator for two variables. Computes the dot product of two matrices and stores the result in the computational graph
Arguments
- other :
Num::Grad::Variable
- right hand side of the operation
Examples
ctx = Num::Grad::Context(Tensor(Float64)).new
a = ctx.variable([[2.0], [2.0]])
b = ctx.variable([[3.0, 3.0]])
f = a.matmul(b)
# [[6, 6],
# [6, 6]]
f.backprop
Reduces a Tensor along an axis, finding the average of each view into the Tensor
Arguments
- axis :
Int
- Axis of reduction
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([[1.0, 2.0], [3.0, 4.0]])
x.mean(0) # => [[2.0, 3.0]]
x.mean(1) # => [[1.5], [3.5]]
If set to true, this variable will track its operations; otherwise it will act like a plain Tensor, only calculating forward operations
If set to true, this variable will track its operations; otherwise it will act like a plain Tensor, only calculating forward operations
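As an illustrative sketch (this assumes the Context helper forwards a requires_grad argument to the constructor shown above; check your Num.cr version for the exact signature):

```crystal
ctx = Num::Grad::Context(Tensor(Float64)).new

# Tracks its operations in the computational graph
x = ctx.variable([1.0], requires_grad: true)

# Behaves like a plain Tensor: forward results only,
# no derivatives recorded
y = ctx.variable([1.0], requires_grad: false)
```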
Computes the sine of a variable
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([1.0])
x.sin # => [0.841471]
Reduces a Tensor along an axis, summing each view into the variable
Arguments
- axis :
Int
- Axis of summation
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([[1.0, 2.0], [3.0, 4.0]])
x.sum(0) # => [[4.0, 6.0]]
x.sum(1) # => [[3.0], [7.0]]
Computes the tangent of a variable
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([1.0])
x.tan # => [1.55741]
Computes the tanh of a variable
Examples
ctx = Num::Grad::Context(Tensor(Float64, CPU(Float64))).new
x = ctx.variable([1.0])
x.tanh # => [0.761594156]
The value of the Variable. This should not be edited outside of Variable operations, as other edits will not be tracked and will lead to incorrect results