class SHAInet::ReluLayer
Superclass hierarchy: SHAInet::ReluLayer < Reference < Object
Defined in:
shainet/cnn/relu_layer.cr

Constant Summary
- Log = ::Log.for(self)
Constructors
- .new(prev_layer : CNNLayer | ConvLayer, l_relu_slope : Float64 = 0.0)
Pass a non-zero l_relu_slope to initialize the layer as a leaky ReLU.
Instance Method Summary
- #activate
Goes over all neurons of the previous layer and applies the ReLU or leaky ReLU non-linearity.
- #error_prop(batch : Bool = false)
Sends the gradients from the current layer backwards without applying weights (this layer has none).
- #filters : Array(Filter)
- #inspect(what : String)
- #l_relu_slope : Float64
- #prev_layer : CNNLayer | ConvLayer
- #update_wb(learn_type : Symbol | String, batch : Bool = false)
Constructor Detail

def self.new(prev_layer : CNNLayer | ConvLayer, l_relu_slope : Float64 = 0.0)
Pass a non-zero l_relu_slope to initialize the layer as a leaky ReLU.
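The effect of l_relu_slope can be sketched in plain Ruby (an illustrative sketch of the leaky-ReLU math only, not SHAInet code; the function name relu here is hypothetical):

```ruby
# Sketch of the per-neuron non-linearity this layer applies.
# A slope of 0.0 gives plain ReLU; a small positive slope gives leaky ReLU.
def relu(x, l_relu_slope = 0.0)
  x >= 0 ? x : l_relu_slope * x
end

relu(2.5)        # positive inputs pass through unchanged
relu(-2.0)       # negative inputs are zeroed by plain ReLU
relu(-2.0, 0.01) # leaky ReLU keeps a small negative signal
```

With the default slope of 0.0 the constructor yields a standard ReLU layer; any positive slope turns it into a leaky ReLU.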
Instance Method Detail

def error_prop(batch : Bool = false)
Sends the gradients from the current layer backwards without applying weights (this layer has none).
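Because the layer has no weights, backpropagation only scales each incoming gradient by the ReLU derivative at the neuron's forward input. A hedged Ruby sketch of that math (illustrative only; the helper names are hypothetical, not SHAInet's):

```ruby
# Derivative of (leaky) ReLU: 1 where the forward input was non-negative,
# otherwise the leaky slope (0.0 for plain ReLU).
def relu_grad(input, l_relu_slope = 0.0)
  input >= 0 ? 1.0 : l_relu_slope
end

# One backward step for a single neuron: the gradient arriving from the
# next layer is passed backwards unweighted, scaled only by the derivative.
def error_prop_step(upstream_grad, input, l_relu_slope = 0.0)
  upstream_grad * relu_grad(input, l_relu_slope)
end
```

For a plain ReLU this passes gradients through unchanged where the input was positive and blocks them where it was negative; a leaky slope lets a scaled-down gradient through instead.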