module SHAInet
Included Modules
- Apatite
Defined in:
shainet.cr
shainet/basic/es.cr
shainet/basic/exceptions.cr
shainet/basic/exprimental.cr
shainet/basic/layer.cr
shainet/basic/network_run.cr
shainet/basic/network_setup.cr
shainet/basic/neuron.cr
shainet/basic/synapse.cr
shainet/cnn/cnn.cr
shainet/cnn/cnn_input_layer.cr
shainet/cnn/cnn_synapse.cr
shainet/cnn/conv_layer.cr
shainet/cnn/drop_out_layer.cr
shainet/cnn/fc_layer.cr
shainet/cnn/filter.cr
shainet/cnn/max_pool_layer.cr
shainet/cnn/relu_layer.cr
shainet/cnn/softmax_layer.cr
shainet/data/cnn_data.cr
shainet/data/data.cr
shainet/data/json_data.cr
shainet/data/test_data.cr
shainet/data/training_data.cr
shainet/math/functions.cr
shainet/math/random_normal.cr
Constant Summary
- NEURON_TYPES = ["memory", "eraser", "amplifier", "fader", "sensor"]
  Each type of neuron uses and propagates data differently
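A minimal usage sketch of the constant, assuming the shard is required as "shainet"; the type-check guard is purely illustrative and not part of the library:

require "shainet"

# NEURON_TYPES holds the accepted string identifiers for a neuron's type.
p SHAInet::NEURON_TYPES # => ["memory", "eraser", "amplifier", "fader", "sensor"]

# Illustrative guard before building a neuron from a user-supplied type string.
type = "memory"
raise "unknown neuron type: #{type}" unless SHAInet::NEURON_TYPES.includes?(type)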
Class Method Summary
- ._bp_sigmoid(value : GenNum) : Float64
- ._bp_sigmoid_prime(value : GenNum) : Float64
- ._cross_entropy_cost(expected : Float64, actual : Float64) : Float64
- ._cross_entropy_cost_derivative(expected : Float64, actual : Float64) : Float64
- ._l_relu(value : GenNum, slope : Float64 = 0.01) : Float64
- ._l_relu_prime(value : GenNum, slope : Float64 = 0.01) : Float64
- ._log_sigmoid(value : GenNum) : Float64
- ._log_sigmoid_prime(value : GenNum) : Float64
- ._quadratic_cost(expected : Float64, actual : Float64) : Float64
- ._quadratic_cost_derivative(expected : Float64, actual : Float64) : Float64
- ._relu(value : GenNum)
- ._relu_prime(value : GenNum) : Float64
- ._sigmoid(value : GenNum) : Float64
- ._sigmoid_prime(value : GenNum) : Float64
- ._tanh(value : GenNum) : Float64
- ._tanh_prime(value : GenNum) : Float64
- .bp_sigmoid : ActivationFunction
- .cross_entropy_cost : CostFunction
- .l_relu : ActivationFunction
- .log_sigmoid : ActivationFunction
- .log_softmax(array : Array(GenNum)) : Array(Float64)
  Not working yet, do not use
- .none : ActivationFunction
- .normalize_stcv(payloads : Array(String))
  Translates an array of strings into a one-hot vector matrix and a hash dictionary
- .quadratic_cost : CostFunction
- .relu : ActivationFunction
- .sigmoid : ActivationFunction
- .sign(input : GenNum)
  Used in Rprop
- .softmax(array : Array(GenNum)) : Array(Float64)
- .softmax_prime(array : Array(GenNum)) : Array(Float64)
  The input array in this case has to be the output array of the softmax function
- .tanh : ActivationFunction
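A minimal sketch of calling a few of the activation, cost, and softmax helpers listed above, assuming the shard is required as "shainet". The underscore-prefixed methods are plain numeric helpers; the variants without an underscore (.sigmoid, .quadratic_cost, etc.) return ActivationFunction / CostFunction objects intended to be passed to a network rather than called directly, so they are not shown here.

require "shainet"

# Scalar activation helpers.
puts SHAInet._sigmoid(0.0)      # => 0.5
puts SHAInet._relu_prime(-2.0)  # => 0.0
puts SHAInet._l_relu(-2.0, 0.1) # leaky ReLU with slope 0.1

# Cost helpers compare an expected value with the network's actual output.
puts SHAInet._quadratic_cost(1.0, 0.8)
puts SHAInet._cross_entropy_cost(1.0, 0.8)

# softmax turns a score vector into a probability distribution (sums to ~1.0).
probs = SHAInet.softmax([1.0, 2.0, 3.0])
puts probs.sum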
Class Method Detail
def self._cross_entropy_cost_derivative(expected : Float64, actual : Float64) : Float64
def self._quadratic_cost_derivative(expected : Float64, actual : Float64) : Float64
def self.log_softmax(array : Array(GenNum)) : Array(Float64)
  Not working yet, do not use
def self.normalize_stcv(payloads : Array(String))
  Translates an array of strings into a one-hot vector matrix and a hash dictionary
def self.softmax_prime(array : Array(GenNum)) : Array(Float64)
  The input array in this case has to be the output array of the softmax function
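A short sketch of the documented helpers above, again assuming require "shainet". The exact return shapes of sign and normalize_stcv are not spelled out in this reference, so the example only inspects them:

require "shainet"

# sign is the sign function used by Rprop; here it is called directly on a number.
p SHAInet.sign(-3.7)

# softmax_prime must receive the output of softmax, not the raw scores.
probs = SHAInet.softmax([0.5, 1.5, 2.5])
p SHAInet.softmax_prime(probs)

# normalize_stcv builds a one-hot representation (vector matrix plus lookup hash) from strings.
p SHAInet.normalize_stcv(["cat", "dog", "cat"])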