module SHAInet

Defined in:

shainet.cr
shainet/basic/es.cr
shainet/basic/exceptions.cr
shainet/basic/exprimental.cr
shainet/basic/layer.cr
shainet/basic/network_run.cr
shainet/basic/network_setup.cr
shainet/basic/neuron.cr
shainet/basic/synapse.cr
shainet/cnn/cnn.cr
shainet/cnn/cnn_input_layer.cr
shainet/cnn/cnn_synapse.cr
shainet/cnn/conv_layer.cr
shainet/cnn/drop_out_layer.cr
shainet/cnn/fc_layer.cr
shainet/cnn/filter.cr
shainet/cnn/max_pool_layer.cr
shainet/cnn/relu_layer.cr
shainet/cnn/softmax_layer.cr
shainet/data/cnn_data.cr
shainet/data/data.cr
shainet/data/json_data.cr
shainet/data/test_data.cr
shainet/data/training_data.cr
shainet/math/functions.cr
shainet/math/random_normal.cr

Constant Summary

NEURON_TYPES = ["memory", "eraser", "amplifier", "fader", "sensor"]

Each type of neuron uses and propagates data differently
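
A minimal usage sketch (illustrative only; the membership check below is an assumption about how the constant might be used, not the library's own validation):

  requested = "memory"
  SHAInet::NEURON_TYPES.includes?(requested) # => true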

Class Method Detail

def self._bp_sigmoid(value : GenNum) : Float64 #

def self._bp_sigmoid_prime(value : GenNum) : Float64 #

def self._cross_entropy_cost(expected : Float64, actual : Float64) : Float64 #

def self._cross_entropy_cost_derivative(expected : Float64, actual : Float64) : Float64 #

def self._l_relu(value : GenNum, slope : Float64 = 0.01) : Float64 #

def self._l_relu_prime(value : GenNum, slope : Float64 = 0.01) : Float64 #

def self._log_sigmoid(value : GenNum) : Float64 #

def self._log_sigmoid_prime(value : GenNum) : Float64 #

def self._quadratic_cost(expected : Float64, actual : Float64) : Float64 #

def self._quadratic_cost_derivative(expected : Float64, actual : Float64) : Float64 #

Derivatives of cost functions


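As an illustration, the textbook forms of these costs and their derivatives with respect to the actual output are sketched below. This is a reference sketch of the standard formulas only; SHAInet's exact scaling and sign conventions may differ, so consult the source of the methods above.

  # Standard textbook forms, not necessarily SHAInet's exact implementation.
  def quadratic_cost(expected : Float64, actual : Float64) : Float64
    0.5 * (actual - expected)**2
  end

  def quadratic_cost_derivative(expected : Float64, actual : Float64) : Float64
    actual - expected # d(cost)/d(actual)
  end

  def cross_entropy_cost(expected : Float64, actual : Float64) : Float64
    -(expected * Math.log(actual) + (1.0 - expected) * Math.log(1.0 - actual))
  end

  def cross_entropy_cost_derivative(expected : Float64, actual : Float64) : Float64
    (actual - expected) / (actual * (1.0 - actual))
  end
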
def self._relu(value : GenNum) #

def self._relu_prime(value : GenNum) : Float64 #

def self._sigmoid(value : GenNum) : Float64 #

def self._sigmoid_prime(value : GenNum) : Float64 #

def self._tanh(value : GenNum) : Float64 #

def self._tanh_prime(value : GenNum) : Float64 #

def self.bp_sigmoid : ActivationFunction #

def self.cross_entropy_cost : CostFunction #

def self.l_relu : ActivationFunction #

def self.log_sigmoid : ActivationFunction #

def self.log_softmax(array : Array(GenNum)) : Array(Float64) #

Not working yet; do not use


def self.none : ActivationFunction #

def self.normalize_stcv(payloads : Array(String)) #

Translates an array of strings into a one-hot vector matrix and a hash dictionary


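A minimal usage sketch (the exact shape of the return value is an assumption; check the source for the actual structure):

  payloads = ["cat", "dog", "bird"]
  result = SHAInet.normalize_stcv(payloads) # assumed: a one-hot vector matrix plus a lookup dictionary
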
def self.quadratic_cost : CostFunction #

def self.relu : ActivationFunction #

def self.sigmoid : ActivationFunction #

def self.sign(input : GenNum) #

Used in Rprop


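A minimal sketch of how Rprop uses this (the -1/0/+1 return convention is assumed):

  gradient = -0.37
  SHAInet.sign(gradient) # assumed => -1; Rprop steps by the gradient's sign, ignoring its magnitude
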
def self.softmax(array : Array(GenNum)) : Array(Float64) #

def self.softmax_prime(array : Array(GenNum)) : Array(Float64) #

The input array must be the output of the softmax function


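A minimal sketch of the required chaining (depending on how GenNum is aliased, the array literal may need to be declared as Array(SHAInet::GenNum)):

  probs = SHAInet.softmax([1.0, 2.0, 3.0]) # probabilities summing to 1.0
  deriv = SHAInet.softmax_prime(probs)     # pass the softmax output, not the raw logits
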
def self.tanh : ActivationFunction #
