struct Ai4cr::NeuralNetwork::Backpropagation
Hierarchy: Ai4cr::NeuralNetwork::Backpropagation < Struct < Value < Object
Overview
= Introduction
This is an implementation of a multilayer perceptron network, using the backpropagation algorithm for learning.
Backpropagation is a supervised learning technique, first described by Paul Werbos in 1974 and further developed by David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams in 1986.
= Features
- Support for any network architecture (number of layers and neurons)
- Configurable propagation function
- Optional usage of bias
- Configurable momentum
- Configurable learning rate
- Configurable initial weight function
- 100% Crystal code, no external dependencies
= Parameters
Use the class method get_parameters_info to obtain details on the algorithm parameters. Use set_parameters to set values for these parameters.
- :bias_disabled => If true, the algorithm will not use bias nodes. False by default.
- :initial_weight_function => f(n, i, j) must return the initial weight for the connection between node i in layer n and node j in layer n+1. By default, a random number in the [-1, 1) range.
- :propagation_function => By default the sigmoid: ->(x : Float64) { 1.0 / (1.0 + Math.exp(-x)) }
- :derivative_propagation_function => Derivative of the propagation function, based on the propagation function output. By default: ->(y : Float64) { y * (1.0 - y) }, where y = propagation_function(x)
- :learning_rate => By default 0.25
- :momentum => By default 0.1. Set this parameter to 0 to disable momentum
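Several of these map onto the constructor arguments listed under Constructors below. A minimal configuration sketch (the keyword names come from the constructor signature on this page; the values are only illustrative):

  require "ai4cr"

  net = Ai4cr::NeuralNetwork::Backpropagation.new(
    structure: [4, 3, 2],  # 4 inputs, 3 hidden neurons, 2 outputs
    bias_disabled: false,  # keep bias nodes (the default)
    learning_rate: 0.25,
    momentum: 0.0          # 0 disables momentum
  )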
= How to use it
Create the network with 4 inputs, 1 hidden layer with 3 neurons, and 2 outputs:

  net = Ai4cr::NeuralNetwork::Backpropagation.new([4, 3, 2])

Train the network (`example` and `result` are arrays of training inputs and expected outputs that you supply):

  1000.times do |i|
    net.train(example[i], result[i])
  end

Use it: evaluate data with the trained network:

  net.eval([12, 48, 12, 25]) # => [0.86, 0.01]
More about multilayer perceptron neural networks and backpropagation:
- http://en.wikipedia.org/wiki/Backpropagation
- http://en.wikipedia.org/wiki/Multilayer_perceptron
= About the project
Ported By:: Daniel Huffman
Url:: https://github.com/drhuffman12/ai4cr
Based on:: Ai4r
Author:: Sergio Fierens
License:: MPL 1.1
Url:: http://ai4r.org
Included Modules
- Ai4cr::Breed::Client
- JSON::Serializable
Defined in:
ai4cr/neural_network/backpropagation.cr

Constructors
- .new(pull : JSON::PullParser)
- .new(structure : Array(Int32), bias_disabled : Bool | Nil = nil, learning_rate : Float64 | Nil = nil, momentum : Float64 | Nil = nil, history_size : Int32 = 10)
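Because JSON::Serializable is included (see Included Modules above), a trained network can be persisted and restored through JSON. A sketch using the to_json/from_json methods that JSON::Serializable provides (treat the round-trip as an assumption rather than documented behavior):

  require "ai4cr"
  require "json"

  net  = Ai4cr::NeuralNetwork::Backpropagation.new([4, 3, 2])
  json = net.to_json                                           # serialize structure, weights, etc.
  copy = Ai4cr::NeuralNetwork::Backpropagation.from_json(json) # restored via .new(pull : JSON::PullParser)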
Instance Method Summary
- #activation_nodes : Array(Array(Float64))
- #activation_nodes=(activation_nodes : Array(Array(Float64)))
- #backpropagate
  Propagate error backwards
- #bias_disabled : Bool
- #bias_disabled=(bias_disabled : Bool)
- #calculate_error_distance
  Calculate the quadratic error for an expected output value: Error = 0.5 * sum( (expected_value[i] - output_value[i])**2 )
- #calculate_internal_deltas
  Calculate deltas for hidden layers
- #calculate_output_deltas
  Calculate deltas for the output layer
- #check_input_dimension(inputs)
- #check_output_dimension
- #deltas : Array(Array(Float64))
- #derivative_propagation_function
- #eval(input_values)
  Evaluates the input.
- #eval_result(input_values)
  Evaluates the input and returns the most active node.
- #expected_outputs : Array(Float64)
- #expected_outputs=(expected_outputs : Array(Float64))
- #feedforward(input_values)
  Propagate values forward
- #guesses_as_is
- #guesses_best
- #guesses_bottom_n(n = @activation_nodes.last.size)
- #guesses_ceiled
- #guesses_rounded
- #guesses_sorted
- #guesses_top_n(n = @activation_nodes.last.size)
- #height
- #hidden_qty
- #init_activation_nodes
  Initialize the neurons structure.
- #init_last_changes
  Momentum needs to know how much each weight changed in the previous training step.
- #init_network
  Initialize (or reset) activation nodes and weights, with the provided net structure and parameters.
- #init_weights
  Initialize the weight arrays using the function specified by the initial_weight_function parameter.
- #initial_weight_function
- #last_changes : Array(Array(Array(Float64)))
- #last_changes=(last_changes : Array(Array(Array(Float64))))
- #learning_rate : Float64
- #learning_rate=(learning_rate : Float64)
- #learning_styles
- #load_expected_outputs(expected_outputs)
- #momentum : Float64
- #momentum=(momentum : Float64)
- #propagation_function
- #structure : Array(Int32)
- #structure=(structure : Array(Int32))
- #train(inputs, outputs)
  This method trains the network using the backpropagation algorithm.
- #update_weights
  Update weights after @deltas have been calculated.
- #weights : Array(Array(Array(Float64)))
- #weights=(weights : Array(Array(Array(Float64))))
- #width
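Taken together, the phase methods above outline how a single training step decomposes. The following sketch calls them explicitly; the ordering is inferred from the method names and summaries on this page, not from the source (and #backpropagate presumably bundles the delta and weight-update steps):

  require "ai4cr"

  net = Ai4cr::NeuralNetwork::Backpropagation.new([2, 2, 1])
  inputs  = [1.0, 0.0]
  outputs = [1.0]

  net.feedforward(inputs)            # propagate values forward
  net.load_expected_outputs(outputs)
  net.calculate_output_deltas        # deltas for the output layer
  net.calculate_internal_deltas      # deltas for hidden layers
  net.update_weights                 # apply deltas using learning_rate and momentum
  puts net.calculate_error_distance  # the quadratic error #train returns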
Instance methods inherited from module Ai4cr::Breed::Client
- birth_id : Int32
- birth_id=(birth_id : Int32)
- breed_delta : Float64
- breed_delta=(breed_delta : Float64)
- clone
- error_hist_stats(in_bw = false)
- error_stats
- error_stats=(error_stats)
- history_size
- name : String
- name=(name : String)
- parent_a_id : Int32
- parent_a_id=(parent_a_id : Int32)
- parent_b_id : Int32
- parent_b_id=(parent_b_id : Int32)
Instance Method Detail
#calculate_error_distance
Calculate the quadratic error for an expected output value: Error = 0.5 * sum( (expected_value[i] - output_value[i])**2 )
#eval(input_values)
Evaluates the input. E.g.:

  net = Backpropagation.new([4, 3, 2])
  net.eval([25, 32.3, 12.8, 1.5]) # => [0.83, 0.03]
#eval_result(input_values)
Evaluates the input and returns the most active node. E.g.:

  net = Backpropagation.new([4, 3, 2])
  net.eval_result([25, 32.3, 12.8, 1.5]) # eval gives [0.83, 0.03], so the most active node is 0
  # => 0
#init_last_changes
Momentum needs to know how much each weight changed in the previous training step. This method initializes the @last_changes structure with 0 values.
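For reference, the textbook momentum update shows why the previous change must be remembered. The names below mirror the accessors on this page, but the exact update code in this struct is an assumption:

  change = deltas[n][j] * activation_nodes[n][i]
  weights[n][i][j] = weights[n][i][j] + learning_rate * change + momentum * last_changes[n][i][j]
  last_changes[n][i][j] = change  # remembered for the next training step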
#init_network
Initialize (or reset) activation nodes and weights, with the provided net structure and parameters.
#init_weights
Initialize the weight arrays using the function specified by the initial_weight_function parameter.
#train(inputs, outputs)
This method trains the network using the backpropagation algorithm.
inputs: Network input.
outputs: Expected output for the given input.
Returns the network error: 0.5 * sum( (expected_value[i] - output_value[i])**2 )
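Putting #train and #eval together, a minimal end-to-end sketch on the XOR problem (the data, network shape, and iteration count are illustrative choices, not taken from the library docs):

  require "ai4cr"

  inputs  = [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
  outputs = [[0.0], [1.0], [1.0], [0.0]]

  net = Ai4cr::NeuralNetwork::Backpropagation.new([2, 4, 1])

  2000.times do
    inputs.each_with_index do |input, i|
      net.train(input, outputs[i]) # returns 0.5 * sum((expected - actual)**2)
    end
  end

  inputs.each { |input| puts "#{input} => #{net.eval(input)}" }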