YAGA - Yet Another Genetic Algorithm

YAGA is a multilayer genetic algorithm engine that supports different layer types.

Index

Installation

  1. Add the dependency to your shard.yml:

    dependencies:
      yaga:
        github: fruktorum/yaga
  2. Run shards install

Usage

The basic usage below is taken from Lesson 1; the full algorithm is located in the Horizontal-Vertical recognition folder.

1. Require the engine

require "yaga"

Chromosomes are not loaded automatically - only the core engine is.
You can develop your own chromosomes for your project by inheriting from YAGA::Chromosome, or use the presets in chromosomes:

require "yaga/chromosomes/binary_neuron"

Please read the chromosome documentation.
Chromosomes may have external dependencies that should be added to your shard.yml (for example, MatrixMultiplicator requires the SimpleMatrix shard).

2. Compile genome model

The genome is built at compile time and is based on StaticArrays to achieve the highest possible performance.

YAGA::Genome.compile(
  # Generated genome class           Inputs type (array)       Inputs size
  BinaryGenome                     , BitArray                , 9          ,

  # Activator                        Activations type (array)  Outputs size
  { YAGA::Chromosomes::BinaryNeuron, BitArray                , 4            },
  { YAGA::Chromosomes::BinaryNeuron, BitArray                , 2            }
)
  1. BinaryGenome is the class name of the genome being built. It can be any class name that Crystal supports.
  2. BitArray - the type of input the model works with. It should be an array-like type (StaticArray/Array/Set/etc. with #[], #[]= and << methods).
  3. 9 - the number of input elements passed in.
  4. { Chromosome, output_type, output_amount } - genome layers:
    1. YAGA::Chromosomes::BinaryNeuron - the chromosome class. Internally its inputs type should match the outputs type of the previous layer; for the first layer it should match the genome inputs type.
    2. BitArray - the layer's outputs type. Like the inputs, it should be an array-like type.
    3. 4 - the number of outputs (note that each layer's inputs are taken from the outputs of the layer before).

As in a neural network, each layer (i.e. each Chromosome) can manage its own data types. Please see the examples for more complicated use cases.

3. Prepare the data

inputs = Array( BitArray ).new( 16 ){ BitArray.new 9 }
outputs = Array( BitArray ).new( 16 ){ BitArray.new 2 }

Please note that each element of inputs has the same size as the model inputs (9), and each element of outputs has the same size as the model outputs (2).

Fill the inputs and outputs with your data (for example, the horizontal and vertical line patterns from the recognition example).
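For illustration, here is one possible way to fill such data. This is only a sketch with an assumed encoding (a 3x3 grid flattened row-major into 9 bits, with the 2 output bits meaning [horizontal, vertical]); it builds only the six full-line samples, while the actual Horizontal-Vertical recognition example defines all 16:

```crystal
require "bit_array"

inputs = Array( BitArray ).new
outputs = Array( BitArray ).new

# Three grids, each with one full horizontal line
3.times do |row|
  grid = BitArray.new 9
  3.times{ |col| grid[ row * 3 + col ] = true }
  label = BitArray.new 2
  label[ 0 ] = true # horizontal
  inputs << grid
  outputs << label
end

# Three grids, each with one full vertical line
3.times do |col|
  grid = BitArray.new 9
  3.times{ |row| grid[ row * 3 + col ] = true }
  label = BitArray.new 2
  label[ 1 ] = true # vertical
  inputs << grid
  outputs << label
end
```

Any other filling strategy works as long as the element sizes match the model's inputs and outputs sizes.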

4. Create population based on compiled genome

random = Random.new

# Arguments are:
# 1. Total Population
# 2. Selection
# 3. Mutation Chance
# 4. Should crossover be enabled
# 5. Custom random for deterministic behaviour
population = YAGA::Population( BinaryGenome, UInt32 ).new 64, 8, 10, true, random

The population can also be initialized with named arguments:

random = Random.new
population = YAGA::Population( BinaryGenome, UInt32 ).new total_bots: 64,
                                                          selection_bots: 8,
                                                          mutation_percent: 10,
                                                          crossover_enabled: true,
                                                          random: random

Generic parameters

Initialization parameters (with defaults)

5. Train bots

The samples in this section are based on Example 1 - Horizontal-Vertical and its BitArray inputs vector.

Version 1: #train_each

# It helps to see the progress
require "progress"

goal = 16_u32 # Please make sure the type matches the population fitness type
generation_cap = 30000_u64

bar = ProgressBar.new( ( generation_cap * population.total_bots ).to_i )

simulations_passed = population.train_each( goal, generation_cap ){|bot, generation|
  fitness = 0_u32 # How good the bot is

  inputs.each_with_index{|input, index|
    activation = bot.activate input # Last genome layer calculation result
    fitness += 1 if activation == outputs[ index ] # Calculate fitness
  }

  bar.inc

  fitness
}

p simulations_passed: simulations_passed # Amount of simulations

Version 2: #train_world

# It helps to see the progress
require "progress"

def run_simulation( bots : Array( YAGA::Bot( BinaryGenome, UInt32 ) ), inputs : Array( BitArray ), outputs : Array( BitArray ) ) : Void
  bots.each{|bot|
    fitness = 0_u32 # How good the bot is

    inputs.each_with_index{|input, index|
      activation = bot.activate input # Last genome layer calculation result
      fitness += 1 if activation == outputs[ index ] # Calculate fitness
    }

    bot.fitness = fitness
  }
end

goal = 16_u32 # Please make sure the type matches the population fitness type
generation_cap = 30000_u64

bar = ProgressBar.new generation_cap.to_i

simulations_passed = population.train_world( goal, generation_cap ){|bots, generation|
  run_simulation bots, inputs, outputs
  bar.inc
}

p simulations_passed: simulations_passed # Amount of simulations

Notes

6. Take the leader

bot = population.selection.first
dna_genes = bot.genome.dna.map{ |chromosomes| chromosomes.map{ |chromosome| chromosome.genes } }

p genome: dna_genes,
  generation:  bot.generation, # Bot generation - the simulation on which the bot's genome state appeared
  max_fitness: bot.fitness,    # Bot's personal training fitness result (on the last simulation)
  brain_size:  bot.brain_size  # Total number of genes

7. Save/Load the state

# To save a genome, just ask the bot for its JSON
best = population.bots.max_by &.fitness
genome = best.to_json
p genome

# To restore a genome, just build a bot from the JSON
bot = YAGA::Bot( BinaryGenome, UInt32 ).from_json genome

# To restore the same genome for the whole population, apply the loaded bot's genome to every bot
population.bots.each &.replace( bot )

There is no option to load a genome into the population itself. Please see Example 3 - Snake Game for that.
In short, from_json works completely correctly and stably only for the YAGA::Bot class itself; it is not responsible for programmer-defined classes (like the Game::Snake class in the described example).

to_json has fewer problems with that - as long as it is not redefined in custom classes - and can be used on subclasses of YAGA::Bot directly.

8. Exploitation

# Get a bot result:
p bot.activate( inputs.sample )

Version 1: #simulate_each

input = inputs.sample

# Launch the population simulation per bot:
population.simulate_each{|bot|
  p bot.activate( input )
}

Version 2: #simulate_world

input = inputs.sample

# Launch the population simulation for all bots:
population.simulate_world{|bots|
  p bots.map( &.activate( input ) )
}

9. Population Callbacks

Each callback can be defined at any time and is assigned to the population object.

Callbacks will be launched in the same order as mentioned.

All callbacks yield the population generation as the first argument.
Only one callback of each type can be assigned at a time (it is not possible to define two before_simulation callbacks, etc.).

before_training and after_training additionally yield the training goal and the generation cap:

Example:

bar = ProgressBar.new 0

# Define before_training callback and reset progress bar size
population.before_training{|generation, goal, training_generations|
  # Show some statistics before training
  p previous_generation: generation, goal_to_train: goal, training_generations: training_generations

  bar.total = training_generations.to_i32
  bar.set 0
}

# Show evolution statistics (warning: this is called on every simulation during the evolution process)
population.after_evolution{|generation|
  # It is possible to see some statistics
  # p new_generation: generation, max_fitness: population.selection.max_by( &.fitness )
  # but using progress bar can be more viable
  bar.inc
}

population.train_world( goal: 1.2, generation_cap: 10000 ){|bots, generation|
  # ...training logic...
}

10. Genetic functions

If you'd like to use your own genetic functions, you can override the defaults in an inherited class:

class CustomPopulation < YAGA::Population( MyDNA, Float64 )
  def crossover : Void
    # Write your crossover function here
  end

  def mutate : Void
    # Write your mutation algorithm here
  end

  def finalize_evolution : Void
    # By default, this replaces the last 5 bots to prevent stagnation
    # You can leave it empty if it is not needed in your case
  end
end

Development

All PRs are welcome!

Contributing

  1. Fork it (https://github.com/fruktorum/yaga/fork)
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create a new Pull Request

Contributors

Thanks

Special thanks to the resources without which nothing like this would have been implemented.