Lecture 4:
Beyond Classical Search
Genetic Algorithm (GA)
C.-C. Hung
Kennesaw State University
(Slides used in the classroom only)
Lecture overview
Chapter 4: Beyond Classical Search
Hill Climbing
Simulated Annealing
Local Beam Search
Genetic Algorithms
http://rogeralsing.com/2008/12/07/genetic-programming-evolution-of-mona-lisa/
Search: An Abstract Example
Distribution of Individuals in Generation 0
Distribution of Individuals in Generation N
Genetic Algorithms (GA)
Acknowledgement: Some slides in this chapter are taken and modified from those used by Richard Frankel at Stanford University
Classes of Search Techniques
Search Techniques
  Calculus-Based Techniques: Fibonacci, Sort
  Guided Random Search Techniques: Tabu Search, Hill Climbing, Simulated Annealing, Evolutionary Algorithms (Genetic Programming, Genetic Algorithms)
  Enumerative Techniques: BFS, DFS, Dynamic Programming
Motivation
Searching some search spaces with traditional search methods would be intractable. This is often true when states/candidate solutions have a large number of successors.
Example: Designing the surface of an aircraft.
Image source: http://www.centaursoft.com/cms/index.php?section=25
Applicable situations
Often used for optimization problems (scheduling, design, etc.), though GAs can be used for many other things as well, as we'll see a bit later.
Good problems for GA: Scheduling air traffic
Bad problems for GA: Finding large primes and 2D pathfinding (why?)
7
Finding large primes is a bad problem for GA because the fitness landscape is not continuous – hard to differentiate between something 1 bit away from a prime, and something that is the complement of a prime.
2D pathfinding is a bad fit because the search space is small and has good heuristics, so you are better off with A*.
7
Applicable situations
Genetic algorithms work best when the “fitness landscape” is continuous (in some dimensions). This is also true of standard search, e.g. A*.
Intuitively, this just means that we can find a heuristic that gives a rough idea of how close a candidate is to being a solution.
Image source: scholarpedia.org
8
So what is a genetic algorithm?
Genetic algorithms are a randomized heuristic search strategy.
Genetic algorithms are search algorithms based on the mechanics of natural selection and natural genetics (inspired by Charles Darwin's theory of natural evolution).
9
A genetic algorithm maintains a population of candidate solutions for the problem at hand, and makes it evolve by iteratively applying a set of stochastic operators …
Formal Definition
So what is a genetic algorithm?
Basic idea: Simulate natural selection, where the population is composed of candidate solutions.
Focus is on evolving a population from which strong and diverse candidates can emerge via mutation and crossover (mating).
11
Basic algorithm
Create an initial population, either random or other methods.
While the best candidate so far is not a solution:
Create new population using successor functions.
Evaluate the fitness of each candidate in the population.
Return the best candidate found.
12
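A minimal Python sketch of this loop. The helper names (initial_population, successors, fitness, is_solution) are placeholders for the components described on the following slides; they are not defined on the slides themselves.

```python
# Minimal sketch of the basic GA loop; the four helpers are placeholders
# for the components introduced on the following slides.
def genetic_algorithm(initial_population, successors, fitness, is_solution,
                      generation_limit=1000):
    population = initial_population()
    best = max(population, key=fitness)
    for _ in range(generation_limit):
        if is_solution(best):
            break
        # create the next generation (crossover + mutation, weighted by fitness)
        population = successors(population, fitness)
        champion = max(population, key=fitness)
        if fitness(champion) > fitness(best):
            best = champion
    return best
```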
Components of a GA
A problem definition as input,
Encoding principles/Representation (gene, chromosome)
Initial population (creation)
Genetic operators (crossover, mutation)
Selection of parents (reproduction)
Evaluation function (environment)
Termination condition (to get final solutions)
GA Terminology
Terminology
The strings of artificial genetic systems are analogous to chromosomes in biological systems. In natural systems, one or more chromosomes are combined to form the total genetic prescription for the construction and operation of some organism.
Strings are composed of features or detectors, which take on different values. Features may be located at different positions on the string in artificial genetic search.
14
GA Terminology
Terminology
genotype: in natural systems, the genetic material of an individual is known as the genotype.
phenotype: the manifestation of the genotype as an organism is known as the phenotype in natural systems.
structure: the total package of strings in artificial genetic systems. The structures are decoded to form a particular parameter set, solution alternative, or point (in the solution space).
15
GA Terminology
Terminology
alleles: chromosomes are composed of genes, which may take on some number of values; those values are the alleles.
locus: the position of a gene on the string. It is identified separately from the gene's function in genetics.
Example for a particular gene:
an animal's eye-color gene (its function), its locus (say, position 10), and its allele value (say, blue eyes).
16
Natural vs. GA
Comparison of Natural and GA Terminology
Natural | Genetic Algorithms
chromosome | string (candidate solution)
gene | feature, character, or detector
allele | feature value
locus | string position
genotype | structure (the coded string)
phenotype | solution (the decoded structure)
epistasis | nonlinearity
Suitability of a Representation to Genetic Algorithms.
– Different representations incorporate varying degrees of nonlinearity among the representation elements.
17
Basic components
Candidate Representation
Important to choose this well. More work here means less work on the successor functions.
Initial Population
Successor function(s)
Mutation and crossover
Reproduction (selection of parents)
Fitness/Evaluation function
Maintain the same number of members (constant population size)
Some parameters
Population size
Generation limit
Probabilities: Pc (crossover) and Pm (mutation)
18
1. Candidate Representation
Candidate Representation
We want to encode candidates in a way that makes mutation and crossover easy.
The typical candidate representation is a binary string. This string can be thought of as the genetic code of a candidate – thus the term “genetic algorithm”!
Other representations are possible, but they make crossover and mutation harder.
20
Rule base system
Given a rule (if color = red and size = small and shape = round then object = apple).
Assume that each feature has finite set of values (e.g., size = small, large)
Represent the value as a substring of length equal to the number of possible values. For example, small = 10, large = 01.
The entire rule would be 100 10 01 0100 – a rule set is represented by concatenating the encoded values together.
Candidate Representation: An example
Let’s say we want to represent a rule for classifying bikes as mountain bikes or hybrid, based on these attributes*:
Make (Bridgestone, Cannondale, Nishiki, or Gary Fisher)
Tire type (Knobby, Treads)
Handlebar type (Straight, Curved)
Water bottle holder (Boolean)
We can encode a rule as a binary string, where each bit represents whether a value is accepted (on or off).
*Bikes scheme used with permission from Mark Maloof.
Make: B (Bridgestone), C (Cannondale), N (Nishiki), G (Gary Fisher) | Tires: K (Knobby), T (Treads) | Handlebars: S (Straight), C (Curved) | Water bottle: Y, N
22
Candidate Representation: An example
The candidate will be a bit string of length 10, because we have 10 possible attribute values.
Let’s say we want a rule that will match any bike that is made by Bridgestone or Cannondale, has treaded tires, and has straight handlebars. This rule could be represented as 1100011011:
Make:         B=1  C=1  N=0  G=0
Tires:        K=0  T=1
Handlebars:   S=1  C=0
Water bottle: Y=1  N=1
23
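As a sketch of how such a rule string could be used: the attribute order and the 10-bit layout follow the slide, but the dictionary-based bike representation below is an illustrative assumption.

```python
# Check a bike against the rule 1100011011: for each attribute, the bit that
# corresponds to the bike's value must be 1 for the rule to match.
ATTRIBUTE_VALUES = [
    ("make", ["Bridgestone", "Cannondale", "Nishiki", "Gary Fisher"]),
    ("tires", ["Knobby", "Treads"]),
    ("handlebars", ["Straight", "Curved"]),
    ("water_bottle", ["Yes", "No"]),
]

def matches(rule_bits, bike):
    i = 0
    for attribute, values in ATTRIBUTE_VALUES:
        segment = rule_bits[i:i + len(values)]
        if segment[values.index(bike[attribute])] != "1":
            return False
        i += len(values)
    return True

rule = "1100011011"
bike = {"make": "Cannondale", "tires": "Treads",
        "handlebars": "Straight", "water_bottle": "No"}
print(matches(rule, bike))  # True: Bridgestone/Cannondale, treaded tires, straight bars
```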
2. Initial Population
Initial Population
Population Size
It depends on the problem being solved.
Some parameters need to be set:
Generation limit
Probabilities: Pc and Pm
25
3. Genetic Operators (Successor Functions)
Successor functions
Mutation – Given a candidate, return a slightly different candidate.
Crossover – Given two candidates, produce one that has elements of each.
We don’t always generate a successor for each candidate. Rather, we generate a successor population based on the candidates in the current population, weighted by fitness.
27
Successor functions
If your candidate representation is just a binary string, then these are easy:
Mutate(c): Copy c as c’. For each bit b in c’, flip b with probability pm. Return c’.
Cross (c1, c2): Create a candidate c such that c[i] = c1[i] if i % 2 = 0, c[i] = c2[i] otherwise. Return c.
Alternatively, any other scheme such that c gets roughly equal information from c1 and c2. Crossover is applied with probability Pc.
28
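A sketch of these two successor functions, assuming candidates are Python strings of '0'/'1' characters; the probabilities in the demo calls are illustrative.

```python
import random

def mutate(c, pm=0.01):
    """Copy c and flip each bit independently with probability pm."""
    return "".join(bit if random.random() >= pm else ("1" if bit == "0" else "0")
                   for bit in c)

def cross(c1, c2):
    """Take even-indexed bits from c1 and odd-indexed bits from c2."""
    return "".join(c1[i] if i % 2 == 0 else c2[i] for i in range(len(c1)))

print(cross("111111", "000000"))  # 101010
print(mutate("101010", pm=0.5))   # e.g. 001110 (random)
```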
How offspring are produced – Reproduction – Changing the states
Reproduction – Through reproduction, genetic algorithms produce new generations of improved solutions by selecting parents with higher fitness ratings (or by giving such parents a greater probability of being contributors) and by using random selection.
How offspring are produced
Crossover means choosing a random position in the string (say, after 2 digits) and exchanging the segments either to the right or to the left of this point with another string partitioned similarly, to produce two new offspring.
Crossover Example
Parent A 011011
Parent B 101100
Mate the parents by splitting each string between the second and third digits (the position is randomly selected):
01*1011 10*1100
Crossover Example
Now combine the first digits of A with the last digits of B, and the first digits of B with the last digits of A
This gives you two new offspring
011100
101011
Crossover Operators
Single-point crossover:
Parent A: 1 0 0 1 0| 1 1 1 0 1
Parent B: 0 1 0 1 1 |1 0 1 1 0
Child AB: 1 0 0 1 0 1 0 1 1 0
Child BA: 0 1 0 1 1 1 1 1 0 1
Two-point crossover:
Parent A: 1 0 0 1 0 1 1 1 0 1
Parent B: 0 1 0 1 1 1 0 1 1 0
Child AB: 1 0 0 1 1 1 0 1 1 0
Child BA: 0 1 0 1 0 1 1 1 0 1
Inversion
Another possible operator to mix things up
Takes the material from only one parent
Randomly chooses two points
Takes the segment in-between the points
Reverses it in the offspring
Example:
How offspring are produced cont.
Mutation – Mutation is an arbitrary change in a single gene (bit). Sometimes it is used to prevent the algorithm from getting stuck. The procedure changes a 1 to a 0 or a 0 to a 1 instead of simply duplicating the bit. This change occurs with a very low probability (say 1 in 1000).
Parent 1: 1 0 1 0 1 1 1
Parent 2: 1 1 0 0 0 1 1
Child 1:  1 0 1 0 0 1 1
Child 2:  1 1 0 0 1 1 0  (last bit changed by mutation)
Genetic Algorithm Operators
Mutation and Crossover
Crossover – Value Encoding
Crossover
All crossovers from binary encoding can be used
Mutation
Adding a small number (for real-value encoding) – a small number is added to (or subtracted from) selected values
(1.29 5.68 2.86 4.11 5.55) => (1.29 5.68 2.73 4.22 5.55)
4. Reproduction/Selection of Parents
(i.e. next generation)
Fitness Function
The GA learns by producing offspring that are better and better as measured by a fitness function, which is a measure of the objective to be obtained (maximum or minimum).
The fitness function is defined based on the application domain and is sometimes difficult to determine.
Fitness function
The fitness function is analogous to a heuristic that estimates how close a candidate is to being a solution.
In general, the fitness function should be consistent for better performance. However, even if it is, there are no guarantees. This is a probabilistic algorithm!
In our classification rule example, one possible fitness function would be information gain (???) over training data.
40
3 Reproduction Operators:
1. Biased roulette wheel selection:
2. Tournament selection:
3. Elite selection:
Selection Criteria for Reproduction
41
Reproduction operator:
Biased roulette wheel selection: each current string in the population has a roulette wheel slot sized in proportion of its fitness.
Each individual i is selected with probability Fitness(i) / Σ_j Fitness(j), where Fitness(i) is the fitness function value for individual i.
Selection Criteria
42
Biased roulette wheel selection
Example wheel slots: (1) 14.4%   (2) 49.2%   (3) 5.5%   (4) 30.9%
43
Scores to Probabilities
(Biased roulette wheel selection)
Suppose the scores of the n individuals are:
a[1], a[2], …, a[n].
The probability of choosing the jth individual is
prob_j = a[j] / (a[1] + a[2] + … + a[n]).
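A small sketch of this normalization; the scores below are the x² fitness values used in the worked example later (169, 576, 64, 361) and reproduce the wheel percentages shown above.

```python
def selection_probabilities(scores):
    total = sum(scores)
    return [s / total for s in scores]

scores = [169, 576, 64, 361]
for j, p in enumerate(selection_probabilities(scores), start=1):
    print(f"individual {j}: {p:.1%}")
# individual 1: 14.4%, individual 2: 49.2%, individual 3: 5.5%, individual 4: 30.9%
```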
Reproduction operator:
Tournament selection: Genotypes (strings) are randomly paired. The individual with the higher fitness “wins” and survives to the next generation. The lower fitness individual is discarded.
Elite selection: Genotypes (strings) are rank-ordered according to their fitness. The bottom half of the ranking is discarded and the remaining strings are duplicated (doubled), leaving the best for the next generation.
Selection Criteria
45
Solution test
Given a candidate, return whether the candidate is a solution.
Often just answers the question “does the candidate satisfy some set of constraints?”
Optional! Sometimes you just want to do the best you can in a given number of generations, e.g. the classification rule example.
46
New population generation
How do we come up with a new population?
Given a population P, generate P’ by performing crossover |P| times, each time selecting candidates with probability proportional to their fitness.
Get P’’ by mutating each candidate in P’.
Return P’’.
47
New population generation
That was just one approach – be creative!
That approach doesn’t explicitly allow candidates to survive more than one generation – this might not be optimal.
Crossover is not necessary, though it can be helpful in escaping local maxima. Mutation is necessary (why?).
48
Mutation is necessary because some important gene might be missing from all of the initial population.
48
Recap: Flow Diagram of the Genetic Algorithm Process
Step 1: Describe the problem.
Step 2: Generate initial solutions.
Step 3: Test: is the solution good enough? (Yes → stop; No → continue)
Step 4: Select parents to reproduce.
Step 5: Apply the crossover process to create a set of offspring, then apply random mutation; return to Step 3.
A Toy Example
A simple example: simplified version
Maximize the function f(x) = x² on the integer interval [0, 31].
Code the decision variables of our problem as a finite-length string.
For this problem, we will code the variable x simply as a binary unsigned integer of length 5.
51
Genetic Algorithms …
Step 1: Define the strings (solutions) to be used.
The solution is guaranteed to lie in the range 0 to 31, so encoding any solution needs a 5-bit string.
52
Genetic Algorithms …
Step 2: Start creating an initial population of possible strings randomly.
01101
11000
01000
10011
53
Genetic Algorithms …
Step 3: Evaluate each string by applying a fitness function.
A possible fitness function is x² (integer). Can we use 1/x²?
01101   fitness value 169
11000   "             576
01000   "             64
10011   "             361
54
Genetic Algorithms …
Step 4: Search for new strings by exchanging genetic materials (i.e. crossover and mutation).
Assume that Tournament selection (strings are randomly paired) is used.
011|01 crossover (1 & 2) 01100
110|00 11001
01|000 “ (3 & 4) 01011
10|011 10000
55
Genetic Algorithms …
Step 4: Search for new strings by exchanging genetic materials (i.e. crossover and mutation).
01100 Mutation 01110
11001 “ 11101
01011 11011
10000 10001
56
Genetic Algorithms …
Step 5: Select the more promising strings (reproduction).
Assume that Elite selection (strings with good fitness survive) is used.
01110 fitness value 196
11101 “ 841
11011 “ 729
10001 “ 289
57
Step 5: Select the more promising strings (reproduction).
Select the two best strings and randomly generate another two strings to maintain the same population size.
11101 (Best fitness 841)
11011 (second best fitness 729)
01000 new string
00100 new string
Genetic Algorithms …
58
Step 6: Check for terminating conditions. If no termination condition is met, return to step 3.
If the iteration (number of generations) is terminated, the string 11101 would be the solution.
Otherwise, the algorithm continues.
Genetic Algorithms …
59
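A compact Python sketch of Steps 1–6 for this toy problem. The pairing, crossover sites, and mutation rate are random here rather than the specific choices shown on the slides; keeping the two best children plus two random strings stands in for the elite-selection step.

```python
import random

BITS = 5  # encodes x in [0, 31]

def fitness(s):
    return int(s, 2) ** 2          # f(x) = x^2

def random_string():
    return "".join(random.choice("01") for _ in range(BITS))

def crossover(a, b):
    point = random.randint(1, BITS - 1)
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(s, pm=0.05):
    return "".join(b if random.random() >= pm else ("1" if b == "0" else "0") for b in s)

def step(population):
    random.shuffle(population)                       # pair strings at random
    children = []
    for a, b in zip(population[::2], population[1::2]):
        c1, c2 = crossover(a, b)
        children += [mutate(c1), mutate(c2)]
    children.sort(key=fitness, reverse=True)         # elite selection
    return children[:2] + [random_string(), random_string()]

population = ["01101", "11000", "01000", "10011"]    # Step 2: initial population
for _ in range(20):                                  # Step 6: generation limit
    population = step(population)
print(max(population, key=fitness))  # best string found, tends toward 11111 (x = 31)
```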
Examples – GA in the wild
Rule set classifier generation
“I tried this as an undergrad, and it worked pretty well. Rather than classifying bikes, I was classifying Congressional representatives by party, based on their voting records.”
General approach:
Use GA to generate a rule for training data, with information gain for the fitness function and no solution test (just a generation limit).
Remove positive examples covered by the rule.
Repeat the above two steps until all positive training examples are covered.
To classify an example: iff the example is matched by any of the rules generated, consider it a positive example.
60
Examples – GA in the wild
ST5 Antenna – NASA Evolvable Systems Group
http://ti.arc.nasa.gov/projects/esg/research/antenna.htm
61
ST5 Antenna
Needs less power than standard antennae.
Doesn't require a matching network nor a phasing circuit – two fewer steps in design and fabrication.
More uniform coverage than standard antennae, yielding more reliable performance.
Short design cycle – 3 person-months to prototype fabrication, vs. 5 person-months on average for conventionally designed antennae.
62
First, there is the potential of needing less power. Antenna ST5-3-10 achieves high gain (2-4dB) across a wider range of elevation angles. This allows a broader range of angles over which maximum data throughput can be achieved. Also, less power from the solar array and batteries may be required.
Second, the evolved antenna does not require a matching network nor a phasing circuit, removing two steps in design and fabrication of the antenna. A trivial transmission line may be used for the match on the flight antenna, but simulation results suggest that one is not required.
Third, the evolved antenna has more uniform coverage in that it has a uniform pattern with small ripples in the elevations of greatest interest (between 40 and 80 degrees). This allows for reliable performance as elevation angle relative to the ground changes.
Finally, the evolved antenna had a shorter design cycle. It was estimated that antenna ST5-3-10 took 3 person-months to design and fabricate the first prototype as compared to 5 person-months for the conventionally designed antenna.
62
Examples – GA in the wild
Image compression – evolving the Mona Lisa
http://rogeralsing.com/2008/12/07/genetic-programming-evolution-of-mona-lisa/
Generation 904314
Generation 2716
Generation 301
Generation 1
63
This weekend I decided to play around a bit with genetic programming and put evolution to the test, the test of fine art 🙂
I created a small program that keeps a string of DNA for polygon rendering. The procedure of the program is quite simple:
0) Set up a random DNA string (application start)
1) Copy the current DNA sequence and mutate it slightly
2) Use the new DNA to render polygons onto a canvas
3) Compare the canvas to the source image
4) If the new painting looks more like the source image than the previous painting did, then overwrite the current DNA with the new DNA
5) Repeat from 1
Now to the interesting part 🙂
Could you paint a replica of the Mona Lisa using only 50 semi transparent polygons?
Took 904314 generations, though began to be recognizable as the Mona Lisa after 10,000 or so. Images above are generation 1, 301, 2716, and 904314.
63
Evolving the Mona Lisa
Uses only 50 polygons of 6 vertices each.
Population size of 1, no crossover – parent compared with child, and superior image kept.
Assuming each polygon has 4 bytes for color (RGBA) and 2 bytes for each of 6 vertices, this image only requires 800 bytes.
However, compression time is prohibitive and storage is cheaper than processing time.
64
More evolved images
http://rogeralsing.com/2008/12/07/genetic-programming-evolution-of-mona-lisa/
65
GA Example
Problem: N-queens.
Individual: array indicating column where ith queen is assigned.
Mating: Crossover.
Fitness (minimize): number of constraint violations.
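A sketch of this fitness function (the number of constraint violations to be minimized); the convention that assignment[i] is the column of the queen in row i follows the slide.

```python
def conflicts(assignment):
    """Count pairs of queens that attack each other (same column or same diagonal)."""
    n = len(assignment)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            same_column = assignment[i] == assignment[j]
            same_diagonal = abs(assignment[i] - assignment[j]) == j - i
            if same_column or same_diagonal:
                count += 1
    return count

print(conflicts([2, 4, 6, 8, 3, 1, 7, 5]))  # 0  -- a valid 8-queens placement
print(conflicts([1, 2, 3, 4, 5, 6, 7, 8]))  # 28 -- every pair shares a diagonal
```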
GA Example: Function Optimization (Exercise)
Let f(x, y) be the function to be optimized.
The domain for x and y is the real numbers between 0 and 10.
Say the hidden function is:
f(x, y) = 2 if x > 9 and y > 9
f(x, y) = 1 if x > 9 or y > 9
f(x, y) = 0 otherwise
GA Works Well here.
Individual = point = (x, y)
Mating: take something from each parent, so
mate({x, y}, {x', y'}) gives {x, y'} and {x', y}.
No mutation
Hill-climbing does poorly, GA does well.
This example generalizes to functions with large arity.
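A sketch of this exercise: the hidden step function and the coordinate-swapping mate operator described above.

```python
def f(x, y):                       # the hidden fitness function
    if x > 9 and y > 9:
        return 2
    if x > 9 or y > 9:
        return 1
    return 0

def mate(p1, p2):                  # something from each parent: swap coordinates
    (x, y), (x2, y2) = p1, p2
    return (x, y2), (x2, y)

child1, child2 = mate((9.5, 3.0), (2.0, 9.8))
print(child1, child2)              # (9.5, 9.8) (2.0, 3.0)
print(f(*child1), f(*child2))      # 2 0 -- one child combines the two good genes
```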
GA example
Step 1: Define the solutions (representation) to be used.
The solution is guaranteed to lie in the range 0 to 10. What is the representation of the solution?
69
Genetic Algorithms …
Step 2: Start creating an initial population of possible solutions randomly.
(0, 0)
(9, 10)
(5, 6)
(10, 9)
70
Genetic Algorithms …
Step 3: Evaluate each string by applying a fitness function.
The hidden function is used as the fitness:
(0, 0) —-> 0
(9, 10) —-> 1
(5, 6) —-> 0
(10, 9) —-> 1
71
Genetic Algorithms …
Step 4: Search for new strings by exchanging genetic materials (i.e. crossover).
Assume that Tournament selection (strings are randomly paired) is used. Assume crossover probability Pc is satisfied.
(0, 0) crossover (1 & 2) (0, 10)
(9, 10) (9, 0)
(5, 6) “ (3 & 4) (5, 9)
(10, 9) (10, 6)
72
Genetic Algorithms …
Step 4: Search for new strings by exchanging genetic materials (i.e. crossover and mutation).
Assume mutation probability Pm is satisfied.
(0, 10) mutation (10, 10)
(9, 0) “ (8, 0)
(5, 9) “ (5, 10)
(10, 6) “ (6, 6)
73
Genetic Algorithms …
Step 5: Select the more promising strings (reproduction).
Assume that Elite selection (strings with good fitness survive) is used.
(10, 10) fitness value 2
( 8, 0) “ 0
( 5, 10) “ 1
( 6, 6) “ 0
74
Genetic Algorithms …
Step 5: Select the more promising strings (reproduction).
Assume that Elite selection (strings with good fitness survive) is used.
(10, 10) —> 2
(5, 10) —> 1
(8, 0) —> 0
(6, 6) —> 0
75
Step 5: Select the more promising strings (reproduction).
Select the two best strings and randomly generate another two strings to maintain the same population size.
(10, 10) (Best fitness 2)
( 5, 10) (second best fitness 1)
( 9, 9) new string
( 8, 9) new string
Genetic Algorithms …
76
Step 6: Check for terminating conditions. If no termination condition is met, return to step 3.
If the iteration (number of generations) is terminated, (10, 10) would be the solution.
Otherwise, the algorithm continues.
Genetic Algorithms …
77
Conclusion
Basic algorithm (recap)
Create an initial population, either random or other methods.
While the best candidate so far is not a solution:
Create new population using successor functions.
Evaluate the fitness of each candidate in the population.
Return the best candidate found.
79
Genetic Algorithms
Step 1: Define the strings (solutions) to be used.
Step 2: Start creating an initial population of possible strings.
Step 3: Evaluate each string by applying a fitness function.
Step 4: Search for new strings by exchanging genetic materials (i.e. crossover and mutation).
Step 5: Select the more promising strings (reproduction).
Step 6: Check for terminating conditions. If no termination condition is met, return to step 3.
80
Genetic Algorithms
Another class of iterative improvement algorithms
A genetic algorithm maintains a population of candidate solutions for the problem at hand, and makes it evolve by iteratively applying a set of stochastic operators.
Inspired by the biological evolution process
Uses concepts of “Natural Selection” and “Genetic Inheritance” (Darwin 1859)
Originally developed by John Holland (1975)
Pros and Cons
Pros
Faster (and with lower memory requirements) than exhaustively searching a very large search space.
Easy, in that if your candidate representation and fitness function are correct, a solution can be found without any explicit analytical work.
Cons
Randomized – not optimal or even complete.
Can get stuck on local maxima, though crossover can help mitigate this.
It can be hard to work out how best to represent a candidate as a bit string (or otherwise).
82
When to Use Artificial Evolution
Artificial evolution is better employed in situations where conventional optimization techniques cannot find a satisfactory solution, for example when the function to be optimized is discontinuous, nondifferentiable, and/or presents too many nonlinearly related parameters.
In other words, artificial evolution is often used on hard problems where other optimization methods fail or are trapped in suboptimal solutions.
Differences from Traditional Methods
Work with a coding of the parameter set, not the parameters themselves.
Search from a population of points.
Use probabilistic transition rules (Pc and Pm).
Use objective (fitness) function values, not derivatives or other auxiliary knowledge.
Natural vs. Artificial Evolution
In natural evolution the fitness of an individual is defined by its reproductive success (number of offspring).
In artificial evolution the fitness of an individual is a function that measures how well that individual solves a predefined problem.
Applications of GAs
Dynamic process control
Induction and optimization of rules
Discovering new connectivity topologies
Simulating biological models of behavior and evolution
Complex design of engineering structures
Pattern recognition
Scheduling
Transportation
Layout and circuit design
Telecommunication
Graph-based problems
Business Applications
Schedule Assembly lines at Volvo Truck North America
Channel 4 Television (England) to schedule commercials
Driver scheduling in a public transportation system
Jobshop scheduling
Assignment of destinations to sources
Trading stocks
Productivity in whisky-making is increased
Genetic algorithms are often hybridized with other AI methods
Some published success: Scheduling
Manufacturing scheduling is a major area of application
Optimax acquired by i2 in 1997 for $53M US
Newsweek, Forbes, Wall Street Journal, etc. articles on scheduling successes with GAs
Barcelona Special Olympics successfully scheduled for the first time with a GA
Some published success: Telecommunications
US West claimed in Newsweek that it could save $100M US by 2010 as a result of using EA-based network design system (but this won’t be true, for interesting reasons)
Cox Associates saving millions for cellular clients using GAs (Davis, Cox, et al 1998)
Some published success: Design
GE designed a better turbine using ENGENEOUS
Chuck Karr’s air-injected hydrocyclone
Semiconductor design using GAs
Compiler code evolved using an EA at Microsoft
Molecular design using GAs to predict conformation
Genetic Algorithms (GAs) and
Genetic Programming (GP)
Genetic Algorithms
Optimising parameters for problem solving
Fix a format for the parameters in a problem solution
Represent the parameters in the solution(s)
As a “bit” string normally, but often something else
Evolve answers in this representation
Genetic Programming
Representation of solutions is richer in general
Solutions can be interpreted as programs
Evolutionary process is very similar
Genetic algorithms
They are good when your search technique seems to be going nowhere!
They are good when your goal seems to have changed.
References
Bio-Inspired Artificial Intelligence by D. Floreano and C. Mattiussi, The MIT Press, 2008.
Questions & Suggestions?
The End
94
Appendix
Example: The Knapsack Problem
You are going on an overnight hike and have a number of items that you could take along. Each item has a weight (in pounds) and a benefit or value to you on the hike (for measurement's sake let's say, in US dollars), and you can take one of each item at most. There is a capacity limit on the weight you can carry (constraint). This problem only illustrates one constraint, but in reality there could be many constraints including volume, time, etc.
Item: 1 2 3 4 5 6 7
Benefit: 5 8 3 2 7 9 4
Weight: 7 8 4 10 4 6 4
Knapsack holds a maximum of 22 pounds
Fill it to get the maximum benefit
Solutions take the form of a string of 1’s and 0’s
Solutions: Also known as strings of genes called Chromosomes
1. 0101010
2. 1101100
3. 0100111
GA Example: The Knapsack Problem
Knapsack Example
Typically, a string of 1s and 0s can represent a solution.
Possible solutions generated by the system using Reproduction, Crossover, or Mutations
1. 0101010
2. 1101100
3. 0100111
Knapsack Example
Solution 1
Benefit 8 + 2 + 9 = 19
Weight 8 + 10 + 6 = 24
Item 1 2 3 4 5 6 7
Solution 0 1 0 1 0 1 0
Benefit 5 8 3 2 7 9 4
Weight 7 8 4 10 4 6 4
Knapsack Example
Solution 2
Benefit 5 + 8 + 7 = 20
Weight 7 + 8 + 4 = 19
Item 1 2 3 4 5 6 7
Solution 1 1 0 0 1 0 0
Benefit 5 8 3 2 7 9 4
Weight 7 8 4 10 4 6 4
Knapsack Example
Solution 3
Benefit 8 + 7 + 9 + 4 = 28
Weight 8 + 4 + 6 + 4 = 22
Item 1 2 3 4 5 6 7
Solution 0 1 0 0 1 1 1
Benefit 5 8 3 2 7 9 4
Weight 7 8 4 10 4 6 4
Knapsack Example
Solution 3 is clearly the best solution and has met our conditions; therefore, items 2, 5, 6, and 7 will be taken on the hiking trip. We will be able to get the most benefit out of these items while still having a weight equal to 22 pounds.
This is a simple example illustrating a genetic algorithm approach.
The Knapsack Problem
The knapsack problem, though simple, has many important applications including determining what items to take on a space shuttle mission.
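A sketch of a fitness function for these knapsack strings: total benefit of the selected items, with over-capacity solutions scored as 0. How to penalize infeasible solutions is a design choice, not something specified on the slides.

```python
BENEFIT = [5, 8, 3, 2, 7, 9, 4]
WEIGHT = [7, 8, 4, 10, 4, 6, 4]
CAPACITY = 22  # pounds

def fitness(solution):
    """solution is a string of 0/1 characters, one bit per item."""
    benefit = sum(b for bit, b in zip(solution, BENEFIT) if bit == "1")
    weight = sum(w for bit, w in zip(solution, WEIGHT) if bit == "1")
    return benefit if weight <= CAPACITY else 0

print(fitness("0101010"))  # 0  -- weight 24 exceeds the 22-pound limit
print(fitness("0100111"))  # 28 -- weight exactly 22, the best solution above
```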
Another Problem
The Maze search as a genetic algorithm….
zeus.csci.unt.edu/swigger/csce3210/genetic-maze.doc
Baldwin Effect
In biology, the “Baldwin effect” is the result of the interaction of evolution with learning by individual animals over their lifetime. It turns out that individual learning can enhance evolutionary learning at the species level.
The effect is named after J. Mark Baldwin, who described it in 1896.
GA Discussion
Reported to work well on some problems.
Typically not compared with other approaches, e.g. hill-climbing with restarts.
Opinion: Works if the “mating” operator captures good substructures.
Any ideas for GA on TSP?
A good site example
http://www.obitko.com/tutorials/genetic-algorithms/tsp-example.php
Natural vs. Artificial Evolution
Natural evolution does not have a predefined goal and is essentially an open-ended adaptation process.
Artificial evolution is an optimization process that attempts to find solutions to predefined problems.
GA: An example
Optimization problem:
Maximize the function x2 on the integer interval [0, 31]
Traditional methods: we would be tempted to twiddle with the parameter x, turning it like the vertical hold knob on a television set, until we reach the highest objective function value.
GA: the first step of our optimization process is to code the parameter x as a finite-length string. (There are many ways to code the x parameter. At the moment, let’s consider an optimization problem where the coding comes a bit more naturally.)
109
Genetic Algorithms
A simulation by hand
Maximize the function f(x) = x² on the integer interval [0, 31].
Code the decision variables of our problem as a finite-length string.
For this problem, we will code the variable x simply as a binary unsigned integer of length 5.
110
Genetic Algorithms
Briefly review the notion of a binary integer.
Base 10 place value: 53,095 = 5×10⁴ + 3×10³ + 0×10² + 9×10¹ + 5×10⁰.
Base 10 value of the binary number 10011: 1×2⁴ + 0×2³ + 0×2² + 1×2¹ + 1×2⁰ = 16 + 2 + 1 = 19.
With a five-bit (binary digit) unsigned integer we can represent numbers between 0 (00000) and 31 (11111).
111
Genetic Algorithms
String No. | Initial Population (Randomly Generated) | x value (Unsigned Integer) | f(x) = x² | pselect = f_i/Σf | Expected count f_i/f̄ | Actual Count (from Roulette Wheel)
1 | 01101 | 13 | 169 | 0.14 | 0.58 | 1
2 | 11000 | 24 | 576 | 0.49 | 1.97 | 2
3 | 01000 | 8 | 64 | 0.06 | 0.22 | 0
4 | 10011 | 19 | 361 | 0.31 | 1.23 | 1
Sum | | | 1170 | 1.00 | 4.00 | 4.0
Average | | | 293 | 0.25 | 1.00 | 1.0
Max | | | 576 | 0.49 | 1.97 | 2.0
112
Genetic Algorithms
Mating pool after reproduction (cross site shown) · Mate (randomly selected) · Crossover site (randomly selected) · New population · x value · f(x) = x²
0110|1 · 2 · 4 · 01100 · 12 · 144
1100|0 · 1 · 4 · 11001 · 25 · 625
11|000 · 4 · 2 · 11011 · 27 · 729
10|011 · 3 · 2 · 10000 · 16 · 256
Sum: 1754 · Average: 439 · Max: 729
113
A Simple Example
“The Gene is by far the most sophisticated program around.”
– Bill Gates, Business Week, June 27, 1994
A Simple Example
The Traveling Salesman Problem (TSP):
Find a tour of a given set of cities so that
each city is visited only once
the total distance traveled is minimized
Representation
Representation is an ordered list of city numbers, known as an order-based GA.
1) London 3) Dunedin 5) Beijing 7) Tokyo
2) Venice 4) Singapore 6) Phoenix 8) Victoria
CityList1 (3 5 7 2 1 6 4 8)
CityList2 (2 5 7 6 8 1 3 4)
Crossover
Crossover combines inversion and recombination:
* *
Parent1 (3 5 7 2 1 6 4 8)
Parent2 (2 5 7 6 8 1 3 4)
Child (5 8 7 2 1 6 3 4)
This operator is called the Order1 crossover.
Mutation involves reordering of the list:
* *
Before: (5 8 7 2 1 6 3 4)
After: (5 8 6 2 1 7 3 4)
Mutation
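A sketch of order-based fitness and mutation for this TSP representation: tour length is the quantity to minimize, and mutation swaps two randomly chosen positions so the offspring always remains a valid tour. The city coordinates below are illustrative, not the 30-city data set used on the next slides.

```python
import math
import random

CITIES = {1: (0, 0), 2: (1, 5), 3: (4, 4), 4: (6, 1),
          5: (3, 0), 6: (5, 5), 7: (2, 2), 8: (7, 3)}   # illustrative coordinates

def tour_length(tour):
    """Total distance of the closed tour visiting the cities in order."""
    return sum(math.dist(CITIES[a], CITIES[b])
               for a, b in zip(tour, tour[1:] + tour[:1]))

def mutate(tour):
    """Swap the cities at two randomly chosen positions (keeps the tour valid)."""
    i, j = random.sample(range(len(tour)), 2)
    child = list(tour)
    child[i], child[j] = child[j], child[i]
    return child

before = [5, 8, 7, 2, 1, 6, 3, 4]
print(round(tour_length(before), 1))
print(mutate(before))               # e.g. [5, 8, 6, 2, 1, 7, 3, 4], as on the slide
```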
TSP Example: 30 Cities
Solution i (Distance = 941)
Solution j (Distance = 800)
Solution k (Distance = 652)
Best Solution (Distance = 420)
Overview of Performance
Search Space
For a simple function f(x) the search space is one dimensional.
But by encoding several values into the chromosome many dimensions can be searched e.g. two dimensions f(x,y)
The search space can be visualised as a surface or fitness landscape in which fitness dictates height
Each possible genotype is a point in the space
A GA tries to move the points to better places (higher fitness) in the space
Fitness landscapes
Search Space
Obviously, the nature of the search space dictates how a GA will perform
A completely random space would be bad for a GA
Also GA’s can get stuck in local maxima if search spaces contain lots of these
Generally, spaces in which small improvements get closer to the global optimum are good
Selecting Parents
Many schemes are possible, so long as better-scoring chromosomes are more likely to be selected
Score is often termed the fitness
“Roulette Wheel” selection can be used:
Add up the fitnesses of all chromosomes
Generate a random number R in that range
Select the first chromosome in the population that – when all previous fitnesses are added – gives you at least the value R
Example population
No. Chromosome Fitness
1 1010011010 1
2 1111100001 2
3 1011001100 3
4 1010000000 1
5 0000010000 3
6 1001011111 5
7 0101010101 1
8 1011100111 2
Roulette Wheel Selection
Fitness values 1, 2, 3, 1, 3, 5, 1, 2 for chromosomes 1–8 (total 18), laid out on the wheel
Rnd[0..18] = 7  → Chromosome 4 → Parent 1
Rnd[0..18] = 12 → Chromosome 6 → Parent 2
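A sketch of this selection procedure, reproducing the two draws shown for the example population; in practice the value R would come from random.uniform(0, total).

```python
FITNESS = [1, 2, 3, 1, 3, 5, 1, 2]   # chromosomes 1..8, total fitness 18

def roulette_select(fitnesses, r):
    """Return the 1-based index of the first chromosome whose running
    fitness total reaches at least r."""
    running = 0
    for i, f in enumerate(fitnesses, start=1):
        running += f
        if running >= r:
            return i
    return len(fitnesses)

print(roulette_select(FITNESS, 7))    # 4 -> Chromosome 4 becomes Parent 1
print(roulette_select(FITNESS, 12))   # 6 -> Chromosome 6 becomes Parent 2
```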
When to Use a GA
Alternate solutions are too slow or overly complicated
Need an exploratory tool to examine new approaches
Problem is similar to one that has already been successfully solved by using a GA
Want to hybridize with an existing solution
Benefits of the GA technology meet key problem requirements
Many Variants of GA
Different kinds of selection (not roulette)
Tournament
Elitism, etc.
Different recombination
Multi-point crossover
3 way crossover etc.
Different kinds of encoding other than bitstring
Integer values
Ordered set of symbols
Different kinds of mutation
Many parameters to set
Any GA implementation needs to decide on a number of parameters: Population size (N), mutation rate (m), crossover rate (c)
Often these have to be “tuned” based on results obtained – no general theory to deduce good values
Typical values might be: N = 50, m = 0.05, c = 0.9
Why does crossover work?
A lot of theory about this and some controversy
Holland introduced “Schema” theory
The idea is that crossover preserves “good bits” from different parents, combining them to produce better solutions
A good encoding scheme would therefore try to preserve “good bits” during crossover and mutation
Genetic Programming
When the chromosome encodes an entire program or function itself this is called genetic programming (GP)
In order to make this work encoding is often done in the form of a tree representation
Crossover entails swapping subtrees between parents
Genetic Programming
It is possible to evolve whole programs like this but only small ones. Large programs with complex functions present big problems
Implicit fitness functions
Most GA’s use explicit and static fitness function (as in our “oil” example)
Some GA’s (such as in Artificial Life or Evolutionary Robotics) use dynamic and implicit fitness functions – like “how many obstacles did I avoid”
In these latter examples other chromosomes (robots) affect the fitness function
Problem
In the Travelling Salesman Problem (TSP) a salesman has to find the shortest distance journey that visits a set of cities
Assume we know the distance between each city
This is known to be a hard problem to solve because the number of possible routes is N! where N = the number of cities
There is no simple algorithm that gives the best answer quickly
Problem
Design a chromosome encoding, a mutation operation and a crossover function for the Travelling Salesman Problem (TSP)
Assume number of cities N = 10
After all operations the produced chromosomes should always represent valid possible journeys (visit each city once only)
There is no single answer to this, many different schemes have been used previously
[Embedded equation and chart objects: f(x) = x²; pselect_i = f_i / Σ_j f_j; the decimal and binary place-value expansions from the "binary integer" slide; TSP30 tour charts (performance 941, 800, 652, 420) and the best/worst/average distance per generation overview.]