DIFFERENTIAL EVOLUTION
DR H.K. LAM
Department of Engineering
King’s College London
Office S2.14, Strand Building, Strand Campus
Email: hak-keung.lam@kcl.ac.uk
https://nms.kcl.ac.uk/hk.lam
Nature-Inspired Learning Algorithms (7CCSMBIM)
Outline
1 Introduction
2 Basic Differential Evolution
DE/x/y/z
3 Variations to Basic Differential Evolution
Switching DE strategies
Hybrid DE strategies
Gradient-Based Hybrid Differential Evolution
Evolutionary Algorithm-Based Hybrids
Particle Swarm Optimization Hybrids
Self-Adaptive Differential Evolution
Learning Objectives
To understand the concept of Differential Evolution and how it works.
To get an idea of the various versions of Differential Evolution.
Introduction
Differential Evolution (DE) is a stochastic, population-based search strategy developed by Storn and Price in 1995.
The DE strategy is a branch of Evolutionary Algorithms (EAs).
The main difference is that distance and direction information of the population (obtained from difference vectors) is used to guide the search process.
It was originally developed for continuous-valued landscapes.
Introduction
Notation
xi(t) = [xi1(t), ..., xinx(t)]: the ith individual in the population
nx: number of elements (decision variables) in each individual
ns: size of the population (number of particles in the swarm)
C(t): population in the tth generation/iteration
ui(t): trial vector
xi1(t): target vector, i ≠ i1
xi2(t) − xi3(t): difference vector, i ≠ i1 ≠ i2 ≠ i3; i2, i3 ∼ UI(1, ns)
xi2(t), xi3(t): two randomly selected individuals
β ∈ (0, ∞): scale factor
UI(1, ns) ∈ {1, 2, ..., ns}: a random integer variable in the range 1 to ns
U(0, 1) ∈ [0, 1]: a uniform random variable in the range 0 to 1
pr ∈ [0, 1]: probability of crossover/recombination
xmin = [xmin,1, xmin,2, ..., xmin,nx]: lower search boundary, a vector of constants denoting the lower bound of xi(t)
xmax = [xmax,1, xmax,2, ..., xmax,nx]: upper search boundary, a vector of constants denoting the upper bound of xi(t)
x̂(t): the best individual in the population at generation t
ŷ(t) = [ŷ1(t), ..., ŷnx(t)]: the global best position since the first generation
r1j(t), r2j(t) ∈ [0, 1]: random numbers
vi = [vi1, ..., vinx]: a velocity vector
t: iteration/generation number
Basic Differential Evolution
Figure 1: Flowchart of basic differential evolution:
1. Initialise parameters β and pr, the counter t, and the population C(0).
2. Evaluate the cost of each individual.
3. Mutation.
4. Recombination (crossover).
5. Selection.
6. Check whether the stopping criteria are met; if not, repeat the evaluation, mutation, recombination and selection cycle; if so, return the best x(t) with the best cost.
Basic Differential Evolution
Basic Components
Population: a group of potential (candidate) solutions.
Mutation: it produces a trial vector for each selected individual by mutating a target vector with a weighted difference vector.
Crossover: it produces an offspring by applying the crossover operation to the trial vector produced by the mutation operation.
Selection: it determines whether the parent or the offspring survives to the next generation.
Basic Differential Evolution
Differences between Differential Evolution and other evolutionary algorithms
Mutation is applied first before crossover. Mutation generates a trial vector which is then used within the crossover operator to produce one offspring.
Mutation “step sizes” are not sampled from a prior known probability distribution function (for example, normal distribution) but are influenced by differences between individuals of the current population (difference vectors).
Crossover involves one single parent (individual) and its trial vector. Crossover generates one offspring only.
Each parent (individual) produces its own offspring.
Basic Differential Evolution
Population
x1 = [x11  x12  ···  x1nx]
x2 = [x21  x22  ···  x2nx]
...
xns = [xns1  xns2  ···  xnsnx]

where ns is the size of the population and nx is the number of decision variables in an individual.

C(t) denotes the population at the tth generation/iteration, e.g., C(t) = [x1(t); x2(t); ...; xns(t)], a matrix whose ith row is xi(t).
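The initial population C(0) is typically created by sampling each element uniformly within the search boundaries. A minimal NumPy sketch (the names n_s, n_x, x_min, x_max and rng mirror the notation above and are otherwise illustrative):

```python
import numpy as np

def init_population(n_s, n_x, x_min, x_max, rng=None):
    """Return an (n_s, n_x) array C(0); row i holds individual x_i(0)."""
    rng = rng or np.random.default_rng()
    x_min = np.asarray(x_min, dtype=float)
    x_max = np.asarray(x_max, dtype=float)
    # Each element x_ij is drawn uniformly from [x_min_j, x_max_j].
    return x_min + rng.uniform(size=(n_s, n_x)) * (x_max - x_min)
```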
Basic Differential Evolution
Selection for mutation
1. Random selection for target/difference vectors
2. The best individual is selected as the target vector.
Selection for the next population: The offspring replaces the parent if the fitness of the offspring is better than its parent.
Basic Differential Evolution
Mutation for each parent
ui(t) = xi1(t) + β(xi2(t) − xi3(t))

where
ui(t): trial vector
xi1(t): target vector, i ≠ i1
xi2(t) − xi3(t): difference vector, i ≠ i1 ≠ i2 ≠ i3; i2, i3 ∼ UI(1, ns)
xi2(t), xi3(t): two randomly selected individuals
β ∈ (0, ∞): scale factor
UI(1, ns): a random integer variable in the range 1 to ns
More than one difference vector can be used.
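A minimal sketch of this mutation with randomly selected target and difference vectors, assuming the population is stored as an (ns, nx) NumPy array (the function name and rng handling are illustrative):

```python
import numpy as np

def mutate(pop, i, beta, rng=None):
    """Create the trial vector u_i(t) = x_i1(t) + beta * (x_i2(t) - x_i3(t))."""
    rng = rng or np.random.default_rng()
    n_s = pop.shape[0]
    # Pick three distinct indices i1, i2, i3, all different from i.
    candidates = [k for k in range(n_s) if k != i]
    i1, i2, i3 = rng.choice(candidates, size=3, replace=False)
    return pop[i1] + beta * (pop[i2] - pop[i3])
```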
Basic Differential Evolution

Figure 2: Mutation operation with β = 1.5 (panel (a): Mutation Operator), showing the difference vector xi2(t) − xi3(t), the scaled difference vector β(xi2(t) − xi3(t)), and the resulting trial vector ui(t).
Basic Differential Evolution
Difference vectors
The positions of the individuals provide information about the fitness landscape.
Large distance between individuals: individuals should make large step sizes in order to explore as much of the search space as possible.
Small distance between individuals: individuals should make step sizes small to exploit local areas.
Basic Differential Evolution
Crossover: it produces an offspring x′i(t) by implementing a discrete recombination of the trial vector ui(t) and the parent vector xi(t):

x′ij(t) = uij(t) if j ∈ J
x′ij(t) = xij(t) otherwise

where
xij(t): the jth element of the parent vector xi(t), i = 1, 2, ..., ns, j = 1, 2, ..., nx
x′ij(t): the jth element of the offspring vector x′i(t)
uij(t): the jth element of the trial vector ui(t)
J: the set of element indices that will undergo crossover (the set of crossover points)
Example: nx = 3, J = {1, 3}
ui(t) = [ui1(t), ui2(t), ui3(t)]
xi(t) = [xi1(t), xi2(t), xi3(t)]
x′i(t) = [ui1(t), xi2(t), ui3(t)]
Basic Differential Evolution
Binomial crossover: The crossover points are randomly selected from the set of {1, 2, …, nx}
Algorithm 1: Binomial Crossover for Selecting Crossover Points
J ← { };
j* ∼ UI(1, nx);
J ← J ∪ {j*};
for each j ∈ {1, 2, ..., nx}\{j*} do
    if U(0,1) < pr then
        J ← J ∪ {j};
    end
end
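A minimal sketch combining the crossover-point selection of Algorithm 1 with the discrete recombination rule above, assuming NumPy vectors (the function name and rng handling are illustrative):

```python
import numpy as np

def binomial_crossover(x_i, u_i, p_r, rng=None):
    """Return offspring x'_i: elements in the crossover set J come from u_i."""
    rng = rng or np.random.default_rng()
    n_x = x_i.size
    # Every index enters J with probability p_r ...
    in_J = rng.uniform(size=n_x) < p_r
    # ... and one index j* is always included, so at least one element changes.
    in_J[rng.integers(n_x)] = True
    return np.where(in_J, u_i, x_i)
```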
where ε1 > 0 and ε2 > 0 are, respectively, the tolerances for the population diversity and the gene diversity with respect to the best individual, x̂(t).

x1 = [x11  x12  ···  x1j  ···  x1nx]
x2 = [x21  x22  ···  x2j  ···  x2nx]
...
xi = [xi1  xi2  ···  xij  ···  xinx]
...
x̂ = [x̂1  x̂2  ···  x̂j  ···  x̂nx]
...
xns = [xns1  xns2  ···  xnsj  ···  xnsnx]

The migration operator is applied only when the diversity of the current population becomes too small, i.e., when

(I11(t) + I12(t) + ··· + Iij(t) + ··· + Insnx(t)) / (nx(ns − 1)) < ε1

where the sum is taken over all individuals xi(t) ≠ x̂(t) (the best individual is excluded).
Evolutionary Algorithm-Based Hybrids
Three variations:
1. Use the DE reproduction process as a crossover operator in a simple GA.
2. Standard Evolutionary Algorithm mutation operators are used to increase DE population diversity by adding noise to the created trial vectors (see the sketch after this list):

uij(t) = uij(t) + U(umin,j, umax,j)

where umin,j and umax,j denote the boundaries of the jth element of the added noise.
3. Rank-based crossover and mutation operators:
Rank-based selection is used to decide which individuals take part in calculating difference vectors.
At each generation, the cost of each individual in the population is evaluated after the crossover and mutation operations.
Individuals are arranged in ascending order: x1, x2, ..., xns where f(x1) ≤ f(x2) ≤ ··· ≤ f(xns) (assuming a minimisation problem).
Note: the crossover operation is performed before the mutation operation.
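A minimal sketch of the noise-based diversification in variation 2, assuming u_min and u_max are vectors of length nx giving the element-wise noise bounds (names illustrative):

```python
import numpy as np

def add_noise(u_i, u_min, u_max, rng=None):
    """Perturb a trial vector with uniform noise to increase diversity."""
    rng = rng or np.random.default_rng()
    # Element j receives noise drawn from U(u_min[j], u_max[j]).
    return u_i + rng.uniform(np.asarray(u_min), np.asarray(u_max))
```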
Evolutionary Algorithm-Based Hybrids
Algorithm 6: Rank-Based Crossover Operator for Differential Evolution
Rank all individuals in ascending order of cost (assuming minimisation);
for i = 1, 2, ..., ns do
    r ∼ U(0,1);
    x′i(t) = xi(t) + r(xi+1(t) − xi(t));
    if f(x′i(t)) < f(xi+1(t)) then
        xi(t) = x′i(t);
    end
end
When i = ns, use x1(t) as xi+1(t).
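A minimal sketch of Algorithm 6 as reconstructed above, assuming pop is an (ns, nx) NumPy array and f is the cost function (both names illustrative); the acceptance rule follows the reading given above:

```python
import numpy as np

def rank_based_crossover(pop, f, rng=None):
    """Rank-based crossover: blend each individual toward the next-ranked one."""
    rng = rng or np.random.default_rng()
    n_s = pop.shape[0]
    # Rank individuals in ascending order of cost (best first, assuming minimisation).
    pop = pop[np.argsort([f(x) for x in pop])]
    for i in range(n_s):
        nxt = pop[(i + 1) % n_s]   # when i = ns, use x_1 as x_{i+1}
        r = rng.uniform()
        offspring = pop[i] + r * (nxt - pop[i])
        # Accept the offspring into position i if it beats the next-ranked individual.
        if f(offspring) < f(nxt):
            pop[i] = offspring
    return pop
```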
Evolutionary Algorithm-Based Hybrids
Algorithm 7: Rank-Based Mutation Operator for Differential Evolution
Rank all individuals in ascending order of cost (assuming minimisation);
for i = 1, 2, ..., ns do
    pm,i = (ns − i + 1)/ns;
    for j = 1, 2, ..., nx do
        r1 ∼ U(0,1);
        if r1 > pm,i then
            r2 ∼ {0,1};
            r3 ∼ U(0,1);
            if r2 = 0 then
                x′ij(t) = xij(t) + (xmax,j − xij(t)) r3 e^(−2t/nt);
            else
                x′ij(t) = xij(t) − (xij(t) − xmin,j) r3 e^(−2t/nt);
            end
        end
    end
end

xi is the xi after crossover
t is the current generation number
nt is the maximum number of generations
r2 ∼ {0, 1} means r2 randomly takes either 0 or 1
Remark: Elitism is implemented, i.e., x1 does not mutate since pm,1 = 1 (r1 > pm,1 will never be satisfied in the above algorithm).
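A minimal sketch of Algorithm 7, assuming pop holds the individuals after the rank-based crossover, already sorted in ascending order of cost, with t the current generation and n_t the maximum number of generations (names illustrative):

```python
import numpy as np

def rank_based_mutation(pop, x_min, x_max, t, n_t, rng=None):
    """Rank-based mutation: lower-ranked (worse) individuals mutate more often."""
    rng = rng or np.random.default_rng()
    n_s, n_x = pop.shape
    out = pop.copy()
    for i in range(n_s):
        # i is 0-based here, so p_m = (ns - i + 1)/ns; p_m = 1 for the best individual.
        p_m = (n_s - i) / n_s
        for j in range(n_x):
            if rng.uniform() > p_m:       # mutate with probability 1 - p_m
                r3 = rng.uniform()
                step = r3 * np.exp(-2.0 * t / n_t)
                if rng.integers(2) == 0:  # r2 = 0: move toward the upper bound
                    out[i, j] += (x_max[j] - pop[i, j]) * step
                else:                     # r2 = 1: move toward the lower bound
                    out[i, j] -= (pop[i, j] - x_min[j]) * step
    return out
```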
Evolutionary Algorithm-Based Hybrids
Algorithm 8: Rank-Based Differential Evolution
Set the generation counter, t = 0;
Initialise the control parameters, β and pr;
Create and initialise the population, C(0), of ns individuals;
while stopping condition(s) not true do
    for each individual, xi(t) ∈ C(t) do
        Evaluate the fitness, f(xi(t));
        Perform the rank-based crossover operation in Algorithm 6;
        Perform the rank-based mutation operation in Algorithm 7;
        if f(x′i(t)) is better than f(xi(t)) then
            Add x′i(t) to C(t+1);
        else
            Add xi(t) to C(t+1);
        end
    end
    t ← t + 1;
end
Return the individual with the best fitness as the solution;
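For illustration only, a compact driver in the spirit of Algorithm 8 that reuses the init_population, rank_based_crossover and rank_based_mutation sketches above; it is a sketch under those assumptions, not a definitive implementation:

```python
import numpy as np

def rank_based_de(f, x_min, x_max, n_s=30, n_t=200, rng=None):
    """Minimise f over [x_min, x_max] with rank-based crossover and mutation."""
    rng = rng or np.random.default_rng()
    pop = init_population(n_s, len(x_min), x_min, x_max, rng)
    for t in range(n_t):
        # Sort once per generation so parents and offspring stay index-aligned.
        pop = pop[np.argsort([f(x) for x in pop])]
        offspring = rank_based_crossover(pop.copy(), f, rng)
        offspring = rank_based_mutation(offspring, x_min, x_max, t, n_t, rng)
        # Greedy selection: an offspring replaces its parent only if it is better.
        pop = np.array([o if f(o) < f(p) else p for p, o in zip(pop, offspring)])
    return min(pop, key=f)
```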
Particle Swarm Optimization Hybrids
Two variations:
1. Switching between Particle Swarm Optimisation and Differential Evolution strategies by sharing the same population.
2. DE-based PSO:
Update the personal best using a DE mutation operation:
y′ij(t) = ŷij(t) + 0.5(z1j(t) − z2j(t))   if U(0,1) ≤ pm
y′ij(t) = yij(t)                           otherwise

where i = 1, 2, ..., ns; j = 1, 2, ..., nx; yi(t) = [yi1(t), yi2(t), ..., yinx(t)] denotes the personal best; ŷi(t) = [ŷi1(t), ŷi2(t), ..., ŷinx(t)] denotes the local or global best; z1(t) = [z11(t), z12(t), ..., z1nx(t)] and z2(t) = [z21(t), z22(t), ..., z2nx(t)] denote the randomly selected personal best positions.
The offspring y′i(t) replaces the personal best yi(t) if f(y′i(t)) is better than f(yi(t)).
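A minimal sketch of this DE-based personal-best trial for a single particle, assuming pbests is the (ns, nx) array of personal bests, y_hat the local or global best for that particle, and p_m the mutation probability (all names illustrative):

```python
import numpy as np

def de_personal_best_update(y_i, y_hat, pbests, p_m, rng=None):
    """DE-style trial vector for one particle's personal best y_i."""
    rng = rng or np.random.default_rng()
    n_s, n_x = pbests.shape
    # Two randomly selected personal bests z1 and z2.
    k1, k2 = rng.choice(n_s, size=2, replace=False)
    z1, z2 = pbests[k1], pbests[k2]
    # Element-wise: use the DE-mutated best with probability p_m, otherwise keep y_i.
    mask = rng.uniform(size=n_x) <= p_m
    return np.where(mask, y_hat + 0.5 * (z1 - z2), y_i)
```

The returned trial would then replace yi(t) only if its cost f(y′i(t)) is better, as stated above.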