
Evolutionary Algorithms

Evolutionary operators

Piotr Lipiński

lipinski@ii.uni.wroc.pl



Examples of EAs

SSGA

SGA

CGA

PBIL

ECGA

GAs for graph and object partitioning



General schema of the EA

EVOLUTIONARY-ALGORITHM(F, N)
    P ← RANDOM-POPULATION(N);
    POPULATION-EVALUATION(P, F);
    while not TERMINATION-CONDITION(P) do
        EVOLUTIONARY-OPERATOR-1(P);
        EVOLUTIONARY-OPERATOR-2(P);
        EVOLUTIONARY-OPERATOR-3(P);
        ...
        EVOLUTIONARY-OPERATOR-K(P);
        POPULATION-EVALUATION(P, F);

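A minimal Python sketch of this schema; the fixed iteration budget as the termination condition, the binary representation and the convention that each operator takes the population (plus current fitness values) and returns a new population are illustrative assumptions, not part of the slide.

import random

def random_individual(length=20):
    """Random binary chromosome (illustrative choice of representation)."""
    return [random.randint(0, 1) for _ in range(length)]

def evolutionary_algorithm(F, N, operators, max_iterations=100):
    """Generic EA loop: evaluate, apply the operators in order, re-evaluate."""
    population = [random_individual() for _ in range(N)]      # RANDOM-POPULATION
    fitness = [F(x) for x in population]                      # POPULATION-EVALUATION
    for _ in range(max_iterations):                           # TERMINATION-CONDITION
        for operator in operators:                            # EVOLUTIONARY-OPERATOR-1..K
            population = operator(population, fitness)
        fitness = [F(x) for x in population]                  # POPULATION-EVALUATION
    return max(zip(fitness, population))                      # best (fitness, individual)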


Benchmark functions

OneMax – the number of ones in the binary vector

Pattern – the number of positions where the binary vector is identical to a given pattern

DeceptiveOneMax – the number of ones in the binary vector, except for the all-zeros vector, whose value is the length of the vector plus one

K-DeceptiveOneMax – the sum of DeceptiveOneMax over successive blocks of K coordinates of the binary vector

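The benchmarks are straightforward to express in Python; a sketch, with the pattern p and the block size k passed as arguments:

def one_max(x):
    """Number of ones in the binary vector."""
    return sum(x)

def pattern(x, p):
    """Number of positions where x is identical to the pattern p."""
    return sum(1 for xi, pi in zip(x, p) if xi == pi)

def deceptive_one_max(x):
    """OneMax, except that the all-zeros vector scores len(x) + 1."""
    return len(x) + 1 if not any(x) else sum(x)

def k_deceptive_one_max(x, k):
    """Sum of DeceptiveOneMax over successive blocks of k coordinates."""
    return sum(deceptive_one_max(x[i:i + k]) for i in range(0, len(x), k))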


SSGA

SIMPLIFIED-SIMPLE-GENETIC-ALGORITHM(F, N, M)
    P ← RANDOM-POPULATION(N);
    POPULATION-EVALUATION(P, F);
    while not TERMINATION-CONDITION(P) do
        BLOCK-SELECTION(P, M);
        UNIFORM-CROSSOVER(P);
        POPULATION-EVALUATION(P, F);

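The slide does not spell out BLOCK-SELECTION; the sketch below assumes it keeps the M best individuals and refills the population with copies of them, and that UNIFORM-CROSSOVER pairs consecutive individuals and swaps each gene with probability 1/2 (both readings are assumptions).

import random

def block_selection(population, fitness, M):
    """Keep the M best individuals and refill the population with random
    copies of them (one possible reading of BLOCK-SELECTION)."""
    ranked = [x for _, x in sorted(zip(fitness, population), reverse=True)]
    best = ranked[:M]
    return [list(random.choice(best)) for _ in range(len(population))]

def uniform_crossover(population):
    """Pair consecutive individuals and swap each gene with probability 1/2."""
    for a, b in zip(population[::2], population[1::2]):
        for i in range(len(a)):
            if random.random() < 0.5:
                a[i], b[i] = b[i], a[i]
    return population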


SGA

SIMPLE-GENETIC-ALGORITHM(F, N, M, θ_C, θ_M)
    P ← RANDOM-POPULATION(N);
    POPULATION-EVALUATION(P, F);
    while not TERMINATION-CONDITION(P) do
        P^(P) ← PARENT-SELECTION(P, M);
        P^(C) ← CROSSOVER(P^(P), θ_C);
        MUTATION(P^(C), θ_M);
        REPLACEMENT(P, P^(C));
        POPULATION-EVALUATION(P, F);

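A sketch of one SGA generation; roulette-wheel parent selection, one-point crossover applied with probability θ_C, per-gene bit-flip mutation with probability θ_M and full generational replacement are typical choices, assumed here rather than prescribed by the slide.

import random

def sga_generation(population, fitness, theta_c, theta_m):
    """One generation of a simple GA on binary chromosomes
    (even population size and positive total fitness assumed)."""
    # PARENT-SELECTION: roulette wheel over the fitness values
    parents = random.choices(population, weights=fitness, k=len(population))
    # CROSSOVER: one-point, applied to each consecutive pair with probability theta_c
    children = []
    for a, b in zip(parents[::2], parents[1::2]):
        a, b = list(a), list(b)
        if random.random() < theta_c:
            cut = random.randrange(1, len(a))
            a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
        children += [a, b]
    # MUTATION: flip each gene independently with probability theta_m
    for child in children:
        for i in range(len(child)):
            if random.random() < theta_m:
                child[i] = 1 - child[i]
    # REPLACEMENT: generational, the children replace the whole population
    return children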


Selection and Reproduction

Selection is sometimes not considered an evolutionary operator, although it has a strong influence on the algorithm.

Selection may be applied BEFORE as well as AFTER the other evolutionary operators.

When selection is applied BEFORE the other evolutionary operators, the process of creating the new population is referred to as REPRODUCTION.

Generational EA vs. steady-state EA.



Selection and Reproduction

Example of a steady-state EA (sketched in Python below):
- select two parent individuals,
- produce two offspring individuals,
- replace the two worst individuals in the population with the two offspring individuals,
- repeat the above process until the termination criterion is met.

REMARK: each iteration involves only two individuals, not the entire population.

PROBLEM: how should a steady-state EA be compared with a generational EA (in terms of computational complexity or running time)?

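One steady-state iteration, sketched under the same assumptions (roulette-wheel choice of the two parents, one-point crossover); only the two worst individuals and their fitness values are touched.

import random

def steady_state_step(population, fitness, F):
    """Select two parents, produce two offspring, replace the two worst."""
    p1, p2 = random.choices(population, weights=fitness, k=2)
    cut = random.randrange(1, len(p1))
    offspring = [p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]]
    # indices of the two worst individuals in the current population
    worst = sorted(range(len(population)), key=lambda i: fitness[i])[:2]
    for idx, child in zip(worst, offspring):
        population[idx] = child
        fitness[idx] = F(child)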


Selection and Reproduction

Some individuals may sometimes be transferred to the next population without applying any evolutionary operators.

The generation gap is the fraction of the population replaced in each iteration (e.g. 1.0 when the entire population is replaced).

The best individuals may sometimes be automatically included in the next population (so-called elitism).



Fitness Function

Let P = {x_1, x_2, ..., x_N} be a population. One may assign to each individual x_i, i = 1, 2, ..., N, the value

f(x_i) = \frac{F(x_i) - F_{\min}}{\sum_{j=1}^{N} (F(x_j) - F_{\min})},

where F_{\min} = \min\{F(x_1), F(x_2), ..., F(x_N)\}, called the fitness value of the individual x_i in the population P.

The function f is called the fitness function.

It is easy to see that 0 ≤ f(x_i) ≤ 1 and \sum_{j=1}^{N} f(x_j) = 1.

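The normalisation translates directly into Python; the sketch assumes that at least one individual satisfies F(x_j) > F_min, otherwise the denominator would be zero.

def fitness_values(population, F):
    """Normalised fitness: proportional to F(x_i) - F_min and summing to 1."""
    objective = [F(x) for x in population]
    f_min = min(objective)
    total = sum(v - f_min for v in objective)      # assumed to be positive
    return [(v - f_min) / total for v in objective]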


Selection According to the Fitness Function

Sometimes referred to as the roulette wheel method.

For each individual, the fitness value is evaluated and defines the probability of selecting that individual.

PROBLEM: the domination of a few super-individuals in the first iterations and weak convergence in the later iterations.

In order to avoid this problem, fitness function scaling may be introduced.

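A direct implementation of the wheel; the weights are normalised by their total, so they do not need to sum to 1 beforehand.

import random

def roulette_wheel(population, fitness, k):
    """Draw k individuals; the wheel slice of x_i has width f(x_i)."""
    selected = []
    for _ in range(k):
        r, acc = random.random() * sum(fitness), 0.0
        for x, f in zip(population, fitness):
            acc += f
            if r <= acc:
                selected.append(x)
                break
    return selected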


Fitness Function Scaling


Simple scaling The fitness value of the i-th individual is given by

f_{scaled}(t) = f_{original}(t) - f_{worst}(t),

where t is the iteration number and f_{worst}(t) is the worst objective value found so far.

Sigma scaling The fitness value of the i-th individual is given by

f_{scaled}(t) = f_{original}(t) - (\bar{f}(t) - c\,\sigma_f(t))

if this value is positive, and 0 otherwise, where c is a constant (e.g. 2), \bar{f}(t) is the average fitness in the current population and \sigma_f(t) is its standard deviation.

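Sigma scaling in Python, with the constant c as a parameter (c = 2 as suggested above).

from statistics import mean, pstdev

def sigma_scaling(fitness, c=2.0):
    """Rescale fitness to f - (mean - c * sigma), truncated at 0."""
    f_bar, sigma = mean(fitness), pstdev(fitness)
    return [max(0.0, f - (f_bar - c * sigma)) for f in fitness]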


Fitness Function Scaling

Power scaling The fitness value of the i-th individual is given by

f_{scaled}(t) = (f_{original}(t))^k,

for some k > 0.

Exponential scaling The fitness value of the i-th individual is given by

f_{scaled}(t) = \exp(f_{original}(t) / T),

where T > 0, called the temperature, decreases towards 0 in successive iterations.

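Exponential scaling with a decreasing temperature; the geometric cooling schedule below is an illustrative assumption, the slide only requires T to decrease towards 0.

import math

def exponential_scaling(fitness, t, T0=10.0, cooling=0.95):
    """Boltzmann-style scaling exp(f / T), with T shrinking over the iterations t."""
    T = T0 * cooling ** t          # temperature descending towards 0
    return [math.exp(f / T) for f in fitness]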


Ranking methods

Linear ranking Assuming that the population size is µ, the worst individual has rank 0 and the best one has rank µ − 1. The probability of selecting the i-th individual is

P_{linear}(i) = \frac{\alpha + (rank(i)/(\mu - 1))(\beta - \alpha)}{\mu},

where α and β are parameters denoting the expected number of offspring individuals produced by the worst and the best individual, respectively. It is easy to see that \sum_{i=0}^{\mu - 1} P_{linear}(i) = 1, so α + β = 2, hence 1 ≤ β ≤ 2 and α = 2 − β.

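Linear ranking probabilities in Python, with β in [1, 2] and α = 2 − β as derived above.

def linear_ranking_probabilities(fitness, beta=1.5):
    """P(i) = (alpha + rank(i)/(mu - 1) * (beta - alpha)) / mu, worst rank = 0."""
    mu, alpha = len(fitness), 2.0 - beta
    order = sorted(range(mu), key=lambda i: fitness[i])      # worst ... best
    rank = {i: r for r, i in enumerate(order)}
    return [(alpha + rank[i] / (mu - 1) * (beta - alpha)) / mu for i in range(mu)]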


Ranking methods

Power ranking C is a normalization coefficient, 0 < α < β:

P_{power}(i) = \frac{\alpha + (rank(i)/(\mu - 1))^k (\beta - \alpha)}{C}.



Recombination for continuous representations

No recombination – the entire chromosome is copied from one selected parent.

Uniform Crossover

x_i = b_{k_i}

for a randomly chosen k_i ∈ {1, 2, ..., ρ}, u ∈ (0, 1).

Discrete Recombination – classic crossover (as in the SGA) for vectors of numbers.

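A sketch of discrete recombination over ρ real-valued parents, where each offspring gene is copied from an independently chosen parent.

import random

def discrete_recombination(parents):
    """Offspring gene x_i = b_{k_i, i} for an independently chosen parent k_i."""
    n = len(parents[0])
    return [parents[random.randrange(len(parents))][i] for i in range(n)]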


Recombination for continuous representations

Global Intermediary Recombination – each offspring gene is the average of all the parent genes:

x_i = \frac{1}{\rho} \sum_{k=1}^{\rho} b_{ki}

Local Intermediary Recombination – each offspring gene is a weighted average of the genes of two randomly chosen parents:

x_i = u_i b_{k_1 i} + (1 - u_i) b_{k_2 i},

where k_1, k_2 ∈ {1, 2, ..., ρ} and u_i ∈ (0, 1) are randomly chosen.

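Both variants in Python; in the local variant the two parents and the weight u_i are redrawn independently for every gene, which is one possible reading of the formula above.

import random

def global_intermediary(parents):
    """Each offspring gene is the average of that gene over all rho parents."""
    rho = len(parents)
    return [sum(p[i] for p in parents) / rho for i in range(len(parents[0]))]

def local_intermediary(parents):
    """x_i = u_i * b_{k1, i} + (1 - u_i) * b_{k2, i} for random parents k1, k2."""
    child = []
    for i in range(len(parents[0])):
        k1, k2 = random.sample(range(len(parents)), 2)
        u = random.random()
        child.append(u * parents[k1][i] + (1 - u) * parents[k2][i])
    return child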


Recombination for discrete representations

Multi-Point Recombination

Global Discrete Recombination – similar to uniform crossover for the binary representation.



Intermediary Recombination

With two parents x_1, x_2:

x'_i = \alpha x_{1i} + (1 - \alpha) x_{2i}, \quad \alpha \in [0, 1]

With many parents x_1, x_2, ..., x_n:

x'_i = \alpha_1 x_{1i} + \alpha_2 x_{2i} + ... + \alpha_n x_{ni},

where \sum_i \alpha_i = 1.

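A sketch of the multi-parent case; the slide leaves open how the α_i are chosen, so drawing them at random and normalising them to sum to 1 is an assumption.

import random

def intermediary_recombination(parents):
    """x'_i = sum_k alpha_k * x_{k, i} with the alpha_k summing to 1."""
    weights = [random.random() for _ in parents]
    total = sum(weights)
    alphas = [w / total for w in weights]
    return [sum(a * p[i] for a, p in zip(alphas, parents))
            for i in range(len(parents[0]))]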


Other recombinations

Heuristic recombination Let x_2 be not worse than x_1. Then

x' = u (x_2 - x_1) + x_2,

where u ∈ [0, 1] is a uniformly distributed random number.

Simplex Recombination First, randomly select a group of at least 2 parents. Let x_1 be the best and x_2 the worst parent of the group. Next, compute the center c of the group of parents and set

x' = c + (c - x_2).

Geometric Recombination (it may also be generalized to many parents):

x' = (\sqrt{x_{11} x_{21}}, \sqrt{x_{12} x_{22}}, ...)

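Sketches of the three operators for real-valued chromosomes; simplex_recombination receives the worst parent of the group explicitly, and geometric recombination assumes non-negative genes.

import math
import random

def heuristic_recombination(x1, x2):
    """x' = u * (x2 - x1) + x2, where x2 is not worse than x1."""
    u = random.random()
    return [u * (b - a) + b for a, b in zip(x1, x2)]

def simplex_recombination(parents, worst):
    """x' = c + (c - x_worst), where c is the center of the parent group."""
    n = len(parents[0])
    c = [sum(p[i] for p in parents) / len(parents) for i in range(n)]
    return [c[i] + (c[i] - worst[i]) for i in range(n)]

def geometric_recombination(x1, x2):
    """x' = (sqrt(x_11 * x_21), sqrt(x_12 * x_22), ...)."""
    return [math.sqrt(a * b) for a, b in zip(x1, x2)]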


Quadratic Recombination

Let x_ij be the j-th gene of the chromosome x_i, for i = 1, 2, 3 and j = 1, 2, ..., n, where n is the chromosome length. Then

x_{4,j} = \frac{1}{2} \cdot \frac{(x_{2j}^2 - x_{3j}^2) f(x_1) + (x_{3j}^2 - x_{1j}^2) f(x_2) + (x_{1j}^2 - x_{2j}^2) f(x_3)}{(x_{2j} - x_{3j}) f(x_1) + (x_{3j} - x_{1j}) f(x_2) + (x_{1j} - x_{2j}) f(x_3)}

is the j-th gene of the offspring individual x_4 produced from the 3 parent individuals x_1, x_2, x_3; coordinate-wise, it is the vertex of the parabola interpolating the three parents' values.

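The formula computed gene by gene; the sketch assumes the denominator is non-zero in every coordinate.

def quadratic_recombination(x1, x2, x3, f1, f2, f3):
    """Per-gene vertex of the parabola through (x_ij, f(x_i)), i = 1, 2, 3."""
    child = []
    for a, b, c in zip(x1, x2, x3):
        num = (b * b - c * c) * f1 + (c * c - a * a) * f2 + (a * a - b * b) * f3
        den = (b - c) * f1 + (c - a) * f2 + (a - b) * f3
        child.append(0.5 * num / den)        # assumes den != 0
    return child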


Hybrid EA with Local Search

Generate the initial population of µ individuals at random.
Perform local search on each individual.
REPEAT
    Generate three points P_1, P_2, P_3 using Global Discrete Recombination.
    Use Quadratic Recombination on P_1, P_2, P_3 to produce P_4.
    Perform local search on P_4.
    Add P_1, P_2, P_3, P_4 to the current population and delete 4 selected
    individuals from the current population (e.g. by deterministic selection).
UNTIL the termination condition is met

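A sketch of one REPEAT iteration, reusing the per-gene formulas above; the local_search argument is a hypothetical improvement routine, "deterministic selection" is read as deleting the 4 worst individuals, and a non-zero denominator in the quadratic step as well as maximisation are assumed.

import random

def hybrid_ea_step(population, F, local_search):
    """One iteration: recombine, improve locally, replace 4 individuals."""
    n = len(population[0])
    def global_discrete():
        # each gene copied from a randomly chosen member of the population
        return [population[random.randrange(len(population))][j] for j in range(n)]
    p1, p2, p3 = global_discrete(), global_discrete(), global_discrete()
    f1, f2, f3 = F(p1), F(p2), F(p3)
    # Quadratic Recombination: per-gene parabola vertex of the three parents
    p4 = [0.5 * ((b * b - c * c) * f1 + (c * c - a * a) * f2 + (a * a - b * b) * f3)
          / ((b - c) * f1 + (c - a) * f2 + (a - b) * f3)
          for a, b, c in zip(p1, p2, p3)]
    p4 = local_search(p4, F)
    # add the four new points, then drop the four worst individuals
    population += [p1, p2, p3, p4]
    population.sort(key=F, reverse=True)
    del population[-4:]
    return population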


Local Search with Random Memorising

Save the best solutions in the memory.

Whenever a new solution x_new outperforming the current population is found, take a randomly chosen saved solution x_old.

Perform a search in the direction old → new, i.e. in the direction

\frac{x_{new} - x_{old}}{\|x_{new} - x_{old}\|}

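The normalised search direction in Python (assuming x_new is different from x_old).

import math

def search_direction(x_old, x_new):
    """Unit vector pointing from x_old towards x_new."""
    diff = [b - a for a, b in zip(x_old, x_new)]
    norm = math.sqrt(sum(d * d for d in diff))
    return [d / norm for d in diff]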


Experiments

18 benchmark functions

Population of 30 individuals

50 trials for each benchmark

