
Page 1: Evolutionary Computational Intelligence

Evolutionary Computational Intelligence

Lecture 6b: Towards Parameter Control

Ferrante Neri

University of Jyväskylä

Page 2: Evolutionary Computational Intelligence

Motivation 1

An EA has many strategy parameters, e.g.
– mutation operator and mutation rate
– crossover operator and crossover rate
– selection mechanism and selective pressure (e.g. tournament size)
– population size

Good parameter values facilitate good performance

Q1: How to find good parameter values?

Page 3: Evolutionary Computational Intelligence

Motivation 2

EA parameters are rigid (constant during a run)

BUT

an EA is a dynamic, adaptive process

THUS

optimal parameter values may vary during a run

Q2: How to vary parameter values?

Page 4: Evolutionary Computational Intelligence

Parameter tuning

Parameter tuning: the traditional way of testing and comparing different values before the "real" run.

Problems:
– user mistakes in the settings can be sources of errors or sub-optimal performance
– it costs much time
– parameters interact: exhaustive search is not practicable
– good values may become bad during the run (e.g. population size)
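To see why exhaustive tuning quickly becomes impracticable, here is a minimal sketch (not from the lecture; all parameter ranges and the number of repeats are arbitrary assumptions) that simply counts the runs a coarse grid search would require:

from itertools import product

mutation_rates   = [0.001, 0.01, 0.05, 0.1, 0.2]
crossover_rates  = [0.5, 0.6, 0.7, 0.8, 0.9]
tournament_sizes = [2, 3, 5, 7]
population_sizes = [20, 50, 100, 200]

settings = list(product(mutation_rates, crossover_rates,
                        tournament_sizes, population_sizes))
runs_per_setting = 30                     # repeats for statistical reliability
print(len(settings))                      # 5 * 5 * 4 * 4 = 400 settings
print(len(settings) * runs_per_setting)   # 12000 EA runs before the "real" run

Even this coarse grid already demands thousands of runs before the "real" one, and the interactions between parameters are still sampled only sparsely.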

Page 5: Evolutionary Computational Intelligence

Parameter Setting: Problems

A wrong parameter setting can lead to undesirable algorithmic behaviour, such as stagnation or premature convergence.

– Too large a population size leads to stagnation
– Too small a population size leads to premature convergence
– In some "moments" of the evolution I would like to have a large population size (when I need to explore and prevent premature convergence); in other "moments" I would like to have a small one (when I need to exploit the available genotypes)

Page 6: Evolutionary Computational Intelligence

Parameter control

Parameter control: setting values on-line, during the actual run. The algorithm should "decide" by itself how to properly vary the parameter setting over the run.

Some popular options for pursuing this aim are:
– a predetermined time-varying schedule p = p(t)
– using feedback from the search process
– encoding parameters in chromosomes and relying on natural selection (similar to ES self-adaptation)

Page 7: Evolutionary Computational Intelligence

Related Problems

Problems:
– finding an optimal p is hard; finding an optimal p(t) is harder still
– with a user-defined feedback mechanism, how do we "optimize" it?
– when would natural selection work for strategy parameters?

Provisional answer: in agreement with the No Free Lunch Theorem, an optimal control strategy does not exist. Nevertheless, there are plenty of interesting proposals that can perform very well on some problems. Some of these strategies are very problem-oriented, while others are much more robust and thus applicable to a fairly wide spectrum of optimization problems.

Page 8: Evolutionary Computational Intelligence

How

Three major types of parameter control:

deterministic: some rule modifies the strategy parameter without feedback from the search (e.g. based on a generation counter)

adaptive: a feedback rule based on some measure monitoring the search progress

self-adaptive: parameter values evolve along with the solutions; encoded in the chromosomes, they undergo variation and selection

Page 9: Evolutionary Computational Intelligence

Global taxonomy

PARAMETER SETTING
– PARAMETER TUNING (before the run)
– PARAMETER CONTROL (during the run)
   – DETERMINISTIC (time dependent)
   – ADAPTIVE (feedback from search)
   – SELF-ADAPTIVE (coded in chromosomes)

Page 10: Evolutionary Computational Intelligence

Evidence informing the change

The parameter changes may be based on:
– time or number of evaluations (deterministic control)
– population statistics (adaptive control): progress made, population diversity, gene distribution, etc.
– relative fitness of individuals created with given values (adaptive or self-adaptive control)
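As a rough illustration, the feedback signals listed above could be computed as in the following sketch (the function name, the binary encoding and the minimization setting are assumptions, not part of the lecture):

import statistics

def population_statistics(population, fitnesses):
    """Feedback signals for adaptive control; assumes a binary-coded
    population and a minimization problem (illustrative only)."""
    best   = min(fitnesses)                       # progress made so far
    avg    = statistics.mean(fitnesses)
    spread = statistics.pstdev(fitnesses)         # fitness diversity
    n = len(population)
    # gene distribution: frequency of 1s at each locus
    gene_freq = [sum(ind[i] for ind in population) / n
                 for i in range(len(population[0]))]
    return {"best": best, "avg": avg, "spread": spread, "gene_freq": gene_freq}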

Page 11: Evolutionary Computational Intelligence

Evidence informing the change

Absolute evidence: a predefined event triggers the change, e.g. increase p_m by 10% if the population diversity falls under a threshold x. The direction and magnitude of the change are fixed.

Relative evidence: compare parameter values through the solutions created with them, e.g. increase p_m if the top-quality offspring were created with high mutation rates. The direction and magnitude of the change are not fixed.
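A minimal sketch of the absolute-evidence rule quoted above, assuming some externally computed diversity measure and an illustrative cap on p_m:

def update_mutation_rate(p_m, diversity, threshold=0.1, p_m_max=0.5):
    """Absolute evidence: a predefined event triggers a change of fixed
    direction and magnitude. If population diversity falls under the
    threshold, increase p_m by 10% (capped at p_m_max)."""
    if diversity < threshold:
        p_m = min(p_m * 1.10, p_m_max)
    return p_m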

Page 12: Evolutionary Computational Intelligence

Deterministic Control

It is based on an a priori designed scheme that takes time as its only variable.

The general idea is that the algorithm has different needs at the beginning of the process than at the end of it.

The main disadvantage of such an approach is that the algorithm designer must know beforehand when the changes in the parameter setting should be carried out.

Page 13: Evolutionary Computational Intelligence

Deterministic Control

1st example: the population is enlarged at the end of the optimization process (after a prearranged number of generations)

2nd example (Arabas 1994): assign to each solution a lifetime parameter; fitter individuals survive longer. The population size thus varies over the generations.
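As an illustration of a deterministic, purely time-dependent rule p = p(t), the following sketch decays the mutation rate over the run; the exponential form and the constants are assumptions chosen for the example, not the schedules used in the lecture:

def mutation_rate_schedule(t, T, p_start=0.2, p_end=0.01):
    """Deterministic control: p_m depends only on the generation counter t.
    Decays exponentially from p_start (generation 0) to p_end (generation T)."""
    return p_start * (p_end / p_start) ** (t / T)

# high mutation early (exploration), low mutation late (exploitation)
for t in (0, 25, 50, 75, 100):
    print(t, round(mutation_rate_schedule(t, T=100), 4))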

Page 14: Evolutionary Computational Intelligence

Adaptive Control

An index is measured on-line and used to set the parameters for the subsequent generations.

Fitness improvements: if the algorithmic response in terms of improvements is satisfactory, the parameter is left unchanged. When the parameter setting proves inefficient, it is modified (e.g. the mutation rate is increased when no improvements are found).
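A minimal sketch of the improvement-driven rule described above; the patience counter and the 1.5 increase factor are illustrative assumptions:

def adapt_mutation_rate(p_m, improved, stagnation, patience=10, p_m_max=0.5):
    """Adaptive control via fitness improvements: keep p_m while the search
    keeps improving, raise it after `patience` generations without progress."""
    if improved:
        stagnation = 0                       # current setting judged efficient
    else:
        stagnation += 1
        if stagnation >= patience:
            p_m = min(p_m * 1.5, p_m_max)    # setting judged inefficient
            stagnation = 0
    return p_m, stagnation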

Page 15: Evolutionary Computational Intelligence

Adaptive Control

Fitness diversity: in high diversity conditions the algorithm needs to exploit available genotypes, in low diversity conditions the algorithm needs to detect new genotypes and search directions

A measurement of the diversity can be employed to enlarge the population size in low diversity conditions and shrink it in high diversity conditions (analogously for the mutation rate).

Page 16: Evolutionary Computational Intelligence

Adaptive Control

1st example:

φ = min{ 1, |f_best − f_avg| / |f_best| }

S_pop = S_pop^f + (1 − φ) · S_pop^v

p_m = p_m^max · (1 − φ)

Page 17: Evolutionary Computational Intelligence

Adaptive Control

2nd example:

ψ = 1 − |f_avg − f_best| / |f_worst − f_best|

S_pop = S_pop^f + (1 − ψ) · S_pop^v

p_m = p_m^max · (1 − ψ)

Page 18: Evolutionary Computational Intelligence

Adaptive Control

3rd example:

φ = min{ 1, σ_f / |f_avg| }

S_pop = S_pop^f + (1 − φ) · S_pop^v

p_m = p_m^max · (1 − φ)
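Putting the three examples together, the following sketch computes the diversity indices as reconstructed above and applies the update rules, reading S_pop^f and S_pop^v as the fixed and variable parts of the population size and p_m^max as the maximum mutation rate (these readings, the variable names and the concrete numbers are assumptions, not taken from the slides):

import statistics

# Fitness diversity indices from the three examples above
# (assume f_best != 0 and f_worst != f_best to avoid division by zero).
def diversity_1(f_best, f_avg):                      # 1st example
    return min(1.0, abs(f_best - f_avg) / abs(f_best))

def diversity_2(f_best, f_avg, f_worst):             # 2nd example
    return 1.0 - abs(f_avg - f_best) / abs(f_worst - f_best)

def diversity_3(fitnesses):                          # 3rd example
    return min(1.0, statistics.pstdev(fitnesses) / abs(statistics.mean(fitnesses)))

def adapt(phi, S_pop_f=20, S_pop_v=80, p_m_max=0.3):
    """Combine a fixed part of the population size with a variable part
    scaled by (1 - phi), and scale the mutation rate analogously."""
    S_pop = S_pop_f + round((1.0 - phi) * S_pop_v)
    p_m = p_m_max * (1.0 - phi)
    return S_pop, p_m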

Page 19: Evolutionary Computational Intelligence

Self-adaptive control

The control parameter is encoded within the genotype of the solution, based on the idea that it should evolve together with the population:

⟨ x_1, x_2, …, x_n, ctrl_par ⟩
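A minimal sketch of this idea in the ES style mentioned earlier: the mutation step size σ plays the role of ctrl_par, is appended to the chromosome, and is itself mutated (lognormal rule) before being used to perturb the object variables; the update rule and the constant τ are standard ES practice assumed for illustration:

import math
import random

def self_adaptive_mutation(individual, tau=None):
    """Chromosome layout <x_1, ..., x_n, sigma>: the last gene is the control
    parameter. Mutate sigma first (lognormal rule), then use the new sigma to
    perturb the object variables, so good sigmas survive with good solutions."""
    *x, sigma = individual
    if tau is None:
        tau = 1.0 / math.sqrt(len(x))
    sigma = sigma * math.exp(tau * random.gauss(0.0, 1.0))
    x = [xi + sigma * random.gauss(0.0, 1.0) for xi in x]
    return x + [sigma]

# example: a 5-dimensional solution with its own mutation step size appended
offspring = self_adaptive_mutation([0.3, -1.2, 0.7, 2.0, 0.0, 0.1])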

Page 20: Evolutionary Computational Intelligence

Self-adaptive control

If, for example, the control parameter measures the improvements due to mutation, it can be used to drive the mutation rate.

Another popular option is to employ such a parameter to decide how many copies of an individual are placed in the mating pool (if the solution looks very promising), in order to obtain a tailored selection pressure.

Page 21: Evolutionary Computational Intelligence

Evaluation / Summary

Parameter control offers the possibility to use appropriate values in various stages of the search

Adaptive and self-adaptive parameter control
– offer users "liberation" from parameter tuning
– delegate the parameter setting task to the evolutionary process
– the latter implies a double task for the EA: problem solving + self-calibrating (overhead)

Robustness, i.e. insensitivity of the EA to variations, is assumed
– if the number of parameters is increased by using (self-)adaptation
– for the "meta-parameters" introduced in the methods