Genetic Algorithms: A Search & Optimization Tool
Genetic Algorithm
• Search and Optimization Algorithm.
• Based on principles of Natural Selection and Genetics.
• Proposed by John Holland of the University of Michigan in 1975 to study the phenomenon of adaptation as it occurs in nature.
Problems with Traditional Methods
• The search space is often complicated, and one does not know where to look for the solution or where to start. This is where a GA can help.
• Traditional methods often require domain knowledge of the problem, which may not be readily available.
• Many traditional methods are sensitive to the initial guess; given an inappropriate guess, the method may not converge to the solution.
Natural Selection
• A process called natural selection ‘selects’ the individuals best adapted to the environment.
• The fittest survive longest.
• Characteristics, encoded in genes, are transmitted to offspring and tend to propagate into new generations.
• In sexual reproduction, the chromosomes of offspring are a mix of their parents’.
• An offspring’s characteristics are partly inherited from its parents and partly the result of new gene combinations created during reproduction.
Terminology
Chromosome: a candidate solution in the population, often encoded as a bit string.
Genes: single bits or short blocks of adjacent bits that encode a particular element of the candidate solution.
Alleles: the possible values of a gene; in a bit string, the 0s and 1s.
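To make the terminology concrete, here is a minimal sketch (not from the original slides) of the 5-bit encoding used in the worked example later in this deck:

```python
# A 5-bit chromosome encoding an integer in [0, 31].
# Each bit position is a gene; the value at that position (0 or 1) is its allele.
chromosome = "11000"

def decode(bits: str) -> int:
    """Interpret a bit-string chromosome as an unsigned integer."""
    return int(bits, 2)

print(decode(chromosome))  # 24
```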
Nature to Computer Mapping
Individual → Solution to a problem
Population → Set of solutions
Fitness → Quality of a solution
Chromosome → Encoding for a solution
Gene → Part of the encoding of a solution
Crossover and Mutation → Search operators
Natural Selection → Reuse of good (sub-)solutions
Elements of a Genetic Algorithm
• A population of chromosomes.
• A fitness function.
• Genetic operators:
- Selection
- Crossover
- Mutation
Genetic Operators
Selection: selects chromosomes from the population for reproduction. The fitter the chromosome, the more often it is likely to be selected to reproduce.
Crossover: randomly chooses a locus and exchanges the subsequences before and after that locus between two chromosomes to create two offspring.
Mutation: randomly flips some of the bits in a chromosome.
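As an illustration, the three operators can be sketched in Python for bit-string chromosomes (a minimal sketch; the function names and the fitness-proportionate selection scheme are assumptions, not part of the slides):

```python
import random

def roulette_select(population, fitness_values):
    """Fitness-proportionate (roulette-wheel) selection of one chromosome."""
    pick = random.uniform(0, sum(fitness_values))
    running = 0.0
    for chrom, fit in zip(population, fitness_values):
        running += fit
        if running >= pick:
            return chrom
    return population[-1]

def crossover(parent1, parent2):
    """Single-point crossover: swap the tails after a random locus."""
    point = random.randint(1, len(parent1) - 1)
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])

def mutate(chrom, p_m=0.001):
    """Flip each bit independently with probability p_m."""
    return "".join(bit if random.random() >= p_m else str(1 - int(bit))
                   for bit in chrom)
```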
Basic Algorithm
• Initialise and evaluate a population
• While (termination condition not met) do
  – Select a sub-population based on fitness
  – Produce offspring of the population using crossover
  – Mutate offspring stochastically
  – Select survivors based on fitness
A Sample Example
Problem:
Find the value of x that maximises the function f(x) = x² over the integer range x ∈ [0, 31].
Solution:
• Choose a population of 4 individuals (small by GA standards), with values chosen at random.
Sample population: 01101,11000,00100,10011
• Fitness values: 169, 576, 64, 361 (Average 293)
• Reproduction (e.g. using a roulette wheel): 14.4%, 49.2%, 5.5%, 30.9%
• Selection for crossover: 01101 and 11000; crossover after bit 4 (0110|0 and 1100|1) gives 01100 and 11001
• Another selection: 11000 and 10011; crossover after bit 2 (11|000 and 10|011) gives 11011 and 10000
  – In both cases the bit position is chosen at random
A Sample Example Contd.(1)
• New population after the first generation: 01100, 11001, 11011, 10000
  – Fitness values: 144, 625, 729, 256 (average 439)
  – Previous generation: best 576, average 293
• The next generation will start with this population
A Sample Example Contd…(3)
• Mutation: flip each bit with a small probability, e.g. p_m = 0.001
  – Expected number of mutations per individual: 0.005 (no. of bits × p_m)
  – Expected number of mutations in the whole population: 0.02 (4 individuals × 0.005)
• Eventually the run converges, with best individual 11111 and a fitness of 961
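The expected-mutation arithmetic above can be checked directly (a small sketch; “expected number of mutations” here means the expected count of flipped bits):

```python
# p_m = 0.001, 5-bit chromosomes, population of 4 individuals.
m_bits, n_pop, p_m = 5, 4, 0.001

per_individual = m_bits * p_m          # expected bit-flips per individual
per_population = n_pop * m_bits * p_m  # expected bit-flips per generation

print(per_individual, per_population)  # ~0.005 and ~0.02
```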
Algorithm (Revisited...)
1. Start with a randomly generated population of n chromosomes, each m bits long.
2. Calculate the fitness f(x) of each chromosome x in the population.
3. Repeat the following steps until n offspring have been created:
   a) Select a pair of parent chromosomes from the current population.
   b) With probability Pc, cross over the pair at a randomly chosen point to form two offspring.
   c) Mutate the two offspring at each locus with probability Pm, and place the resulting chromosomes in the new population.
4. Replace the current population with the n most fit chromosomes.
5. Go to step 2.
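Steps 1–5 can be sketched as a complete program for the x² example (a minimal sketch; the parameter defaults and the elitist reading of step 4 are assumptions, not prescribed by the slides):

```python
import random

def fitness(chrom: str) -> int:
    """f(x) = x^2, with x decoded from the bit string."""
    return int(chrom, 2) ** 2

def select(pop):
    """Step 3a: fitness-proportionate selection of one parent."""
    r = random.uniform(0, sum(fitness(c) for c in pop))
    running = 0
    for c in pop:
        running += fitness(c)
        if running >= r:
            return c
    return pop[-1]

def run_ga(n=4, m=5, p_c=0.7, p_m=0.001, generations=100):
    # Step 1: random initial population of n chromosomes, m bits each.
    pop = ["".join(random.choice("01") for _ in range(m)) for _ in range(n)]
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < n:  # step 3: create n offspring
            p1, p2 = select(pop), select(pop)
            if random.random() < p_c:  # step 3b: single-point crossover
                pt = random.randint(1, m - 1)
                p1, p2 = p1[:pt] + p2[pt:], p2[:pt] + p1[pt:]
            for child in (p1, p2):  # step 3c: per-locus mutation
                new_pop.append("".join(
                    bit if random.random() >= p_m else str(1 - int(bit))
                    for bit in child))
        # Step 4: keep the n fittest of the old and new chromosomes.
        pop = sorted(pop + new_pop, key=fitness, reverse=True)[:n]
    return max(pop, key=fitness)

best = run_ga()
print(best, fitness(best))
```

With these settings the run almost always ends at or near 11111 (fitness 961), matching the convergence described in the example.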
Algorithm contd..
Each iteration of this process is called a generation; a typical run consists of 50 to 500 generations.
The entire set of generations is called a run. At the end of a run there are often one or more highly fit chromosomes in the population.
This simple procedure forms the basis for most applications of GAs.
The success of the algorithm depends on details such as the size of the population and the probabilities of crossover and mutation.
Search Space
• The set of all possible individuals (solutions) defines the search space.
• One measure of the complexity of the problem is the size of the search space.
• Crossover and mutation implement a pseudo-random walk through the search space.
• The walk is random because crossover and mutation are non-deterministic.
• The walk is directed in the sense that the algorithm aims to maximise solution quality, as measured by the fitness function.
Theoretical Foundations
• John Holland’s schema theorem
A schema is a similarity template describing a subset of strings with similarities at certain positions.
• Building Block Hypothesis
Schemata with high fitness and short defining length are called building blocks. Building blocks combine to form bigger and better building blocks, and eventually the optimal solution(s).
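For example, the schema 1***1 (order 2, defining length 4) describes all 5-bit strings that begin and end with 1; a membership check is a one-liner in Python (an illustrative sketch, not from the slides):

```python
def matches(schema: str, chrom: str) -> bool:
    """True if chrom is an instance of schema ('*' = don't care)."""
    return all(s in ("*", c) for s, c in zip(schema, chrom))

print(matches("1***1", "10011"))  # True: first and last bits are 1
print(matches("1***1", "11000"))  # False: last bit is 0
```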
GAs Vs. other Search & Optimization Methods
• GAs work with a population of candidate solutions and not a single point.
• GAs work with coding of parameters instead of parameters themselves.
• GAs do not require any domain knowledge (gradient information, etc.) and use only the payoff (fitness) information.
• GAs are stochastic methods, i.e., use probabilistic transition rules and not deterministic ones.
• GAs apply to a wide variety of problems rather than working only in a restricted domain.
GAs Vs. other Search & Optimization Methods Contd.
• Multiple solutions can be obtained without extra effort.
• GAs are implicitly parallel and can be implemented on parallel machines.
• GAs are quite successful in locating the regions containing optimal solution(s), if not the optimum solution itself.
• GAs can solve problems involving a large time domain.
Related Fields
• Evolutionary Strategies
• Genetic Programming
• Genetic Engineering
Some Applications of GA
• Optimization
• Automatic Programming
• Machine Learning
• Economics
• Immune systems
• Ecology
• Population genetics
• Evolution and learning
• Social systems
• Bioinformatics
• Neural Networks & Fuzzy Logic
References: Books
• M. Mitchell, An Introduction to Genetic Algorithms, MIT Press, 1996.
• D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, 1989.
• D. A. Coley, An Introduction to Genetic Algorithms for Scientists and Engineers, World Scientific, 1999.
• Kalyanmoy Deb, Optimization for Engineering Design: Algorithms and Examples, Prentice-Hall of India.