Evolutionary Computational Intelligence


Post on 30-Dec-2015








<ul><li><p>Evolutionary Computational Intelligence. Lecture 9: Noisy Fitness. Ferrante Neri, University of Jyväskylä.</p></li>
<li><p>Real-world optimization problems. Many real-world optimization problems are characterized by uncertainties. This means that the same solution takes different fitness values depending on the time at which it is evaluated.</p></li>
<li><p>Classification of uncertainties. Uncertainties in optimization can be categorized into three classes: noisy fitness function, approximated fitness function, and robustness.</p></li>
<li><p>Noisy fitness. Noise in fitness evaluations may come from many different sources, such as sensory measurement errors or randomized simulations. Example: optimization based on an experimental setup (motor drive).</p></li>
<li><p>Approximated fitness function. When the fitness function is very expensive to evaluate, or an analytical fitness function is not available, approximated fitness functions are often used instead. These approximated models implicitly introduce a noise, namely the difference between the approximated value and the real fitness value, which is unknown.</p></li>
<li><p>Perturbation in the environment. Often, when a solution is implemented, the design variables or the environmental parameters are subject to perturbations or changes. Example (satellite problem): due to the movement of the Earth, the fitness value of the same solution changes over time.</p></li>
<li><p>General formulation of an uncertain problem. A classical formulation of a noisy/uncertain fitness is given by F(x) = f(x) + z, where f(x) is the true fitness and z ~ N(0, &sigma;&sup2;) is additive noise. We are not really interested in the noise being Gaussian, but it is fundamental that the noise has zero mean!</p></li>
<li><p>Zero mean: explicit averaging. If the noise has zero mean, the average over a certain number of samples is a good estimate of the actual fitness value. Thus, the most classical approach is to compute the fitness of each solution a certain number of times (samples) and then take the average.</p></li>
<li><p>Failure of deterministic algorithms. The noise introduces false optima into the fitness landscape, so a method that employs implicit or explicit gradient information is likely to fail. The neighborhood cannot be properly estimated because the search is misled by the noise.</p></li>
<li><p>Better success of EAs. Evolutionary algorithms, due to their inner structure, do not perform comparisons among neighbors, and have thus been shown to perform better in noisy environments. Some recent papers state that even rather standard EAs (e.g. self-adaptive ES) can give good results in noisy environments.</p></li>
<li><p>Not universal success of EAs. This success is restricted to specific cases and strongly depends on the problem under examination. EAs, like all optimization algorithms, contain some comparison among solutions in order to determine which one is better and which one is worse; in EAs this role is played by parent and survivor selection.</p></li>
<li><p>Population based: implicit averaging. EAs are population-based algorithms, so another kind of averaging can be carried out. Many scientists have observed that a large population size is effective against noise, since it gives the algorithm a chance to evaluate several neighboring solutions and thus detect promising areas.</p></li>
<li><p>Another kind of averaging. Explicit and implicit averaging both belong to the class of averaging over time. Branke proposed averaging over space: calculate the fitness by averaging over the neighborhood of the point to be evaluated. Implicit assumption: the noise in the neighborhood has the same characteristics as the noise at the point to be evaluated, and the fitness landscape is locally smooth. This is not always true! E.g. systems with unstable regions.</p></li>
<li><p>High computational cost. Clearly, an averaging operation (especially averaging over time) requires extra fitness evaluations and thus increases the computational overhead. In some cases, obtaining reliable results requires considerable effort.</p></li>
<li><p>Adaptive averaging example: explicit and implicit.</p></li>
<li><p>Prudent-daring survivor selection. If not all the individuals are re-sampled, two cooperative selection schemes can be applied. Prudent: selects individuals which are reliable (re-sampled) and fairly promising. Daring: selects individuals which are unreliable (fitness calculated only once) but look very promising. Result: reliable solutions + computational saving.</p></li>
<li><p>Adaptive Prudent Daring Evolutionary Algorithm</p></li>
<li><p>APDEA Results</p></li>
<li><p>Tolerance interval 1/2. Assume that the noise is Gaussian and that its standard deviation has the same constant value over the whole domain; a tolerance interval can then be constructed for the Gaussian distribution.</p></li>
<li><p>Tolerance interval 2/2. If solution A is better than B by a quantity equal to half of the width of the tolerance interval, it is surely better (with a certain confidence level). If the distance in fitness is smaller, a re-sampling is required.</p></li>
<li><p>Adaptive Tolerant Evolutionary Algorithm</p></li>
<li><p>Comparison APDEA vs. ATEA</p></li>
<li><p>Comparative analysis. The APDEA is more general, since it only requires the noise to be zero-mean, and it converges faster. The ATEA requires a preliminary analysis of the noise, while the APDEA requires a more extensive parameter setting.</p></li></ul>
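The explicit-averaging scheme described in the slides can be sketched as follows. This is a minimal illustration, not code from the lecture: the sphere function as the true fitness, the noise level, and the sample count are all assumptions.

```python
import random
import statistics

def noisy_fitness(x, sigma=0.5):
    # Illustrative noisy objective (an assumption, not from the lecture):
    # the true fitness is the sphere function, perturbed by additive
    # zero-mean Gaussian noise, matching F(x) = f(x) + z, z ~ N(0, sigma^2).
    true_value = sum(xi ** 2 for xi in x)
    return true_value + random.gauss(0.0, sigma)

def averaged_fitness(x, n_samples=30, sigma=0.5):
    # Explicit averaging: re-sample the same solution n_samples times and
    # return the mean. Because the noise has zero mean, the estimate
    # converges to the true fitness as n_samples grows, at the price of
    # n_samples extra fitness evaluations (the computational-cost issue
    # the slides point out).
    return statistics.mean(noisy_fitness(x, sigma) for _ in range(n_samples))
```

For x = [1.0, 2.0] the true fitness is 5.0; a single noisy evaluation can be off by a full noise standard deviation, while the averaged estimate lands much closer.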
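The tolerance-interval comparison rule from the slides can be sketched in the same spirit, assuming Gaussian noise with a known, constant standard deviation. The function names and the 95% confidence level (z = 1.96) are illustrative choices, not part of the original lecture.

```python
import math

def half_width(sigma, n_samples, z=1.96):
    # Half-width of the confidence interval of the mean of n_samples
    # Gaussian observations with standard deviation sigma
    # (z = 1.96 gives roughly 95% confidence).
    return z * sigma / math.sqrt(n_samples)

def compare(mean_a, mean_b, sigma, n_samples, z=1.96):
    # Tolerance-interval rule (minimization, lower fitness is better):
    # a solution is declared the winner only if its mean fitness beats the
    # other by more than the interval half-width; otherwise the difference
    # may be due to noise and the solutions must be re-sampled.
    h = half_width(sigma, n_samples, z)
    if mean_a < mean_b - h:
        return "A"
    if mean_b < mean_a - h:
        return "B"
    return "resample"
```

With sigma = 1.0 and 100 samples the half-width is about 0.196, so a mean-fitness gap of 1.0 is decisive while a gap of 0.1 triggers re-sampling.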