OPTIMIZATION TECHNIQUES

Author: deepshika-reddy


Definition: Optimization is the act of achieving the best possible result under given circumstances. The primary objective may not be to optimize absolutely, but to compromise effectively and thereby produce the best formulation under a given set of restrictions.

Why is optimization necessary? Innovation and efficiency.

Historical development

Isaac Newton (1642-1727): Development of differential calculus methods of optimization.
Joseph-Louis Lagrange (1736-1813): Calculus of variations, minimization of functionals, methods of optimization for constrained problems.
Augustin-Louis Cauchy (1789-1857): Solution by direct substitution, steepest descent method for unconstrained optimization.
George Bernard Dantzig (1914-2005): Linear programming and the Simplex method (1947).
Albert William Tucker (1905-1995): Necessary and sufficient conditions for the optimal solution of programming problems, nonlinear programming.

OPTIMIZATION PARAMETERS

Objective function: An objective function expresses the main aim of the model, which is either to be minimized or maximized. For example, in a manufacturing process the aim may be to maximize the profit or minimize the cost. The two exceptions are: no objective function, and multiple objective functions.

Variables: A set of unknowns or variables that control the value of the objective function.

Variables can be broadly classified as: independent variables and dependent variables.

Constraints: The restrictions that must be satisfied to produce an acceptable design are collectively called design constraints.

Constraints can be broadly classified as: behavioral (functional) constraints and geometric (side) constraints.

Statement of an optimization problem

An optimization problem can be stated as follows:

Find X = (x1, x2, ..., xn)^T which minimizes f(X)

subject to the constraints
gi(X) ≤ 0, i = 1, 2, ..., m
lj(X) = 0, j = 1, 2, ..., p
where X is an n-dimensional vector called the design vector, f(X) is called the objective function, and gi(X) and lj(X) are known as the inequality and equality constraints, respectively.
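To make this standard form concrete, here is a minimal Python sketch using SciPy (an addition to this write-up, not part of the original slides); the objective and constraint functions are made up purely for illustration.

# Minimal sketch of the standard form: minimize f(X) subject to
# g_i(X) <= 0 and l_j(X) = 0. The functions below are illustrative only.
import numpy as np
from scipy.optimize import minimize

def f(x):                      # objective function f(X)
    return (x[0] - 1)**2 + (x[1] - 2)**2

def g(x):                      # inequality constraint, g(X) <= 0
    return x[0] + x[1] - 2.0

def l(x):                      # equality constraint, l(X) = 0
    return x[0] - 0.5 * x[1]

constraints = [
    {"type": "ineq", "fun": lambda x: -g(x)},  # SciPy expects fun(x) >= 0
    {"type": "eq",   "fun": l},
]

result = minimize(f, x0=np.zeros(2), constraints=constraints)
print(result.x, result.fun)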

Classification of optimization

Based on constraints: constrained optimization (e.g., the Lagrangian method) and unconstrained optimization (e.g., least squares).

Based on the nature of the design variables: static optimization and dynamic optimization.

Based on physical structure: optimal control and sub-optimal control.

Based on the nature of the variables: stochastic optimization and deterministic optimization.

Based on separability of the functions: separable and non-separable.

Based on the nature of the equations involved: linear programming, quadratic programming, and nonlinear programming.

Based on the permissible values of the design variables: integer programming and real-valued programming.

Based on the number of objective functions: single-objective and multi-objective.

Classical Optimization

The classical methods of optimization are useful in finding the optimum solution of continuous and differentiable functions. Classical optimization techniques can handle three types of problems:
single-variable functions
multivariable functions with no constraints
multivariable functions with both equality and inequality constraints

Single-variable optimization: A single-variable optimization problem is one in which the value x = x* is to be found in the interval [a, b] such that x* minimizes f(x). f(x) at x = x* is said to have a
local minimum if f(x*) ≤ f(x* + h) for all sufficiently small h
local maximum if f(x*) ≥ f(x* + h) for all sufficiently small h
global minimum if f(x*) ≤ f(x) for all x
global maximum if f(x*) ≥ f(x) for all x

MULTIVARIABLE OPTIMIZATION WITH NO CONSTRAINTS
This is the minimum or maximum of an unconstrained function of several variables.
Necessary condition: If f(X) has an extreme point (maximum or minimum) at X = X* and if the first partial derivatives of f(X) exist at X*, then
∂f/∂x1 (X*) = ∂f/∂x2 (X*) = ... = ∂f/∂xn (X*) = 0
Sufficient condition: The Hessian matrix H, formed from the second-order partial derivatives of f, is
positive definite when X* is a relative minimum point
negative definite when X* is a relative maximum point.

MULTIVARIABLE OPTIMIZATION WITH EQUALITY CONSTRAINTS
Minimize f = f(X)
subject to the constraints gi(X) = 0, i = 1, 2, ..., m
where X = (x1, x2, ..., xn)^T. Here m ≤ n; otherwise (if m > n) the problem becomes overdefined and, in general, there will be no solution. Several methods are available for the solution of this problem:
1. Direct substitution
2. Constrained variation
3. Lagrange multipliers
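A minimal numerical check of the necessary and sufficient conditions for an unconstrained minimum, on an assumed quadratic f(x, y) = x² + 2y² − 2x whose stationary point is (1, 0) (the function is made up for illustration):

# Check gradient = 0 (necessary) and Hessian positive definite (sufficient)
# for the assumed quadratic f(x, y) = x^2 + 2y^2 - 2x.
import numpy as np

def grad(x):
    # First partial derivatives of f: [2x - 2, 4y]
    return np.array([2 * x[0] - 2.0, 4 * x[1]])

def hessian(x):
    # Second-order partial derivatives of f (constant for a quadratic)
    return np.array([[2.0, 0.0], [0.0, 4.0]])

x_star = np.array([1.0, 0.0])           # candidate stationary point
print(np.allclose(grad(x_star), 0))      # necessary condition: gradient = 0
eigvals = np.linalg.eigvalsh(hessian(x_star))
print(np.all(eigvals > 0))               # sufficient condition: H positive definite, so a minimum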

Solution by Direct Substitution

For a problem with n variables and m equality constraints, it is theoretically possible to solve the m equality constraints simultaneously and express any set of m variables in terms of the remaining n − m variables. Substituting these expressions into the original objective function yields a new, unconstrained objective function in n − m variables.
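For example (an assumed problem, not from the slides): to minimize f(x, y) = x² + y² subject to x + y = 1, solve the constraint for y and substitute. A short SymPy sketch:

# Direct substitution on an assumed example: minimize x^2 + y^2 subject to x + y = 1.
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**2 + y**2                               # objective
y_sub = sp.solve(sp.Eq(x + y, 1), y)[0]       # express y in terms of x: y = 1 - x
f_reduced = f.subs(y, y_sub)                  # unconstrained objective in x only
x_star = sp.solve(sp.diff(f_reduced, x), x)[0]  # stationary point of the reduced function
print(x_star, y_sub.subs(x, x_star))          # x* = 1/2, y* = 1/2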

Drawbacks: The constraint equations will be nonlinear for most practical problems, and it often becomes impossible to solve them and express m variables in terms of the remaining n − m variables.

By the Method of Constrained Variation

The basic idea used in the method of constrained variation is to find a closed-form expression for the first-order differential of f (df) at all points at which the constraints gj(X) = 0, j = 1, 2, ..., m, are satisfied.

Drawback: Prohibitive for problems with more than three constraints.

By the Method of Lagrange Multipliers

For instance, consider the optimization problem: maximize f(x1, x2) subject to g(x1, x2) = c. We introduce a new variable λ, called a Lagrange multiplier, and define the Lagrange function
L(x1, x2, λ) = f(x1, x2) + λ[g(x1, x2) − c]
Treating L as a function of the three variables x1, x2 and λ, the necessary conditions for its extremum are
∂L/∂x1 (x1, x2, λ) = ∂f/∂x1 (x1, x2) + λ ∂g/∂x1 (x1, x2) = 0
∂L/∂x2 (x1, x2, λ) = ∂f/∂x2 (x1, x2) + λ ∂g/∂x2 (x1, x2) = 0
∂L/∂λ (x1, x2, λ) = g(x1, x2) − c = 0
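A small worked instance (assumed for illustration, not from the slides): maximize f(x, y) = xy subject to x + y = 10, solving the stationarity conditions of the Lagrange function with SymPy:

# Lagrange multipliers on an assumed example: maximize x*y subject to x + y = 10.
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
f = x * y                                   # objective
g = x + y - 10                              # constraint written as g(x, y) = 0
L = f + lam * g                             # Lagrange function

# Necessary conditions: all partial derivatives of L equal zero
eqs = [sp.diff(L, v) for v in (x, y, lam)]
solution = sp.solve(eqs, (x, y, lam), dict=True)
print(solution)                             # x = 5, y = 5, lam = -5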

MULTIVARIABLE OPTIMIZATION WITH INEQUALITY CONSTRAINTS

The inequality constraints can be transformed into equality constraints by adding nonnegative slack variables yj², as
gj(X) + yj² = 0, j = 1, 2, ..., m
where the values of the slack variables are yet unknown. The problem then becomes
Gj(X, Y) = gj(X) + yj² = 0, j = 1, 2, ..., m
where Y = {y1, y2, ..., ym}^T is the vector of slack variables. This problem can be solved conveniently by the method of Lagrange multipliers.

Kuhn-Tucker conditions
Consider the following optimization problem: minimize f(X) subject to gj(X) ≤ 0 for j = 1, 2, ..., m, where X = [x1 x2 ... xn]. Then the Kuhn-Tucker conditions for X* = [x1* x2* ... xn*] to be a local minimum are
∂f/∂xi + Σj λj ∂gj/∂xi = 0, i = 1, 2, ..., n
λj gj = 0, j = 1, 2, ..., m
gj ≤ 0, j = 1, 2, ..., m
λj ≥ 0, j = 1, 2, ..., m
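A numerical sketch of these conditions on an assumed problem (minimize (x1 − 2)² + (x2 − 1)² subject to x1 + x2 − 2 ≤ 0), checking each Kuhn-Tucker condition at the candidate solution:

# Check the Kuhn-Tucker conditions at a candidate point of an assumed problem.
import numpy as np

x_star = np.array([1.5, 0.5])            # candidate local minimum
lam = 1.0                                 # candidate Kuhn-Tucker multiplier

grad_f = np.array([2 * (x_star[0] - 2), 2 * (x_star[1] - 1)])
grad_g = np.array([1.0, 1.0])
g_val = x_star[0] + x_star[1] - 2.0

print(np.allclose(grad_f + lam * grad_g, 0))  # stationarity: grad f + lam * grad g = 0
print(np.isclose(lam * g_val, 0))             # complementary slackness: lam * g = 0
print(g_val <= 1e-9)                          # primal feasibility: g <= 0
print(lam >= 0)                               # dual feasibility: lam >= 0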

CONVEX PROGRAMMING PROBLEM

The optimization problem with inequality constraints is called a convex programming problem if the objective function f(X) and the constraint functions gj(X) are convex. A function is convex if its slope is nondecreasing, i.e. ∂²f/∂x² ≥ 0. It is strictly convex if its slope is continually increasing, i.e. ∂²f/∂x² > 0 throughout the function.

Concave function: A differentiable function f is concave on an interval if its derivative f′ is decreasing on that interval; a concave function has a decreasing slope.
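A small numerical illustration of this second-derivative criterion (the test functions and interval are assumptions for this sketch): approximate f″ by finite differences and check its sign.

# Checking convexity of single-variable functions by sampling the second derivative.
import numpy as np

def second_derivative(f, x, h=1e-5):
    # Central finite-difference approximation of f''(x)
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

xs = np.linspace(-2.0, 2.0, 41)
for name, f in [("x**2", lambda x: x**2), ("-x**2", lambda x: -x**2)]:
    f2 = np.array([second_derivative(f, x) for x in xs])
    print(name, "convex on [-2, 2]:", bool(np.all(f2 >= -1e-6)))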

Advanced Optimization Techniques

Hill climbing: Hill climbing is a graph search algorithm where the current path is extended with a successor node that is closer to the solution than the end of the current path. Two variants are simple hill climbing and steepest-ascent hill climbing.

Simulated annealing: In the simulated annealing method, each point of the search space is compared to a state of some physical system, and the function to be minimized is interpreted as the internal energy of the system in that state.

Genetic algorithms: GAs belong to a class of methods called evolutionary algorithms (EAs) that are inspired by the processes of natural selection.
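A minimal sketch of simple hill climbing on a one-dimensional function (the function, step size and iteration budget are assumptions for illustration):

# Minimal sketch of simple hill climbing on an assumed 1-D function.
import random

def objective(x):
    # Made-up function to maximize; its peak is at x = 3.
    return -(x - 3.0)**2

def hill_climb(start, step=0.1, iterations=1000):
    current = start
    for _ in range(iterations):
        candidate = current + random.choice([-step, step])  # try a neighbouring point
        if objective(candidate) > objective(current):        # move only if it improves
            current = candidate
    return current

print(hill_climb(start=0.0))   # converges near 3.0 for this smooth unimodal function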

GAs are different from more traditional optimization techniques because they search from a population of points rather than a single point.

They also use payoff information based on an objective function defined by the user rather than derivatives or other secondary knowledge.
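A compact sketch of these two ideas, population-based search driven only by payoff values rather than derivatives (the fitness function, population size and operators are assumptions for illustration):

# Compact sketch of a genetic algorithm maximizing an assumed 1-D function.
import random

def fitness(x):
    return -(x - 3.0)**2              # made-up payoff; peak at x = 3

def evolve(pop_size=30, generations=100):
    population = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half of the population
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        # Crossover and mutation: children blend two parents plus small noise
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            children.append((a + b) / 2 + random.gauss(0, 0.1))
        population = parents + children
    return max(population, key=fitness)

print(evolve())   # approaches 3.0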

Ant Colony Optimization: An ACO algorithm is an artificial intelligence technique based on the pheromone-laying behavior of ants; it can be used to find solutions to exceedingly complex problems that seek the optimal path through a graph.Ant colony optimization algorithms have been used to produce near-optimal solutions to the traveling salesman problem.

The ant colony algorithm can be run continuously and can adapt to changes in real time.

Optimization in Managerial Economics

The objective of a business firm is to maximize profits or the value of the firm, or to minimize cost, subject to some constraints. The value of the firm is affected by total revenue and total cost.

Basic economic relations: functional relations; total, average and marginal relations; graphing total, average and marginal relations.

Often we wish to optimize but are faced with a constraint. In such a case we need a Lagrangian multiplier:

L = f(X, Z) + λ[Y − g(X, Z)]

To find the optimal values of X and Z, we take the derivatives of the Lagrangian with respect to X, Z and λ, and set these derivatives equal to zero. Example: a firm faces the cost function c = f(x, z) = …; the firm will produce 80 units of x and z, with any mix of x and z being acceptable.
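Because the slide's cost function is not legible above, the SymPy sketch below uses an assumed cost function purely for illustration; only the structure (minimize cost subject to a fixed total output of 80 units) mirrors the slide's setup.

# Hedged, hypothetical instance of the constrained-cost problem.
# The cost function is assumed, not taken from the slide.
import sympy as sp

x, z, lam = sp.symbols('x z lam', real=True)
cost = x**2 + z**2 - x * z                 # assumed cost function
constraint = 80 - (x + z)                  # output constraint: x + z = 80
L = cost + lam * constraint                # Lagrangian of the form f + lam*[Y - g]

eqs = [sp.diff(L, v) for v in (x, z, lam)]
print(sp.solve(eqs, (x, z, lam), dict=True))   # x = 40, z = 40 minimizes this assumed cost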

Optimization in Pharmaceutical Formulation and Processing

In pharmacy, the word optimization is found in the literature referring to any study of a formula. Traditionally, optimization in pharmaceuticals meant changing one variable at a time in order to solve a problematic formulation. Modern pharmaceutical optimization involves systematic design of experiments (DoE) to improve formulation irregularities.

Constrained example: making the hardest tablet that still disintegrates within 20 minutes.
Unconstrained example: making the hardest tablet.
Independent variables: e.g., the mixing time for a given process step (granulating time).
Dependent variables: the responses or characteristics of the in-process material, e.g., particle size of vesicles, hardness of the tablet.

Statistical Design

Statistical designs are divided into two classes:
Experimentation continues as the optimization study proceeds, e.g., EVOP and simplex methods.
Experimentation is completed before optimization takes place, e.g., the Lagrangian method and search methods.
The relationship between dependent and independent variables can be estimated by two approaches: a theoretical approach or an empirical (experimental) approach.

Applications

To study pharmacokinetic parameters.
To study process variables in tablet coating operations.
In high-performance liquid chromatography.
Formulation of culture media in virology laboratories.
Submicron emulsions with sunscreens using simplex composite designs.

Engineering applications of optimization

Design of civil engineering structures such as frames, foundations, bridges, towers, chimneys and dams for minimum cost.
Design of minimum-weight structures for earthquake, wind and other types of random loading.
Shortest route taken by a salesperson visiting various cities during one tour.
Optimum design of electrical networks.
Optimal plastic design of frame structures.
Design of aircraft and aerospace structures for minimum weight.
Finding the optimal trajectories of space vehicles.

Trajectory Optimization

Minimizing the cost of a space mission is a major concern in the space industry. Trajectory optimization has been developed through classical methods of optimization; however, the application of genetic algorithms has become increasingly popular.
Objective: The objective of this optimization was to reduce the time of flight and, as a result, the propellant cost. The genetic algorithm used is responsible for determining the optimal thrust direction or flight-path angle at the beginning of each time segment, and the time of flight.
Constraints: The objective is to minimize the time of flight, and the penalties on this minimization are on the position and velocity of the spacecraft at Mars and at Jupiter. By minimizing the time of flight, the risk of damage to the satellite during the course of the mission is reduced, as is the cost of fuel.

SOLUTION OF OPTIMIZATION PROBLEMS USING MATLAB

MATLAB is a popular software package used for the solution of a variety of scientific and engineering problems. The specific toolbox of interest for solving optimization and related problems is called the Optimization Toolbox. Basically, after formulating the optimization problem, the solution procedure involves three steps.

Step 1

Involves writing an m-file for the objective function.

Step 2

Involves writing an m-file for the constraints.

Step 3

Involves setting the various parameters to proper values, depending on the characteristics of the problem and the desired output, and creating an appropriate file to invoke the desired MATLAB program.
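For readers working outside MATLAB, the same three-step structure can be sketched with Python and SciPy (an analogue chosen for this write-up, not the Optimization Toolbox itself; the example problem is assumed):

# Python/SciPy analogue of the three-step workflow described above.
import numpy as np
from scipy.optimize import minimize

# Step 1: define the objective function (the counterpart of the objective m-file).
def objective(x):
    return x[0]**2 + x[1]**2 + 2 * x[0]

# Step 2: define the constraints (the counterpart of the constraint m-file).
constraints = [{"type": "ineq", "fun": lambda x: x[0] + x[1] - 1}]  # x0 + x1 >= 1

# Step 3: set solver parameters and invoke the optimizer.
result = minimize(objective, x0=np.array([0.0, 0.0]),
                  method="SLSQP", constraints=constraints,
                  options={"maxiter": 200, "ftol": 1e-9})
print(result.x, result.fun)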