
Optimization Methods
Aleksey Minin
Saint-Petersburg State University
Student of ACOPhys master program (10th semester)

Joint Advanced Students School, 19.04.23
Applied and COmputational Physics

What is optimization?

Content:


Applications of optimization

• Advanced engineering design
• Biotechnology
• Data analysis
• Environmental management
• Financial planning
• Process control
• Scientific modeling, etc.


Global or Local?


What is global optimization?

•The objective of global optimization is to find the globally best solution of (possibly nonlinear) models, in the (possible or known) presence of multiple local optima.


Branch and bound

Scientists are ready to carry out some experiments, but the quality of the result varies with the scientist and the type of experiment, according to the following table:

Type of        Scientist number
experiment     1      2      3      4
A              0.9    0.8    0.9    0.85
B              0.7    0.6    0.8    0.7
C              0.85   0.7    0.85   0.8
D              0.75   0.7    0.75   0.7


Branch and bound

[Tree diagrams, slides 11–16: the search tree is expanded node by node. Each label shows an assignment together with its bound: Root AAAA (0.55); then ADCC (0.42), BAAA (0.42), CAAA (0.52), DAAA (0.45); then CABD (0.38), CBAA (0.39), CDAA (0.45); then CBAD (0.37), CDBA (0.40). At each step the most promising node is expanded, and branches whose bound cannot beat the best complete solution found so far are discarded.]
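A minimal Python sketch of branch and bound on the table above (an illustrative implementation, not the slides' own: it assumes experiment types may repeat across scientists, and bounds a partial assignment by the best-case product of the remaining choices; the names `QUALITY` and `branch_and_bound` are assumptions):

```python
# Branch and bound: assign one experiment type (A-D) to each of 4
# scientists so that the product of qualities is maximal.

QUALITY = {                      # QUALITY[type][scientist]
    "A": [0.9, 0.8, 0.9, 0.85],
    "B": [0.7, 0.6, 0.8, 0.7],
    "C": [0.85, 0.7, 0.85, 0.8],
    "D": [0.75, 0.7, 0.75, 0.7],
}
TYPES = list(QUALITY)
N = 4

def bound(partial):
    """Optimistic bound: chosen qualities times best-case remaining ones."""
    value = 1.0
    for i, t in enumerate(partial):
        value *= QUALITY[t][i]
    for i in range(len(partial), N):
        value *= max(QUALITY[t][i] for t in TYPES)
    return value

def branch_and_bound():
    best_value, best_assignment = 0.0, None
    stack = [()]                           # start from the empty assignment
    while stack:
        partial = stack.pop()
        if bound(partial) <= best_value:
            continue                       # prune: cannot beat the incumbent
        if len(partial) == N:
            best_value, best_assignment = bound(partial), partial
            continue
        for t in TYPES:                    # branch on the next scientist
            stack.append(partial + (t,))
    return best_assignment, best_value

assignment, value = branch_and_bound()
```

With this table the best assignment is A for everyone, since A is each scientist's best type.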

Evolutionary algorithms

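The evolutionary-algorithm slides are figure-only; the idea can be sketched as a simple mutation-plus-truncation-selection loop (the function, population size, mutation strength, and all other parameters here are illustrative assumptions):

```python
import random

# Minimal evolutionary algorithm: real-valued individuals, Gaussian
# mutation, truncation selection, minimizing f(x) = sum(x_i^2).
def evolve(dim=5, pop_size=30, generations=200, sigma=0.3, seed=0):
    rng = random.Random(seed)
    f = lambda x: sum(v * v for v in x)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # each parent produces one mutated child
        children = [[v + rng.gauss(0.0, sigma) for v in p] for p in pop]
        # keep the best pop_size individuals (elitist truncation selection)
        pop = sorted(pop + children, key=f)[:pop_size]
    return pop[0], f(pop[0])

best, val = evolve()
```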

Simulated annealing

[Flowchart: apply a small perturbation to the current solution; if T = 0, the solution is found; otherwise repeat until a good solution is found.]
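The loop in the flowchart can be sketched as follows (the objective function, cooling schedule, and constants are illustrative assumptions; acceptance uses the standard Metropolis rule):

```python
import math
import random

def anneal(f, x0, T0=1.0, cooling=0.995, T_min=1e-4, step=0.5, seed=1):
    rng = random.Random(seed)
    x, fx, T = x0, f(x0), T0
    while T > T_min:                       # "solution found" once T reaches ~0
        y = x + rng.uniform(-step, step)   # apply a small perturbation
        fy = f(y)
        # Metropolis rule: always accept improvements, sometimes accept
        # worsenings with probability exp(-(fy - fx) / T)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
        T *= cooling                       # cool down
    return x, fx

# e.g. a one-dimensional multimodal objective
best_x, best_f = anneal(lambda x: x * x + 3.0 * math.sin(5.0 * x), x0=4.0)
```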


Simulated annealing: results


Tree annealing, developed by Bilbro and Snyder [1991]


Swarm intelligence


Tabu Search


Tabu search implementation

[Figure sequence, slides 26–35: successive iterations of tabu search on a small example graph. Only the node labels (1–5) and the move values of each iteration (1, 7, 6, 3, 9, 8, 6, 10, 11, 9, …) survive extraction; the diagrams themselves are not reproduced here.]
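Since the graph in the figures is not recoverable, here is a tabu search sketch on an assumed toy problem (a bit string scored against a hidden target); the tabu list forbids reversing recent moves for a few iterations, which is the mechanism the slides illustrate:

```python
from collections import deque

def tabu_search(f, x0, n_iter=100, tabu_len=5):
    """Flip-one-bit neighborhood with a fixed-length tabu list of positions."""
    x = list(x0)
    best, best_f = list(x), f(x)
    tabu = deque(maxlen=tabu_len)          # recently flipped positions
    for _ in range(n_iter):
        candidates = []
        for i in range(len(x)):
            if i in tabu:
                continue                   # this move is tabu, skip it
            y = list(x)
            y[i] ^= 1
            candidates.append((f(y), i, y))
        fy, i, y = min(candidates)         # best admissible neighbor
        tabu.append(i)                     # forbid undoing this move soon
        x = y                              # move even if it is worse
        if fy < best_f:
            best, best_f = y, fy
    return best, best_f

# toy objective: number of bits differing from a hidden target pattern
target = [1, 0, 1, 1, 0, 0, 1, 0]
f = lambda x: sum(a != b for a, b in zip(x, target))
solution, value = tabu_search(f, [0] * 8)
```

Note that the search always moves to the best non-tabu neighbor, even when that worsens the objective; the tabu list is what prevents it from cycling straight back.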

What is Local Optimization?

•The term LOCAL refers both to the fact that only information about the function from a neighborhood of the current approximation is used to update that approximation, and to the fact that we usually expect such methods to converge to whatever local extremum is closest to the starting approximation.

•Global structure of the objective function is unknown to a local method.


Local optimization

Gradient descent


Therefore we obtain a decreasing sequence: F(x0) > F(x1) > … > F(xn)
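A minimal gradient-descent sketch producing such a sequence of improving iterates (the example function, starting point, and step size are illustrative assumptions):

```python
def gradient_descent(grad, x0, lr=0.1, n_steps=50):
    """Iterate x_{k+1} = x_k - lr * grad F(x_k)."""
    x = list(x0)
    for _ in range(n_steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# example: F(x, y) = (x - 1)^2 + 2 y^2, with gradient (2(x - 1), 4 y)
grad_F = lambda p: [2.0 * (p[0] - 1.0), 4.0 * p[1]]
x_min = gradient_descent(grad_F, [5.0, 3.0])   # approaches the minimum (1, 0)
```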

Quasi-Newton Methods


•These methods build up curvature information at each iteration to formulate a quadratic model problem of the form:

q(x) = (1/2) x^T H x + c^T x + b

where the Hessian matrix H is a positive definite symmetric matrix, c is a constant vector, and b is a constant.
•The optimal solution of this problem occurs where the partial derivatives with respect to x go to zero:

grad q(x*) = H x* + c = 0,   so   x* = -H^(-1) c

Quasi-Newton Methods


BFGS algorithm
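A compact BFGS sketch (illustrative, not the slides' exact variant: it maintains an inverse-Hessian approximation and uses a simple Armijo backtracking line search; the test function is an assumption):

```python
import numpy as np

def bfgs(f, grad, x0, n_iter=50, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    n = x.size
    B = np.eye(n)                        # inverse-Hessian approximation
    for _ in range(n_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -B @ g                       # quasi-Newton search direction
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d):
            alpha *= 0.5                 # backtracking (Armijo) line search
        s = alpha * d
        x_new = x + s
        y = grad(x_new) - g
        sy = s @ y
        if sy > 1e-12:                   # curvature condition keeps B pos. def.
            rho = 1.0 / sy
            I = np.eye(n)
            B = (I - rho * np.outer(s, y)) @ B @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)   # BFGS update of the inverse Hessian
        x = x_new
    return x

f = lambda v: (v[0] - 1.0) ** 2 + 2.0 * v[1] ** 2
grad = lambda v: np.array([2.0 * (v[0] - 1.0), 4.0 * v[1]])
x_opt = bfgs(f, grad, [0.0, 3.0])
```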

Gauss-Newton algorithm


Given m functions f1, f2, …, fm of n parameters p1, p2, …, pn (m > n), we want to minimize the sum:

S(p) = sum_{i=1..m} f_i(p)^2
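A Gauss-Newton sketch for this least-squares problem (the exponential model, the synthetic data, and all names here are illustrative assumptions): at each step the residuals are linearized with the Jacobian J and the normal equations (J^T J) δ = −J^T f are solved for the update.

```python
import numpy as np

def gauss_newton(residuals, jacobian, p0, n_iter=20):
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = residuals(p)
        J = jacobian(p)
        delta = np.linalg.solve(J.T @ J, -J.T @ r)   # normal equations
        p = p + delta
    return p

# fit y = a * exp(b * t) to noiseless synthetic data with a=2, b=0.5
t = np.linspace(0.0, 1.0, 10)
y = 2.0 * np.exp(0.5 * t)
residuals = lambda p: p[0] * np.exp(p[1] * t) - y
jacobian = lambda p: np.stack([np.exp(p[1] * t),
                               p[0] * t * np.exp(p[1] * t)], axis=1)
p = gauss_newton(residuals, jacobian, [1.0, 1.0])
```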


Levenberg-Marquardt

This is an iterative procedure, starting from the initial guess p^T = (1, 1, …, 1).
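A Levenberg-Marquardt sketch with the standard damping update (the fitting problem and all constants are illustrative assumptions; the initial guess of all ones follows the slide). The damped system (J^T J + λI) δ = −J^T f interpolates between Gauss-Newton (small λ) and gradient descent (large λ):

```python
import numpy as np

def levenberg_marquardt(residuals, jacobian, p0, n_iter=50, lam=1e-3):
    p = np.asarray(p0, dtype=float)
    cost = np.sum(residuals(p) ** 2)
    for _ in range(n_iter):
        r, J = residuals(p), jacobian(p)
        A = J.T @ J + lam * np.eye(p.size)        # damped normal equations
        delta = np.linalg.solve(A, -J.T @ r)
        p_new = p + delta
        new_cost = np.sum(residuals(p_new) ** 2)
        if new_cost < cost:                       # success: more Gauss-Newton
            p, cost, lam = p_new, new_cost, lam * 0.5
        else:                                     # failure: more gradient descent
            lam *= 2.0
    return p

# same exponential fit, starting from p^T = (1, 1)
t = np.linspace(0.0, 1.0, 10)
y = 2.0 * np.exp(0.5 * t)
residuals = lambda p: p[0] * np.exp(p[1] * t) - y
jacobian = lambda p: np.stack([np.exp(p[1] * t),
                               p[0] * t * np.exp(p[1] * t)], axis=1)
p = levenberg_marquardt(residuals, jacobian, [1.0, 1.0])
```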


SQP – constrained minimization

Reformulation


The principal idea is the formulation of a QP sub-problem based on a quadratic approximation of the Lagrangian function:

L(x, λ) = f(x) + sum_i λ_i g_i(x)

where the g_i are the constraint functions.


Updating the Hessian matrix
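At each SQP iteration, the equality-constrained QP sub-problem min (1/2) x^T H x + c^T x subject to Ax = b reduces to a linear KKT system; a sketch (the numerical values of H, c, A, b are illustrative assumptions):

```python
import numpy as np

def solve_eq_qp(H, c, A, b):
    """Solve min 1/2 x^T H x + c^T x  s.t.  A x = b  via the KKT system."""
    n, m = H.shape[0], A.shape[0]
    K = np.block([[H, A.T],
                  [A, np.zeros((m, m))]])          # KKT matrix
    rhs = np.concatenate([-c, b])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]                        # primal x, multipliers λ

# minimize x^2 + y^2 subject to x + y = 1  →  optimum at x = y = 0.5
H = 2.0 * np.eye(2)
c = np.zeros(2)
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x, lam = solve_eq_qp(H, c, A, b)
```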

Neural Net analysis

What is a neuron?

A typical formal neuron performs an elementary operation: it weighs the input values with locally stored weights and applies a nonlinear transformation to their sum:

u = w0 + sum_i w_i x_i,   y = f(u)

[Diagram: inputs x1 … xn are weighted, summed into u, and passed through the nonlinearity f to produce the output y.]

The neuron applies a nonlinear operation to a linear combination of its inputs.
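The formula above in code (the sigmoid nonlinearity and the numeric values are illustrative assumptions):

```python
import math

def neuron(x, w, w0):
    u = w0 + sum(wi * xi for wi, xi in zip(w, x))   # linear combination
    return 1.0 / (1.0 + math.exp(-u))               # nonlinear transform f(u)

y = neuron(x=[1.0, 2.0], w=[0.5, -0.25], w0=0.1)    # u = 0.1, y = f(0.1)
```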


What is training?

W – set of synaptic weights
E(W) – error function

What kind of optimization to choose?


Neural Network – any architecture

Error back propagation

[Diagram: a layered network with numbered nodes; the output error is propagated backwards through the layers to update the weights.]
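A minimal back-propagation sketch for a 2-2-1 sigmoid network with squared error E(W) (the architecture, learning rate, and training task are illustrative assumptions, not the slides' own example):

```python
import math
import random

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def train(data, epochs=3000, lr=1.0, seed=0):
    rng = random.Random(seed)
    # hidden layer: 2 neurons, each with bias + 2 weights; output: bias + 2
    W1 = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    W2 = [rng.uniform(-1, 1) for _ in range(3)]

    def predict(x):
        h = [sigmoid(w[0] + w[1] * x[0] + w[2] * x[1]) for w in W1]
        return sigmoid(W2[0] + W2[1] * h[0] + W2[2] * h[1])

    for _ in range(epochs):
        for x, t in data:
            # forward pass
            h = [sigmoid(w[0] + w[1] * x[0] + w[2] * x[1]) for w in W1]
            y = sigmoid(W2[0] + W2[1] * h[0] + W2[2] * h[1])
            # backward pass: propagate dE/du from output to hidden layer
            dy = (y - t) * y * (1.0 - y)
            dh = [dy * W2[j + 1] * h[j] * (1.0 - h[j]) for j in range(2)]
            # gradient-descent weight updates
            W2 = [W2[0] - lr * dy,
                  W2[1] - lr * dy * h[0],
                  W2[2] - lr * dy * h[1]]
            for j in range(2):
                W1[j] = [W1[j][0] - lr * dh[j],
                         W1[j][1] - lr * dh[j] * x[0],
                         W1[j][2] - lr * dh[j] * x[1]]
    return predict

# learn the AND function
predict = train([([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)])
```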


How to optimize?

Objective function – the empirical error (should decay)
Parameters to optimize – the weights
Constraints – equalities (or inequalities) on the weights, if any exist


Neural Net analysis and constrained and unconstrained minimization

NB! For unconstrained optimization I applied the Levenberg-Marquardt method; for the constrained case I applied the SQP method.

Thank you for your attention
