
Advanced Design and Analysis Techniques

Part 1

15.1 and 15.2


Techniques - 1

• This part covers three important techniques for the design and analysis of efficient algorithms:

– dynamic programming (Chapter 15),

– greedy algorithms (Chapter 16), and

– amortized analysis (Chapter 17).


Techniques - 2

• Earlier parts have presented other widely applicable techniques, such as

– divide-and-conquer,

– randomization, and

– the solution of recurrences.


Dynamic programming

• Dynamic programming typically applies to optimization problems in which a set of choices must be made in order to arrive at an optimal solution.

• Dynamic programming is effective when a given subproblem may arise from more than one partial set of choices; the key technique is to store the solution to each such subproblem in case it should reappear.
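A minimal sketch of this idea in Python: a naive recursion keeps re-deriving the same subproblems, while storing each answer the first time it is computed means every repeat is looked up instead of recomputed. Fibonacci numbers are used here only as a small, illustrative overlapping-subproblem recurrence, not as an example from the slides:

    from functools import lru_cache

    def fib_naive(n):
        # Recomputes the same subproblems exponentially often.
        if n < 2:
            return n
        return fib_naive(n - 1) + fib_naive(n - 2)

    @lru_cache(maxsize=None)
    def fib_memo(n):
        # Each subproblem is solved once; repeats are answered from the cache.
        if n < 2:
            return n
        return fib_memo(n - 1) + fib_memo(n - 2)

    print(fib_memo(40))  # 102334155, instantly; fib_naive(40) takes noticeably longer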


Greedy algorithms

• Like dynamic-programming algorithms, greedy algorithms typically apply to optimization problems in which a set of choices must be made in order to arrive at an optimal solution.

• The idea of a greedy algorithm is to make each choice in a locally optimal manner.
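A minimal Python sketch of a greedy choice, using activity selection (the opening example of Chapter 16): at each step it takes the compatible activity that finishes earliest, a locally optimal choice that happens to yield a globally optimal schedule for this particular problem. The function name and input format below are illustrative:

    def select_activities(activities):
        # activities: list of (start, finish) pairs.
        # Greedy choice: always take the compatible activity that finishes first.
        chosen = []
        last_finish = float("-inf")
        for start, finish in sorted(activities, key=lambda a: a[1]):
            if start >= last_finish:
                chosen.append((start, finish))
                last_finish = finish
        return chosen

    print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9)]))
    # -> [(1, 4), (5, 7)]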


Dynamic programming - 1

• Dynamic programming, like the divide-and-conquer method, solves problems by combining the solutions to subproblems.

• Divide-and-conquer algorithms partition the problem into independent subproblems, solve the subproblems recursively, and then combine their solutions to solve the original problem.
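For contrast, a minimal divide-and-conquer sketch (merge sort): the two halves are independent subproblems, so nothing computed for one half is ever needed again for the other:

    def merge_sort(a):
        # Partition into two independent subproblems and solve each recursively.
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
        # Combine the two solved subproblems by merging.
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]

    print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]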


Dynamic programming - 2

• Dynamic programming is applicable when the subproblems are not independent, that is, when subproblems share subsubproblems.

• A dynamic-programming algorithm solves every subsubproblem just once and then saves its answer in a table, thereby avoiding the work of recomputing the answer every time the subsubproblem is encountered.
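A minimal sketch of the "solve each subsubproblem once and save it in a table" idea, filling the table bottom-up for the same illustrative Fibonacci recurrence used above:

    def fib_table(n):
        # table[i] holds the answer to subproblem i; each entry is computed exactly once.
        table = [0] * (n + 1)
        if n >= 1:
            table[1] = 1
        for i in range(2, n + 1):
            table[i] = table[i - 1] + table[i - 2]
        return table[n]

    print(fib_table(40))  # 102334155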


Dynamic programming - 3

• Dynamic programming is typically applied to optimization problems. In such problems there can be many possible solutions. Each solution has a value, and we wish to find a solution with the optimal (minimum or maximum) value. We call such a solution an optimal solution to the problem, as opposed to the optimal solution, since there may be several solutions that achieve the optimal value.


The development of a dynamic-programming algorithm

• The development of a dynamic-programming algorithm can be broken into a sequence of four steps.

1. Characterize the structure of an optimal solution.

2. Recursively define the value of an optimal solution.

3. Compute the value of an optimal solution in a bottom-up fashion.

4. Construct an optimal solution from computed information.


Assembly-line scheduling


Step 1: The structure of the fastest way through the factory


Step 2: A recursive solution
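The recursion on which this step is based is the standard one from CLRS Section 15.1, written here with illustrative symbols: e_i and x_i are the entry and exit times of line i, a_{i,j} is the time at station j on line i, t_{i,j} is the transfer time away from line i after station j, and f_i[j] is the fastest time through station j on line i:

    f_1[j] =
    \begin{cases}
        e_1 + a_{1,1} & \text{if } j = 1,\\
        \min\bigl(f_1[j-1] + a_{1,j},\; f_2[j-1] + t_{2,j-1} + a_{1,j}\bigr) & \text{if } j \ge 2,
    \end{cases}

and symmetrically for f_2[j]; the fastest overall time is f^* = \min(f_1[n] + x_1, f_2[n] + x_2).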


Step 3: Computing the fastest times
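A minimal Python sketch of this step, assuming the inputs are given as 0-indexed lists (a for station times, t for transfer times, e for entry times, x for exit times; these names and the sample instance are illustrative). It also records which line each station was reached from, which is exactly what Step 4 needs to reconstruct the route:

    def fastest_way(a, t, e, x):
        # a[i][j]: time at station j on line i; t[i][j]: transfer time leaving line i after station j
        # e[i], x[i]: entry and exit times for line i (i in {0, 1})
        n = len(a[0])
        f = [[0] * n for _ in range(2)]      # f[i][j]: fastest time through station j on line i
        line = [[0] * n for _ in range(2)]   # line[i][j]: line used at station j-1 on that fastest way
        f[0][0] = e[0] + a[0][0]
        f[1][0] = e[1] + a[1][0]
        for j in range(1, n):
            for i in range(2):
                other = 1 - i
                stay = f[i][j - 1] + a[i][j]
                switch = f[other][j - 1] + t[other][j - 1] + a[i][j]
                if stay <= switch:
                    f[i][j], line[i][j] = stay, i
                else:
                    f[i][j], line[i][j] = switch, other
        if f[0][n - 1] + x[0] <= f[1][n - 1] + x[1]:
            return f[0][n - 1] + x[0], 0, line
        return f[1][n - 1] + x[1], 1, line

    # Small illustrative instance:
    a = [[7, 9, 3, 4, 8, 4], [8, 5, 6, 4, 5, 7]]
    t = [[2, 3, 1, 3, 4], [2, 1, 2, 2, 1]]
    e, x = [2, 4], [3, 2]
    print(fastest_way(a, t, e, x)[0])  # 38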


Step 4: Constructing the fastest way through the factory


Matrix-chain multiplication


We can multiply two matrices A and B only if they are compatible: the number of columns of A must equal the number of rows of B. If A is a p × q matrix and B is a q × r matrix, the resulting matrix C is a p × r matrix.
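The standard algorithm for multiplying a p × q matrix by a q × r matrix performs p · q · r scalar multiplications, so the chosen parenthesization matters. For example, with A1 of size 10 × 100, A2 of size 100 × 5, and A3 of size 5 × 50 (the illustrative sizes used in CLRS Section 15.2), computing ((A1 A2) A3) costs 10·100·5 + 10·5·50 = 7500 scalar multiplications, while (A1 (A2 A3)) costs 100·5·50 + 10·100·50 = 75000, ten times as many.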

Counting the number of parenthesizations
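Denoting by P(n) the number of alternative parenthesizations of a product of n matrices, the standard count (CLRS Section 15.2) satisfies the recurrence below; its solution grows exponentially in n, so checking every parenthesization by brute force is hopeless:

    P(n) =
    \begin{cases}
        1 & \text{if } n = 1,\\
        \sum_{k=1}^{n-1} P(k)\, P(n-k) & \text{if } n \ge 2.
    \end{cases}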


• Step 1: The structure of an optimal parenthesization

• Step 2: A recursive solution

• Step 3: Computing the optimal costs


Step 3: Computing the optimal costs
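A minimal Python sketch of this step, assuming the dimensions are given as a list p where matrix A_i has size p[i-1] × p[i]. Here m[i][j] holds the minimum number of scalar multiplications needed to compute A_i..A_j, following the recurrence m[i, j] = min over i ≤ k < j of m[i, k] + m[k+1, j] + p_{i-1} p_k p_j, and s[i][j] records the split point k that achieves it (used in Step 4):

    import math

    def matrix_chain_order(p):
        # p: dimensions list; matrix A_i is p[i-1] x p[i], for i = 1..n
        n = len(p) - 1
        m = [[0] * (n + 1) for _ in range(n + 1)]   # m[i][j]: min cost for A_i..A_j
        s = [[0] * (n + 1) for _ in range(n + 1)]   # s[i][j]: split point achieving m[i][j]
        for length in range(2, n + 1):              # chain length
            for i in range(1, n - length + 2):
                j = i + length - 1
                m[i][j] = math.inf
                for k in range(i, j):               # try every split A_i..A_k | A_{k+1}..A_j
                    cost = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                    if cost < m[i][j]:
                        m[i][j], s[i][j] = cost, k
        return m, s

    m, s = matrix_chain_order([10, 100, 5, 50])
    print(m[1][3])  # 7500, for the three-matrix example above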


Step 4: Constructing an optimal solution
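Using the s table returned by matrix_chain_order above, the optimal parenthesization itself can be reconstructed; this mirrors the PRINT-OPTIMAL-PARENS procedure of CLRS Section 15.2:

    def optimal_parens(s, i, j):
        # Rebuild the parenthesization of A_i..A_j from the recorded split points.
        if i == j:
            return "A%d" % i
        k = s[i][j]
        return "(" + optimal_parens(s, i, k) + optimal_parens(s, k + 1, j) + ")"

    # With m, s = matrix_chain_order([10, 100, 5, 50]) from the sketch above:
    print(optimal_parens(s, 1, 3))  # ((A1A2)A3)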

