

Adam Blank, Autumn 2015, Lecture 3

CSE 332

Data Abstractions


CSE 332: Data Abstractions

Algorithm Analysis 2

[Figure: a recursion tree. The root T(n) splits into two T(n/2) subproblems, each T(n/2) splits into two T(n/4) subproblems, and so on down to T(1) leaves.]


Outline

1 Warm-Ups

2 Analyzing Recursive Code

3 Generating and Solving Recurrences


Warm-Up #1: append 1

Let x and L be LinkedList Nodes.

Analyzing append

    append(x, L) {
        Node curr = L;
        while (curr != null && curr.next != null) {
            curr = curr.next;
        }
        curr.next = x;
    }

What is the best case time complexity of append?

Ω(n), because we always must do n iterations of the loop.

What is the worst case time complexity of append?

O(n), because we never do more than n iterations of the loop.

Since we can upper- and lower-bound the time complexity with the same complexity class, we can say append runs in Θ(n).
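For concreteness, here is a minimal runnable Java sketch of the code above, assuming a bare-bones Node class (the slides never show its fields, so the data field and the main harness below are purely illustrative):

    class Node {
        int data;        // payload; assumed, since the slides never define Node
        Node next;
        Node(int data) { this.data = data; }

        // Walk to the last node of L and attach x there.
        // One loop iteration per node already in L, hence Theta(n).
        static void append(Node x, Node L) {
            Node curr = L;
            while (curr != null && curr.next != null) {
                curr = curr.next;
            }
            curr.next = x;   // as on the slide, this throws if L is null
        }

        public static void main(String[] args) {
            Node L = new Node(1);
            L.next = new Node(2);
            append(new Node(3), L);
            for (Node curr = L; curr != null; curr = curr.next) {
                System.out.print(curr.data + " ");   // prints: 1 2 3
            }
        }
    }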


Merge 2

Pre-Condition: L1 and L2 are sorted.
Post-Condition: Return value is sorted.

Merge

    merge(L1, L2) {
        p1, p2 = 0;
        While both lists have more elements:
            Append the smaller element to L.
            Increment p1 or p2, depending on which had the smaller element.
        Append any remaining elements from L1 or L2 to L.
        return L;
    }

What is the... (remember the lists are Nodes)

best case # of comparisons of merge?

Ω(1). Consider the input: [0], [1, 2, 3, 4, 5, 6].

worst case # of comparisons of merge?

O(n). Consider the input: [1, 3, 5], [2, 4, 6].

worst case space usage of merge?

O(n), because we allocate a constant amount of space per element.
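To see the comparison counts concretely, here is a hypothetical instrumented version of merge (not from the slides, and written over int arrays rather than Nodes purely to keep the sketch short) run on the two example inputs above:

    import java.util.ArrayList;
    import java.util.List;

    class MergeDemo {
        static int comparisons;   // comparisons made by the most recent merge call

        // Merge two sorted arrays; one comparison per iteration of the first loop.
        static List<Integer> merge(int[] a, int[] b) {
            comparisons = 0;
            List<Integer> out = new ArrayList<>();
            int p1 = 0, p2 = 0;
            while (p1 < a.length && p2 < b.length) {
                comparisons++;
                if (a[p1] <= b[p2]) out.add(a[p1++]);
                else                out.add(b[p2++]);
            }
            while (p1 < a.length) out.add(a[p1++]);   // leftovers need no comparisons
            while (p2 < b.length) out.add(b[p2++]);
            return out;
        }

        public static void main(String[] args) {
            merge(new int[]{0}, new int[]{1, 2, 3, 4, 5, 6});
            System.out.println(comparisons);   // 1: the best case from the slide
            merge(new int[]{1, 3, 5}, new int[]{2, 4, 6});
            System.out.println(comparisons);   // 5: the worst case, roughly n comparisons
        }
    }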


Well, we did merge, what did you think was next? 3

Consider the following code:

Merge Sort

    sort(L) {
        if (L.size() < 2) {
            return L;
        } else {
            int mid = L.size() / 2;
            return merge(
                sort(L.subList(0, mid)),
                sort(L.subList(mid, L.size()))
            );
        }
    }

What is the worst case/best case # of comparisons of sort?

Yeah, yeah, it's O(n lg n), but why?


Recurrences 4

What is a recurrence?

In CSE 311, you saw a bunch of questions like:

Induction Problem
Let f_0 = 0, f_1 = 1, and f_n = f_{n−1} + f_{n−2} for all n ≥ 2. Prove f_n < 2^n for all n ∈ N.

(Remember the Fibonacci Numbers? You'd better bet they're going to show up in this course!)

That's a recurrence. That's it.

Definition (Recurrence)
A recurrence is a recursive definition of a function in terms of smaller values.
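To connect the definition back to code (this snippet is not from the slides, just an illustration), the Fibonacci recurrence above translates directly into a recursive Java method:

    class Fib {
        // f(0) = 0, f(1) = 1, f(n) = f(n-1) + f(n-2) for n >= 2
        static long f(int n) {
            if (n < 2) {
                return n;
            }
            return f(n - 1) + f(n - 2);
        }

        public static void main(String[] args) {
            for (int i = 0; i <= 10; i++) {
                System.out.print(f(i) + " ");   // prints: 0 1 1 2 3 5 8 13 21 34 55
            }
        }
    }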


Merge Sort is hard; so... 5

Let’s start with trying to analyze this code:

LinkedList Reversal

    reverse(L) {
        if (L == null) {
            return null;
        } else {
            Node front = L;
            Node rest = L.next;
            L.next = null;
            Node restReversed = reverse(rest);
            append(front, restReversed);
            return restReversed;
        }
    }

Notice that append is the same function from the beginning of lecture that had runtime O(n).

So, what is the time complexity of reverse?

We split the work into two pieces:
Non-Recursive Work
Recursive Work


Non-Recursive Work 6

LinkedList Reversal

    reverse(L) {
        if (L == null) {                       // O(1)
            return null;
        } else {
            Node front = L;                    // O(1)
            Node rest = L.next;                // O(1)
            L.next = null;                     // O(1)
            Node restReversed = reverse(rest);
            append(front, restReversed);       // O(n)
            return restReversed;
        }
    }

Non-Recursive Work: O(n), which means we can write it as c0 + c1·n for some constants c0 and c1.


Non-Recursive Work 7


Non-Recursive Work: O(n), which means we can write it as c0 + c1·n for some constants c0 and c1.
Recursive Work: The work it takes to do reverse on a list one smaller.

Putting these together almost gives us the recurrence:

T(n) = c0 + c1·n + T(n−1)

We're missing the base case!


reverse Recurrence 8


T(n) = d0, if n = 0
T(n) = c0 + c1·n + T(n−1), otherwise

Now, we need to solve the recurrence.


Solving the reverse Recurrence 9

T(n) = d0, if n = 0
T(n) = c0 + c1·n + T(n−1), otherwise

T(n) = (c0 + c1·n) + T(n−1)
     = (c0 + c1·n) + (c0 + c1·(n−1)) + T(n−2)
     = (c0 + c1·n) + (c0 + c1·(n−1)) + (c0 + c1·(n−2)) + ... + (c0 + c1·(1)) + d0
     = ∑_{i=0}^{n−1} (c0 + c1·(n−i)) + d0
     = ∑_{i=0}^{n−1} c0 + ∑_{i=0}^{n−1} c1·(n−i) + d0
     = n·c0 + c1·∑_{i=1}^{n} i + d0
     = n·c0 + c1·(n(n+1)/2) + d0
     = O(n^2)
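As a quick sanity check (not from the slides), we can evaluate the recurrence directly. The constants c0 = c1 = d0 = 1 below are arbitrary assumptions; the point is that T(n)/n^2 settles near a constant, which is exactly what Θ(n^2) predicts:

    class ReverseRecurrence {
        static final double C0 = 1, C1 = 1, D0 = 1;   // assumed constants

        // T(n) = d0 if n == 0, else c0 + c1*n + T(n-1), computed iteratively
        static double T(int n) {
            double total = D0;
            for (int k = 1; k <= n; k++) {
                total += C0 + C1 * k;
            }
            return total;
        }

        public static void main(String[] args) {
            for (int n = 1000; n <= 16000; n *= 2) {
                System.out.printf("n=%6d   T(n)/n^2 = %.4f%n", n, T(n) / ((double) n * n));
            }
            // The ratio approaches c1/2 = 0.5, consistent with T(n) = Theta(n^2).
        }
    }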


Solving Linear Recurrences 10

A recurrence where we solve some constant piece of the problem (e.g. "-1", "-2", etc.) is called a Linear Recurrence.

We solve these like we did above by Unrolling the Recurrence.

This is a fancy way of saying "plug the definition into itself until a pattern emerges".

Now, back to mergesort.


Analyzing Merge Sort 11

Merge Sort

    sort(L) {
        if (L.size() < 2) {
            return L;
        } else {
            int mid = L.size() / 2;
            return merge(
                sort(L.subList(0, mid)),
                sort(L.subList(mid, L.size()))
            );
        }
    }

First, we need to find the recurrence:

T(n) = d0, if n = 0
T(n) = d1, if n = 1
T(n) = c0 + c1·n + 2T(n/2), otherwise

This recurrence isn't linear! This is a "divide and conquer" recurrence.


Analyzing Merge Sort 12

T(n) = d0, if n = 0
T(n) = d1, if n = 1
T(n) = c0 + c1·n + 2T(n/2), otherwise

This time, there are multiple possible approaches:

Unrolling the Recurrence

T(n) = (c0 + c1·n) + 2T(n/2)
     = (c0 + c1·n) + 2(c0 + c1·(n/2)) + 4T(n/4)
     = (c0 + c1·n) + 2(c0 + c1·(n/2)) + 4(c0 + c1·(n/4)) + 8T(n/8)
     = c0 + 2c0 + 4c0 + ... + argh + ...

This works, but I’d rarely recommend it.

Insight: We’re branching in this recurrence. So, represent it as a tree!


Merge Sort: Solving the Recurrence 17

T(n) = d0, if n = 0
T(n) = d1, if n = 1
T(n) = c0 + c1·n + 2T(n/2), otherwise

[Figure: the recursion tree for this recurrence. The root does n work, its two children do n/2 each, the four grandchildren do n/4 each, and so on; the leaves are base cases costing d1 each. The tree has height lg(n), and the nodes on each level do a total of about n work.]

Since the recursion tree has height lg(n) and each row does n work, it follows that T(n) ∈ O(n lg(n)).
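To make the bound concrete, here is a hypothetical runnable Java version of the sort/merge pseudocode (using java.util.List in place of the slides' unspecified list type) that counts comparisons on a shuffled input; the ratio of the count to n·lg(n) settles at a constant just below 1, consistent with the analysis:

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.Random;

    class MergeSortDemo {
        static long comparisons = 0;

        // Direct transcription of the slides' sort: split in half, sort each half, merge.
        static List<Integer> sort(List<Integer> L) {
            if (L.size() < 2) {
                return L;
            }
            int mid = L.size() / 2;
            return merge(sort(new ArrayList<>(L.subList(0, mid))),
                         sort(new ArrayList<>(L.subList(mid, L.size()))));
        }

        // Two-pointer merge; one comparison per iteration of the first loop.
        static List<Integer> merge(List<Integer> L1, List<Integer> L2) {
            List<Integer> out = new ArrayList<>();
            int p1 = 0, p2 = 0;
            while (p1 < L1.size() && p2 < L2.size()) {
                comparisons++;
                if (L1.get(p1) <= L2.get(p2)) out.add(L1.get(p1++));
                else                          out.add(L2.get(p2++));
            }
            while (p1 < L1.size()) out.add(L1.get(p1++));
            while (p2 < L2.size()) out.add(L2.get(p2++));
            return out;
        }

        public static void main(String[] args) {
            int n = 1 << 16;
            List<Integer> input = new ArrayList<>();
            for (int i = 0; i < n; i++) input.add(i);
            Collections.shuffle(input, new Random(42));
            sort(input);
            double nlgn = n * (Math.log(n) / Math.log(2));
            System.out.printf("comparisons = %d, n lg n = %.0f, ratio = %.2f%n",
                              comparisons, nlgn, comparisons / nlgn);
        }
    }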


sum Examples #1 18

Find A Big-Oh Bound For The Worst Case Runtime

    sum(n) {
        if (n < 2) {
            return n;
        }
        return 2 + sum(n - 2);
    }

T(n) = d0, if n = 0
T(n) = d0, if n = 1
T(n) = c0 + T(n−2), otherwise

T(n) = c0 + c0 + ... + c0 + d0
     = c0·(n/2) + d0
     = O(n)


binarysearch Example #2 19

Find A Big-Oh Bound For The Worst Case Runtime

    binarysearch(L, value) {
        if (L.size() == 0) {
            return false;
        } else if (L.size() == 1) {
            return L[0] == value;
        } else {
            int mid = L.size() / 2;
            if (L[mid] == value) {
                return true;
            } else if (L[mid] < value) {
                return binarysearch(L.subList(mid + 1, L.size()), value);
            } else {
                return binarysearch(L.subList(0, mid), value);
            }
        }
    }

T(n) = d0, if n = 0
T(n) = d1, if n = 1
T(n) = c0 + T(n/2), otherwise


binarysearch Example #2 20

T(n) = d0, if n = 0
T(n) = d1, if n = 1
T(n) = c0 + T(n/2), otherwise

[Figure: here the recursion "tree" is a single chain of about lg(n) levels; each level contributes c0 non-recursive work, and the base case at the bottom contributes d1.]

So, T(n) = c0·(lg(n) − 1) + d1 = O(lg n).


Some Common Series 21

Gauss' Sum: ∑_{i=0}^{n} i = n(n+1)/2

Infinite Geometric Series: ∑_{i=0}^{∞} x^i = 1/(1−x), when |x| < 1.

Finite Geometric Series: ∑_{i=0}^{n} x^i = (1 − x^(n+1))/(1 − x), when x ≠ 1.


Master Theorem 22

Consider a recurrence of the form:

T(n) = d, if n = 1
T(n) = a·T(n/b) + n^c, otherwise

Then,

If log_b(a) < c, then T(n) = Θ(n^c).
If log_b(a) = c, then T(n) = Θ(n^c lg(n)).
If log_b(a) > c, then T(n) = Θ(n^(log_b(a))).

Sanity Check: For Merge Sort, we have a = 2, b = 2, c = 1. Then, log_2(2) = 1 = c. So, T(n) = Θ(n lg n).
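A second sanity check (not on the original slide): the theorem also covers the binarysearch recurrence from a few slides back, T(n) = c0 + T(n/2). There a = 1, b = 2, and c = 0, since the non-recursive work is constant (n^0). Because log_2(1) = 0 = c, the middle case applies, and T(n) = Θ(n^0 lg(n)) = Θ(lg n), matching what we got by unrolling.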


Proving the First Case of Master Theorem 23

T(n) = d, if n = 1
T(n) = a·T(n/b) + n^c, otherwise

We assume that log_b(a) < c. Then, unrolling the recurrence, we get:

T(n) = n^c + a·T(n/b)
     = n^c + a·((n/b)^c + a·T(n/b^2))
     = n^c + a·(n/b)^c + a^2·(n/b^2)^c + ... + a^(log_b(n))·(n/b^(log_b(n)))^c
     = ∑_{i=0}^{log_b(n)} a^i·(n^c / b^(i·c))
     = n^c · ∑_{i=0}^{log_b(n)} (a/b^c)^i
     = n^c · ((a/b^c)^(log_b(n)+1) − 1) / ((a/b^c) − 1)

Since log_b(a) < c, the ratio a/b^c is less than 1, so the geometric sum is bounded above by a constant; therefore T(n) = Θ(n^c).
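As a concrete instance of this case (not on the slide), take a = 1, b = 2, c = 1, i.e., T(n) = T(n/2) + n. Then log_2(1) = 0 < 1 = c, and the sum above becomes n·∑_{i=0}^{log_2(n)} (1/2)^i ≤ 2n, so T(n) = Θ(n) = Θ(n^c), exactly as the theorem predicts.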


Today’s Takeaways! 24

Know how to make a recurrence from a recursive program

Know the different types of recurrences

Be able to find a closed form for each type of recurrence

Know the common summations

Understand why Master Theorem can be helpful