Sorting
Bubble sort
• Compare each element (except the last one) with its neighbor to the right
  – If they are out of order, swap them
  – This puts the largest element at the very end
  – The last element is now in its correct and final place
• Compare each element (except the last two) with its neighbor to the right
  – If they are out of order, swap them
  – This puts the second largest element next to last
  – The last two elements are now in their correct and final places
• Compare each element (except the last three) with its neighbor to the right
  – Continue as above until there are no unsorted elements on the left
Example of bubble sort
Pass 1:  7 2 8 5 4 → 2 7 8 5 4 → 2 7 8 5 4 → 2 7 5 8 4 → 2 7 5 4 8
Pass 2:  2 7 5 4 8 → 2 5 7 4 8 → 2 5 4 7 8
Pass 3:  2 5 4 7 8 → 2 4 5 7 8
Pass 4:  2 4 5 7 8  (done)
Code for bubble sort

public static void bubbleSort(int[] a) {
    int outer, inner;
    for (outer = a.length - 1; outer > 0; outer--) {  // counting down
        for (inner = 0; inner < outer; inner++) {     // bubbling up
            if (a[inner] > a[inner + 1]) {            // if out of order...
                int temp = a[inner];                  // ...then swap
                a[inner] = a[inner + 1];
                a[inner + 1] = temp;
            }
        }
    }
}
Analysis of bubble sort

for (outer = a.length - 1; outer > 0; outer--) {
    for (inner = 0; inner < outer; inner++) {
        if (a[inner] > a[inner + 1]) {
            // code for swap omitted
        }
    }
}
• Let n = a.length = size of the array
• The outer loop is executed n-1 times (call it n, that’s close enough)
• Each time the outer loop is executed, the inner loop is executed
– Inner loop executes n-1 times at first, linearly dropping to just once
– On average, inner loop executes about n/2 times for each execution of the outer loop
– In the inner loop, the comparison is always done (constant time), the swap might be done (also constant time)
• Result is n * n/2 + k, that is, O(n²/2 + k) = O(n²)
Loop invariants
• You run a loop in order to change things
• Oddly enough, what is usually most important in understanding a loop is finding an invariant: that is, a condition that doesn’t change
• In bubble sort, we put the largest elements at the end, and once we put them there, we don’t move them again
  – The variable outer starts at the last index in the array and decreases to 0
  – Our invariant is: every element to the right of outer is in the correct place
  – That is, for all j > outer, if i < j, then a[i] <= a[j]
  – When this is combined with outer == 0, we know that all elements of the array are in the correct place
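The invariant can be checked mechanically. Here is a sketch in Java that instruments the bubble sort above to verify, after each pass, that every element to the right of outer is in its final place (the class and helper names are ours, not from the slides):

```java
// Bubble sort instrumented to check the loop invariant after every pass.
// Class and helper names (BubbleInvariant, checkInvariant) are ours.
public class BubbleInvariant {

    public static void bubbleSort(int[] a) {
        for (int outer = a.length - 1; outer > 0; outer--) {
            for (int inner = 0; inner < outer; inner++) {
                if (a[inner] > a[inner + 1]) {        // out of order: swap
                    int temp = a[inner];
                    a[inner] = a[inner + 1];
                    a[inner + 1] = temp;
                }
            }
            checkInvariant(a, outer);  // a[outer..] is now in its final place
        }
    }

    // Checks that a[from..] is sorted and >= everything before it.
    static void checkInvariant(int[] a, int from) {
        for (int i = 0; i < from; i++)
            if (a[i] > a[from]) throw new AssertionError("invariant violated");
        for (int j = from; j < a.length - 1; j++)
            if (a[j] > a[j + 1]) throw new AssertionError("invariant violated");
    }

    public static void main(String[] args) {
        int[] a = {7, 2, 8, 5, 4};
        bubbleSort(a);
        System.out.println(java.util.Arrays.toString(a));  // prints [2, 4, 5, 7, 8]
    }
}
```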
Selection sort
• Given an array of length n,
  – Search elements 0 through n-1 and select the smallest
    • Swap it with the element in location 0
  – Search elements 1 through n-1 and select the smallest
    • Swap it with the element in location 1
  – Search elements 2 through n-1 and select the smallest
    • Swap it with the element in location 2
  – Search elements 3 through n-1 and select the smallest
    • Swap it with the element in location 3
  – Continue in this fashion until there’s nothing left to search
Example and analysis of selection sort
• The selection sort might swap an array element with itself -- this is harmless, and not worth checking for
• Example:
    7 2 8 5 4
    2 7 8 5 4
    2 4 8 5 7
    2 4 5 8 7
    2 4 5 7 8
• Analysis:
  – The outer loop executes n-1 times
  – The inner loop executes about n/2 times on average (from n to 2 times)
  – Work done in the inner loop is constant (a comparison, and possibly updating min); each outer pass ends with one constant-time swap
  – Time required is roughly (n-1)*(n/2)
  – You should recognize this as O(n²)
Code for selection sort

public static void selectionSort(int[] a) {
    int outer, inner, min;
    for (outer = 0; outer < a.length - 1; outer++) {  // outer counts up
        min = outer;
        for (inner = outer + 1; inner < a.length; inner++) {
            if (a[inner] < a[min]) {
                min = inner;
            }
        }
        // a[min] is least among a[outer]..a[a.length - 1]
        int temp = a[outer];
        a[outer] = a[min];
        a[min] = temp;
        // Invariant: for all i <= outer, if i < j then a[i] <= a[j]
    }
}
Insertion sort
• The sorted portion grows from the left; the next element to be inserted is held in temp, and all sorted elements greater than temp are shifted one position right to make room:

  sorted: 3 4 7 12 14 14 20 21 33 38   next to be inserted: 10   (then: 55 9 23 28 16)
  temp = 10; shift 12 14 14 20 21 33 38 right; 3 4 7 (less than 10) stay put; insert 10

• while some elements unsorted:
  – Using linear search, find the location in the sorted portion where the 1st element of the unsorted portion should be inserted
  – Move all the elements after the insertion location up one position to make space for the new element
Insertion Sort Analysis
void insertionSort(vector<int> &a) {
    int j;
    for (int p = 1; p < (int)a.size(); ++p) {
        int tmp = a[p];
        for (j = p; j > 0 && tmp < a[j-1]; j--)  /* compare */
            a[j] = a[j-1];                       /* move */
        a[j] = tmp;                              /* insert */
    }
}
Analysis of insertion sort
• We run once through the outer loop, inserting each of n elements; this is a factor of n
• On average, there are n/2 elements already sorted
  – The inner loop looks at (and moves) half of these
  – This gives a second factor of n/4
• Hence, the time required for an insertion sort of an array of n elements is proportional to n²/4
• Discarding constants, we find that insertion sort is O(n²)
Summary
• Bubble sort, selection sort, and insertion sort are all O(n²)
• We can do much better than this with somewhat more complicated sorting algorithms
• Within O(n²),
  – Bubble sort is very slow, and should probably never be used for anything
  – Selection sort is intermediate in speed
  – Insertion sort is usually the fastest of the three -- in fact, for small arrays (say, 10 or 15 elements), insertion sort is faster than more complicated sorting algorithms
• Selection sort and insertion sort are “good enough” for small arrays
Comp 122
Divide and Conquer
• Recursive in structure
  – Divide the problem into sub-problems that are similar to the original but smaller in size
  – Conquer the sub-problems by solving them recursively. If they are small enough, just solve them in a straightforward manner.
  – Combine the solutions to create a solution to the original problem
Quicksort I
• To sort a[left...right]:
  1. if left < right:
     1.1. Partition a[left...right] such that:
          all of a[left...p-1] are less than a[p], and
          all of a[p+1...right] are >= a[p]
     1.2. Quicksort a[left...p-1]
     1.3. Quicksort a[p+1...right]
  2. Terminate
Partitioning (Quicksort II)
• A key step in the Quicksort algorithm is partitioning the array
  – We choose some (any) number p in the array to use as a pivot
  – We partition the array into three parts:

    [ numbers less than p | p | numbers greater than or equal to p ]
Partitioning II
• Choose an array value (say, the first) to use as the pivot
• Starting from the left end, find the first element that is greater than or equal to the pivot
• Searching backward from the right end, find the first element that is less than the pivot
• Interchange (swap) these two elements
• Repeat, searching from where we left off, until done
Partitioning
• To partition a[left...right]:
  1. Set p = a[left], l = left + 1, r = right
  2. while l < r, do
     2.1. while l < right && a[l] < p  { l = l + 1 }
     2.2. while r > left && a[r] >= p  { r = r - 1 }
     2.3. if l < r { swap a[l] and a[r] }
  3. a[left] = a[r]; a[r] = p
  4. Terminate
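The partition pseudocode above, combined with the Quicksort outline, can be sketched in Java (first element as pivot, as in the slides; the class name is ours):

```java
// Quicksort using the slides' partition scheme: pivot = a[left],
// l scans right for an element >= pivot, r scans left for one < pivot.
public class Quicksort {

    public static void quicksort(int[] a, int left, int right) {
        if (left < right) {
            int p = partition(a, left, right);
            quicksort(a, left, p - 1);   // sort a[left..p-1]  (all < pivot)
            quicksort(a, p + 1, right);  // sort a[p+1..right] (all >= pivot)
        }
    }

    static int partition(int[] a, int left, int right) {
        int p = a[left];                 // pivot
        int l = left + 1, r = right;
        while (l < r) {
            while (l < right && a[l] < p) l++;   // first element >= pivot
            while (r > left && a[r] >= p) r--;   // last element < pivot
            if (l < r) { int t = a[l]; a[l] = a[r]; a[r] = t; }
        }
        a[left] = a[r];                  // move pivot into its final place
        a[r] = p;
        return r;
    }

    public static void main(String[] args) {
        int[] a = {4, 3, 6, 9, 2, 4, 3, 1, 2, 1, 8, 9, 3, 5, 6};
        quicksort(a, 0, a.length - 1);
        System.out.println(java.util.Arrays.toString(a));
    }
}
```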
Example of partitioning
• choose pivot:    4 3 6 9 2 4 3 1 2 1 8 9 3 5 6
• search:          4 3 6 9 2 4 3 1 2 1 8 9 3 5 6
• swap:            4 3 3 9 2 4 3 1 2 1 8 9 6 5 6
• search:          4 3 3 9 2 4 3 1 2 1 8 9 6 5 6
• swap:            4 3 3 1 2 4 3 1 2 9 8 9 6 5 6
• search:          4 3 3 1 2 4 3 1 2 9 8 9 6 5 6
• swap:            4 3 3 1 2 2 3 1 4 9 8 9 6 5 6
• search:          4 3 3 1 2 2 3 1 4 9 8 9 6 5 6  (l and r have crossed)
• swap with pivot: 1 3 3 1 2 2 3 4 4 9 8 9 6 5 6
Best case Complexity
• We cut the array size in half each time
• So the depth of the recursion is log₂n
• At each level of the recursion, all the partitions at that level do work that is linear in n
• O(log₂n) * O(n) = O(n log₂n)
Merge Sort
Sorting Problem: Sort a sequence of n elements into non-decreasing order.
• Divide: Divide the n-element sequence to be sorted into two subsequences of n/2 elements each
• Conquer: Sort the two subsequences recursively using merge sort.
• Combine: Merge the two sorted subsequences to produce the sorted answer.
Merge Sort: Idea
• Divide A into two halves (FirstPart and SecondPart)
• Recursively sort each half
• Merge the two sorted halves; now A is sorted!
Merge Sort: Algorithm

Merge-Sort(A, n)
    if n = 1 return
    else
        n1 ← ⌈n/2⌉; n2 ← n - n1        // n1 ← n2 ← n/2 would lose an element when n is odd
        create arrays L[n1], R[n2]              // Space: n
        for i ← 0 to n1-1 do L[i] ← A[i]        // Time: n
        for j ← 0 to n2-1 do R[j] ← A[n1+j]
        Merge-Sort(L, n1)                       // Recursive call
        Merge-Sort(R, n2)                       // Recursive call
        Merge(A, L, n1, R, n2)
Merge-Sort: Merge Example
• Keep track of the smallest remaining element in each sorted half (indices i and j)
• Insert the smaller of the two into the auxiliary array (at index k)
• Repeat until done

  L: 1 2 6 8     R: 3 4 5 7

  k=0: L[0]=1 ≤ R[0]=3    →  A: 1               (i=1, j=0)
  k=1: L[1]=2 ≤ R[0]=3    →  A: 1 2             (i=2, j=0)
  k=2: L[2]=6 > R[0]=3    →  A: 1 2 3           (i=2, j=1)
  k=3: L[2]=6 > R[1]=4    →  A: 1 2 3 4         (i=2, j=2)
  k=4: L[2]=6 > R[2]=5    →  A: 1 2 3 4 5       (i=2, j=3)
  k=5: L[2]=6 ≤ R[3]=7    →  A: 1 2 3 4 5 6     (i=3, j=3)
  k=6: L[3]=8 > R[3]=7    →  A: 1 2 3 4 5 6 7   (i=4, j=4)
  k=7: R exhausted (j=n2) →  A: 1 2 3 4 5 6 7 8
Merge(A, L, n1, R, n2)
    i ← j ← 0
    for k ← 0 to n1+n2-1
        if i < n1 and (j = n2 or L[i] ≤ R[j])
            A[k] ← L[i]
            i ← i + 1
        else
            A[k] ← R[j]
            j ← j + 1
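Merge-Sort and Merge, as given above, can be sketched in Java (class name ours; one fix: for odd n the split must satisfy n1 + n2 = n, so we take n2 = n - n1):

```java
import java.util.Arrays;

// Merge sort following the Merge-Sort / Merge pseudocode above.
public class MergeSort {

    public static void mergeSort(int[] a) {
        int n = a.length;
        if (n <= 1) return;                          // base case
        int n1 = n / 2, n2 = n - n1;                 // n2 = n - n1 handles odd n
        int[] L = Arrays.copyOfRange(a, 0, n1);      // copy first half
        int[] R = Arrays.copyOfRange(a, n1, n);      // copy second half
        mergeSort(L);                                // recursive calls
        mergeSort(R);
        merge(a, L, R);                              // combine
    }

    // Merge the sorted arrays L and R back into a.
    static void merge(int[] a, int[] L, int[] R) {
        int i = 0, j = 0;
        for (int k = 0; k < L.length + R.length; k++) {
            if (i < L.length && (j == R.length || L[i] <= R[j])) {
                a[k] = L[i++];
            } else {
                a[k] = R[j++];
            }
        }
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 6, 8, 3, 4, 5, 7};
        mergeSort(a);
        System.out.println(Arrays.toString(a));  // prints [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```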
Why study Heapsort?
• Heapsort is always O(n log n)
  – Quicksort is usually O(n log n) but in the worst case slows to O(n²)
  – Quicksort is generally faster, but Heapsort is better in time-critical applications
• Heapsort is a really cool algorithm!
What is a “heap”?
• Definitions of heap:
1. A large area of memory from which the programmer can allocate blocks as needed, and deallocate them (or allow them to be garbage collected) when no longer needed
2. A balanced, left-justified binary tree in which no node has a value greater than the value in its parent
• These two definitions have little in common
• Heapsort uses the second definition
Balanced binary trees
• Recall:
– The depth of a node is its distance from the root
– The depth of a tree is the depth of the deepest node
• A binary tree of depth n is balanced if all the nodes at depths 0 through n-2 have two children
  (Figure: three example trees -- two balanced, one not balanced -- with the depth levels n-2, n-1, n marked)
Left-justified binary trees
• A balanced binary tree is left-justified if:
  – all the leaves are at the same depth, or
  – all the leaves at depth n+1 are to the left of all the nodes at depth n

  (Figure: examples of left-justified trees)
Plan of attack
• First, we will learn how to turn a binary tree into a heap
• Next, we will learn how to turn a binary tree back into a heap after it has been changed in a certain way
• Finally we will see how to use these ideas to sort an array
The heap property
• A node has the heap property if the value in the node is as large as or larger than the values in its children
• All leaf nodes automatically have the heap property
• A binary tree is a heap if all nodes in it have the heap property

     12          12          12
    /  \        /  \        /  \
   8    3      8    12     8    14
  (has the    (has the    (does not have
   heap        heap        the heap
   property)   property)   property)
siftUp
• Given a node that does not have the heap property, you can give it the heap property by exchanging its value with the value of the larger child
• This is sometimes called sifting up
• Notice that the child may have lost the heap property

     12                  14
    /  \       →        /  \
   8    14             8    12
  (root lacks the     (root now has the
   heap property)      heap property)
Constructing a heap
• A tree consisting of a single node is automatically a heap
• We construct a heap by adding nodes one at a time:
• Each time we add a node, we may destroy the heap property of its parent node
• To fix this, we sift up
• But each time we sift up, the value of the topmost node in the sift may increase, and this may destroy the heap property of its parent node
• We repeat the sifting up process, moving up in the tree, until either
– We reach nodes whose values don’t need to be swapped (because the parent is still larger than both children), or
– We reach the root
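The construction just described can be sketched in Java, using the standard array mapping for a left-justified binary tree (children of index i at 2*i+1 and 2*i+2, so the parent of index c is (c-1)/2); the class and method names are ours:

```java
// Build a heap by adding values one at a time and sifting the new
// value up until its parent is at least as large (names are ours).
public class HeapBuilder {

    public static int[] buildHeap(int[] values) {
        int[] heap = new int[values.length];
        for (int n = 0; n < values.length; n++) {
            heap[n] = values[n];              // add node at the next leaf position
            int child = n;
            while (child > 0) {               // sift up toward the root
                int parent = (child - 1) / 2;
                if (heap[parent] >= heap[child]) break;  // heap property holds; stop
                int t = heap[parent];                    // else swap with parent
                heap[parent] = heap[child];
                heap[child] = t;
                child = parent;
            }
        }
        return heap;
    }

    public static void main(String[] args) {
        int[] h = buildHeap(new int[]{7, 2, 8, 5, 4});
        System.out.println(java.util.Arrays.toString(h));  // prints [8, 5, 7, 2, 4]
    }
}
```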
Other children are not affected
• The node containing 8 is not affected because its parent gets larger, not smaller
• The node containing 5 is not affected because its parent gets larger, not smaller
• The node containing 8 is still not affected because, although its parent got smaller, its parent is still greater than it was originally
     12            12            14
    /  \          /  \          /  \
  10    5   →   14    5   →   12    5
  / \           / \           / \
 8   14        8   10        8   10
A sample heap
• Here’s a sample binary tree after it has been heapified
• Notice that heapified does not mean sorted
• Heapifying does not change the shape of the binary tree; this binary tree is balanced and left-justified because it started out that way

              25
          /        \
        22          17
       /  \        /  \
     19    22    14    15
    / \    / \   / \
  18  14  21  3 9   11
Removing the root
• Notice that the largest number is now in the root
• Suppose we discard the root:

              __
          /        \
        22          17
       /  \        /  \
     19    22    14    15
    / \    / \   / \
  18  14  21  3 9   11

• How can we fix the binary tree so it is once again balanced and left-justified?
The reHeap method I
• Our tree is balanced and left-justified, but no longer a heap (the last leaf, 11, has been moved to the root)
• However, only the root lacks the heap property
• We can siftUp() the root
• After doing this, one and only one of its children may have lost the heap property

              11
          /        \
        22          17
       /  \        /  \
     19    22    14    15
    / \    / \   /
  18  14  21  3 9
The reHeap method II
• Now the left child of the root (still the number 11) lacks the heap property
• We can siftUp() this node
• After doing this, one and only one of its children may have lost the heap property

              22
          /        \
        11          17
       /  \        /  \
     19    22    14    15
    / \    / \   /
  18  14  21  3 9
The reHeap method III
• Now the right child of the left child of the root (still the number 11) lacks the heap property:
• We can siftUp() this node
• After doing this, one and only one of its children may have lost the heap property -- but it doesn’t, because it’s a leaf

              22
          /        \
        22          17
       /  \        /  \
     19    11    14    15
    / \    / \   /
  18  14  21  3 9
The reHeap method IV
• Our tree is once again a heap, because every node in it has the heap property
• Once again, the largest (or a largest) value is in the root
• We can repeat this process until the tree becomes empty
• This produces a sequence of values in order largest to smallest

              22
          /        \
        22          17
       /  \        /  \
     19    21    14    15
    / \    / \   /
  18  14  11  3 9
Sorting
• What do heaps have to do with sorting an array?
• Here’s the neat part:
  – Because the binary tree is balanced and left-justified, it can be represented as an array
  – All our operations on binary trees can be represented as operations on arrays
  – To sort:
        heapify the array;
        while the array isn’t empty {
            remove and replace the root;
            reheap the new root node;
        }
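Putting it together, here is a heapsort sketch in Java. One deviation to flag: this sketch builds the initial heap bottom-up, by sifting down every non-leaf node, rather than by the one-node-at-a-time insertion described earlier; siftDown here plays the role the slides call "reheap":

```java
// Heapsort: heapify the array, then repeatedly swap the root with the
// last unsorted element and reheap (sift down) the new root.
public class Heapsort {

    public static void heapsort(int[] a) {
        for (int i = a.length / 2 - 1; i >= 0; i--)  // heapify, bottom up
            siftDown(a, i, a.length);
        for (int end = a.length - 1; end > 0; end--) {
            int t = a[0]; a[0] = a[end]; a[end] = t; // root <-> last unsorted
            siftDown(a, 0, end);                     // reheap, ignoring sorted tail
        }
    }

    // Restore the heap property at index i within the first n elements.
    static void siftDown(int[] a, int i, int n) {
        while (2 * i + 1 < n) {
            int child = 2 * i + 1;                             // left child
            if (child + 1 < n && a[child + 1] > a[child])
                child++;                                       // pick the larger child
            if (a[i] >= a[child]) return;                      // heap property holds
            int t = a[i]; a[i] = a[child]; a[child] = t;
            i = child;
        }
    }

    public static void main(String[] args) {
        int[] a = {25, 22, 17, 19, 22, 14, 15, 18, 14, 21, 3, 9, 11};
        heapsort(a);
        System.out.println(java.util.Arrays.toString(a));
    }
}
```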
Mapping into an array
• Notice:
  – The left child of index i is at index 2*i+1
  – The right child of index i is at index 2*i+2
  – Example: the children of node 3 (19) are 7 (18) and 8 (14)

              25
          /        \
        22          17
       /  \        /  \
     19    22    14    15
    / \    / \   / \
  18  14  21  3 9   11

  index: 0  1  2  3  4  5  6  7  8  9  10 11 12
  value: 25 22 17 19 22 14 15 18 14 21 3  9  11
Removing and replacing the root
• The “root” is the first element in the array
• The “rightmost node at the deepest level” is the last element
• Swap them...

  index:  0  1  2  3  4  5  6  7  8  9  10 11 12
  before: 25 22 17 19 22 14 15 18 14 21 3  9  11
  after:  11 22 17 19 22 14 15 18 14 21 3  9  25

• ...And pretend that the last element in the array no longer exists -- that is, the “last index” is now 11 (which holds 9)
Reheap and repeat
• Reheap the root node (index 0, containing 11)...

  22 22 17 19 21 14 15 18 14 11 3 9 | 25

• ...And again, remove and replace the root node: swap the root (22) with the last unsorted element (9, at index 11)

  9 22 17 19 21 14 15 18 14 11 3 | 22 25

• Remember, though, that the “last” array index is changed
• Repeat until the last becomes first, and the array is sorted!
Linear time sorting
• Can we do better (a linear-time algorithm) if the input has special structure (e.g., it is uniformly distributed, or every number can be represented by d digits)? Yes.
• Counting sort, radix sort
Counting Sort
• Assume N integers to be sorted, each in the range 1 to M.
• Define an array B[1..M], initialize all to 0 -- O(M)
• Scan through the input list A[i], insert A[i] into B[A[i]] -- O(N)
• Scan B once, read out the nonzero integers -- O(M)
• Total time: O(M + N)
  – if M is O(N), then total time is O(N)
  – Can be bad if the range is very big, e.g. M = O(N²)
• Example: N = 7, M = 9, want to sort 8 1 9 5 2 6 3
  – After the scan, B is nonzero at positions 1, 2, 3, 5, 6, 8, 9
  – Output: 1 2 3 5 6 8 9
Counting sort
• What if we have duplicates?
• B is an array of pointers.
• Each position in the array has 2 pointers: head and tail. Tail points to the end of a linked list, and head points to the beginning.
• A[j] is inserted at the end of the list B[A[j]]
• Again, array B is sequentially traversed and each nonempty list is printed out.
• Time: O(M + N)
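A sketch of counting sort in Java. For bare ints, a count per value does the same job as the head/tail linked lists described above (an implementation carrying satellite data would need the lists to stay stable); the names are ours:

```java
// Counting sort for N integers in the range 1..M, duplicates allowed.
public class CountingSort {

    public static int[] countingSort(int[] a, int M) {
        int[] count = new int[M + 1];            // B[1..M], all zero: O(M)
        for (int x : a) count[x]++;              // one scan of the input: O(N)
        int[] out = new int[a.length];
        int k = 0;
        for (int v = 1; v <= M; v++)             // one scan of B: O(M)
            for (int c = 0; c < count[v]; c++)
                out[k++] = v;                    // emit v once per occurrence
        return out;
    }

    public static void main(String[] args) {
        int[] out = countingSort(new int[]{8, 1, 9, 5, 2, 6, 3}, 9);
        System.out.println(java.util.Arrays.toString(out));  // prints [1, 2, 3, 5, 6, 8, 9]
    }
}
```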
Radix Sort
• Extra information: every integer can be represented by at most k digits
  – d1d2…dk where di are digits in base r
  – d1: most significant digit
  – dk: least significant digit
Radix Sort
• Algorithm
  – sort by the least significant digit first (counting sort)
    => numbers with the same digit go to the same bin
  – reorder all the numbers: the numbers in bin 0 precede the numbers in bin 1, which precede the numbers in bin 2, and so on
  – sort by the next least significant digit
  – continue this process until the numbers have been sorted on all k digits
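The algorithm above can be sketched in Java for base r = 10, using lists as the bins; distributing into bins and reading them back in order is stable, which is what makes the pass-by-pass approach correct (names are ours):

```java
import java.util.ArrayList;
import java.util.List;

// LSD radix sort in base 10: k stable passes, least significant digit first.
public class RadixSort {

    public static void radixSort(int[] a, int k) {   // k = max number of digits
        int divisor = 1;                             // selects the current digit
        for (int pass = 0; pass < k; pass++) {
            List<List<Integer>> bins = new ArrayList<>();
            for (int d = 0; d < 10; d++) bins.add(new ArrayList<>());
            for (int x : a)                          // distribute by current digit
                bins.get((x / divisor) % 10).add(x);
            int i = 0;                               // reorder: bin 0, then bin 1, ...
            for (List<Integer> bin : bins)
                for (int x : bin) a[i++] = x;
            divisor *= 10;
        }
    }

    public static void main(String[] args) {
        int[] a = {170, 45, 75, 90, 802, 24, 2, 66};
        radixSort(a, 3);
        System.out.println(java.util.Arrays.toString(a));  // prints [2, 24, 45, 66, 75, 90, 170, 802]
    }
}
```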