Comparison of Optimization Algorithms By Jonathan Lutu


Page 1: Comparison of Optimization Algorithms

Comparison of Optimization Algorithms

By Jonathan Lutu

Page 2: Comparison of Optimization Algorithms

Bubble sort (Exchanging)

• Bubble sort is a simple sorting algorithm. The algorithm starts at the beginning of the data set. It compares the first two elements, and if the first is greater than the second, it swaps them. It continues doing this for each pair of adjacent elements to the end of the data set. It then starts again with the first two elements, repeating until no swaps have occurred on the last pass. This algorithm's average-case and worst-case performance is O(n²), so it is rarely used to sort large, unordered data sets.

Page 3: Comparison of Optimization Algorithms

algorithm

procedure bubbleSort( A : list of sortable items )
    repeat
        swapped = false
        for i = 1 to length(A) - 1 inclusive do:
            /* if this pair is out of order */
            if A[i-1] > A[i] then
                /* swap them and remember something changed */
                swap( A[i-1], A[i] )
                swapped = true
            end if
        end for
    until not swapped
end procedure
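
To make the idea concrete, here is a minimal runnable Java sketch of the same procedure; the class name and sample data are illustrative and not part of the original slides.

import java.util.Arrays;

public class BubbleSortDemo {
    // Repeatedly sweep the array, swapping adjacent out-of-order pairs,
    // until a full pass makes no swaps.
    static void bubbleSort(int[] a) {
        boolean swapped = true;
        while (swapped) {
            swapped = false;
            for (int i = 1; i < a.length; i++) {
                if (a[i - 1] > a[i]) {
                    int tmp = a[i - 1];
                    a[i - 1] = a[i];
                    a[i] = tmp;
                    swapped = true;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};                // sample input
        bubbleSort(data);
        System.out.println(Arrays.toString(data));   // prints [1, 2, 4, 5, 8]
    }
}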

Page 4: Comparison of Optimization Algorithms

Selection sort (Selection)

• Selection sort is an in-place comparison sort. It has O(n²) complexity, making it inefficient on large lists, and generally performs worse than the similar insertion sort. Selection sort is noted for its simplicity, and also has performance advantages over more complicated algorithms in certain situations.

• The algorithm finds the minimum value, swaps it with the value in the first position, and repeats these steps for the remainder of the list. It does no more than n swaps, and thus is useful where swapping is very expensive.

Page 5: Comparison of Optimization Algorithms

algorithm

For I = 0 to N-1 do:
    Smallsub = I
    For J = I + 1 to N-1 do:
        If A(J) < A(Smallsub)
            Smallsub = J
        End-If
    End-For
    Temp = A(I)
    A(I) = A(Smallsub)
    A(Smallsub) = Temp
End-For
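
The same algorithm as a runnable Java sketch; the class name and sample data are illustrative.

import java.util.Arrays;

public class SelectionSortDemo {
    // For each position i, find the smallest remaining element and swap it into place,
    // so the array is sorted with at most n-1 swaps.
    static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int smallest = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[smallest]) {
                    smallest = j;
                }
            }
            int tmp = a[i];
            a[i] = a[smallest];
            a[smallest] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] data = {29, 10, 14, 37, 13};           // sample input
        selectionSort(data);
        System.out.println(Arrays.toString(data));   // prints [10, 13, 14, 29, 37]
    }
}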

Page 6: Comparison of Optimization Algorithms

Insertion sort (Insertion)

• Insertion sort is a simple sorting algorithm that is relatively efficient for small lists and mostly sorted lists, and often is used as part of more sophisticated algorithms. It works by taking elements from the list one by one and inserting them in their correct position into a new sorted list. In arrays, the new list and the remaining elements can share the array's space, but insertion is expensive, requiring shifting all following elements over by one. Shell sort (see below) is a variant of insertion sort that is more efficient for larger lists.

Page 7: Comparison of Optimization Algorithms

algorithm

for j <- 2 to length[A]
    do key <- A[j]
       /* Insert A[j] into the sorted sequence A[1 .. j-1]. */
       i <- j - 1
       while i > 0 and A[i] > key
           do A[i + 1] <- A[i]
              i <- i - 1
       A[i + 1] <- key
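
A runnable Java sketch of the same loop using 0-based indexing; the class name and sample data are illustrative.

import java.util.Arrays;

public class InsertionSortDemo {
    // Grow a sorted prefix one element at a time, shifting larger elements right
    // to open a slot for the current key.
    static void insertionSort(int[] a) {
        for (int j = 1; j < a.length; j++) {
            int key = a[j];
            int i = j - 1;
            while (i >= 0 && a[i] > key) {
                a[i + 1] = a[i];
                i = i - 1;
            }
            a[i + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] data = {12, 11, 13, 5, 6};             // sample input
        insertionSort(data);
        System.out.println(Arrays.toString(data));   // prints [5, 6, 11, 12, 13]
    }
}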

Page 8: Comparison of Optimization Algorithms

Shell sort (Insertion)

• Shell sort was invented by Donald Shell in 1959. It improves upon bubble sort and insertion sort by moving out-of-order elements more than one position at a time. One implementation can be described as arranging the data sequence in a two-dimensional array and then sorting the columns of the array using insertion sort.

Page 9: Comparison of Optimization Algorithms

algorithm

void shellsort(int[] a, int n) {
    int i, j, k, h, v;
    // gap sequence, largest gap first
    int[] cols = {1391376, 463792, 198768, 86961, 33936, 13776,
                  4592, 1968, 861, 336, 112, 48, 21, 7, 3, 1};
    for (k = 0; k < 16; k++) {
        h = cols[k];
        // insertion sort on each subsequence of elements h apart
        for (i = h; i < n; i++) {
            v = a[i];
            j = i;
            while (j >= h && a[j - h] > v) {
                a[j] = a[j - h];
                j = j - h;
            }
            a[j] = v;
        }
    }
}

Page 10: Comparison of Optimization Algorithms

Merge sort (Merging)

• Merge sort takes advantage of the ease of merging already sorted lists into a new sorted list. It starts by comparing every two elements and swapping them if the first should come after the second. It then merges each of the resulting lists of two into lists of four, then merges those lists of four, and so on, until at last two lists are merged into the final sorted list.

Page 11: Comparison of Optimization Algorithms

algorithm

Input: array a[] indexed from 0 to n-1.

m = 1
while m < n do
    i = 0
    while i < n-m do
        merge subarrays a[i..i+m-1] and a[i+m .. min(i+2*m-1, n-1)] in-place
        i = i + 2 * m
    m = m * 2
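
A runnable Java sketch of the same bottom-up structure; it merges with a temporary buffer rather than strictly in-place, which is the usual practical choice, and the class name and sample data are illustrative.

import java.util.Arrays;

public class MergeSortDemo {
    // Bottom-up merge sort: merge sorted runs of width 1, then 2, then 4, ...
    static void mergeSort(int[] a) {
        int n = a.length;
        int[] buf = new int[n];                      // scratch space for merging
        for (int m = 1; m < n; m *= 2) {
            for (int i = 0; i < n - m; i += 2 * m) {
                merge(a, buf, i, i + m - 1, Math.min(i + 2 * m - 1, n - 1));
            }
        }
    }

    // Merge the sorted ranges a[lo..mid] and a[mid+1..hi].
    static void merge(int[] a, int[] buf, int lo, int mid, int hi) {
        for (int k = lo; k <= hi; k++) buf[k] = a[k];
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if (i > mid)              a[k] = buf[j++];   // left run exhausted
            else if (j > hi)          a[k] = buf[i++];   // right run exhausted
            else if (buf[j] < buf[i]) a[k] = buf[j++];   // right head is smaller
            else                      a[k] = buf[i++];   // left head is smaller or equal
        }
    }

    public static void main(String[] args) {
        int[] data = {38, 27, 43, 3, 9, 82, 10};      // sample input
        mergeSort(data);
        System.out.println(Arrays.toString(data));    // prints [3, 9, 10, 27, 38, 43, 82]
    }
}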

Page 12: Comparison of Optimization Algorithms

Quicksort (Partitioning)

• Quicksort is a divide-and-conquer algorithm which relies on a partition operation: to partition an array, an element called a pivot is selected. All elements smaller than the pivot are moved before it and all greater elements are moved after it. This can be done efficiently in linear time and in-place. The lesser and greater sublists are then recursively sorted. Efficient implementations of quicksort (with in-place partitioning) are typically unstable sorts and somewhat complex, but are among the fastest sorting algorithms in practice.

Page 13: Comparison of Optimization Algorithms

algorithm

Heapsort(A as array)
    BuildHeap(A)
    for i = n to 1
        swap(A[1], A[i])
        n = n - 1
        Heapify(A, 1)

BuildHeap(A as array)
    n = elements_in(A)
    for i = floor(n/2) to 1
        Heapify(A, i)

Heapify(A as array, i as int)
    left = 2i
    right = 2i + 1
    if (left <= n) and (A[left] > A[i])
        max = left
    else
        max = i
    if (right <= n) and (A[right] > A[max])
        max = right
    if (max != i)
        swap(A[i], A[max])
        Heapify(A, max)
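
Note that the pseudocode above is heapsort rather than quicksort. For the partition-based approach described on the previous slide, here is a minimal in-place quicksort sketch in Java, assuming Lomuto-style partitioning with the last element as the pivot; the class name and sample data are illustrative.

import java.util.Arrays;

public class QuickSortDemo {
    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int p = partition(a, lo, hi);   // pivot lands at its final sorted position p
        quickSort(a, lo, p - 1);        // recursively sort the smaller elements
        quickSort(a, p + 1, hi);        // recursively sort the greater elements
    }

    // Lomuto partition: move everything <= pivot to the left, return the pivot's index.
    static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi];
        int i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] <= pivot) {
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
                i++;
            }
        }
        int tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;
        return i;
    }

    public static void main(String[] args) {
        int[] data = {33, 10, 55, 71, 29, 3};        // sample input
        quickSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data));   // prints [3, 10, 29, 33, 55, 71]
    }
}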

Page 14: Comparison of Optimization Algorithms

Counting sort (Non-comparison)

• Counting sort is applicable when each input is known to belong to a particular set, S, of possibilities. The algorithm runs in O(|S| + n) time and O(|S|) memory where n is the length of the input. It works by creating an integer array of size |S| and using the ith bin to count the occurrences of the ith member of S in the input. Each input is then counted by incrementing the value of its corresponding bin. Afterward, the counting array is looped through to arrange all of the inputs in order.

Page 15: Comparison of Optimization Algorithms

Algorithm

''' count how many times each key value occurs '''
for each input item x:
    Count[key(x)] = Count[key(x)] + 1

''' prefix sums: Count[i] becomes the first output index for key i '''
total = 0
for i = 0, 1, ... k:
    c = Count[i]
    Count[i] = total
    total = total + c

''' allocate an output array Output[0..n-1] ; THEN '''
for each input item x:
    store x in Output[Count[key(x)]]
    Count[key(x)] = Count[key(x)] + 1

return Output
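
A runnable Java sketch of the same three passes, assuming the keys are small non-negative integers and each item's key is the item itself; the class name and sample data are illustrative.

import java.util.Arrays;

public class CountingSortDemo {
    // Sorts values in the range 0..k using a counting array of size k+1.
    static int[] countingSort(int[] input, int k) {
        int[] count = new int[k + 1];
        for (int x : input) count[x]++;           // count occurrences of each key

        int total = 0;                            // prefix sums: count[i] becomes the
        for (int i = 0; i <= k; i++) {            // first output index for key i
            int c = count[i];
            count[i] = total;
            total += c;
        }

        int[] output = new int[input.length];
        for (int x : input) {                     // place each item at its slot
            output[count[x]] = x;
            count[x]++;
        }
        return output;
    }

    public static void main(String[] args) {
        int[] data = {4, 2, 2, 8, 3, 3, 1};       // sample input with keys in 0..8
        System.out.println(Arrays.toString(countingSort(data, 8)));   // [1, 2, 2, 3, 3, 4, 8]
    }
}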

Page 16: Comparison of Optimization Algorithms

Dijkstra's algorithm

• This algorithm is often used in routing and as a subroutine in other graph algorithms.

• Dijkstra's algorithm finds the shortest paths from a source vertex to every other vertex in a graph with non-negative edge weights. With a binary-heap priority queue it runs in O((V + E) log V) time for V vertices and E edges, so it can consume significant time and memory on large maps.
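
A minimal sketch of the algorithm in Java, assuming a directed graph stored as adjacency lists of {to, weight} pairs with non-negative weights; the class name, graph representation, and sample data are illustrative.

import java.util.*;

public class DijkstraDemo {
    // Returns the shortest distance from src to every vertex; graph.get(u) holds
    // edges {to, weight}. Unreachable vertices keep Integer.MAX_VALUE.
    static int[] dijkstra(List<List<int[]>> graph, int src) {
        int n = graph.size();
        int[] dist = new int[n];
        Arrays.fill(dist, Integer.MAX_VALUE);
        dist[src] = 0;

        // Priority queue of {distance, vertex}, smallest tentative distance first.
        PriorityQueue<int[]> pq = new PriorityQueue<>(Comparator.comparingInt((int[] e) -> e[0]));
        pq.add(new int[]{0, src});

        while (!pq.isEmpty()) {
            int[] top = pq.poll();
            int d = top[0], u = top[1];
            if (d > dist[u]) continue;                 // stale entry, a shorter path was found
            for (int[] e : graph.get(u)) {
                int v = e[0], w = e[1];
                if (d + w < dist[v]) {                 // relax the edge u -> v
                    dist[v] = d + w;
                    pq.add(new int[]{dist[v], v});
                }
            }
        }
        return dist;
    }

    public static void main(String[] args) {
        // Small sample graph: 0->1 (4), 0->2 (1), 2->1 (2), 1->3 (5)
        List<List<int[]>> g = new ArrayList<>();
        for (int i = 0; i < 4; i++) g.add(new ArrayList<>());
        g.get(0).add(new int[]{1, 4});
        g.get(0).add(new int[]{2, 1});
        g.get(2).add(new int[]{1, 2});
        g.get(1).add(new int[]{3, 5});

        System.out.println(Arrays.toString(dijkstra(g, 0)));   // prints [0, 3, 1, 8]
    }
}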

Page 17: Comparison of Optimization Algorithms

Data mining

• Classification algorithms - predict one or more discrete variables, based on the other attributes in the dataset.

• Regression algorithms
• Segmentation algorithms
• Association algorithms
• Sequence analysis algorithms

Page 18: Comparison of Optimization Algorithms

Regression algorithm

• predict one or more continuous variables, such as profit or loss, based on other attributes in the dataset.

Page 19: Comparison of Optimization Algorithms

Segmentation algorithm

• divide data into groups, or clusters, of items that have similar properties.

Page 20: Comparison of Optimization Algorithms

Association algorithm

• find correlations between different attributes in a dataset. The most common application of this kind of algorithm is for creating association rules, which can be used in a market basket analysis.

Page 21: Comparison of Optimization Algorithms

Sequence analysis algorithm

• summarize frequent sequences or episodes in data, such as a Web path flow.

Page 22: Comparison of Optimization Algorithms

Traveling Salesman

http://www.heatonresearch.com/fun/tsp/genetic