
CE204
Data Structures and Algorithms
Part 5
18/02/2019 CE204 Part 5
1

Divide and Conquer
The principle of divide and conquer states that a large problem should be broken down into two or more smaller problems to be solved separately before combining their solutions.
The smaller problems could be distinct or could just be smaller versions of the original problem.
If the smaller problems are still relatively large we can break them down into even smaller problems.

Sorting by Divide and Conquer 1
We can apply the principle of divide and conquer to the problem of sorting a collection of objects.
If the collection is small, sorting can be performed trivially.
Otherwise we split the collection into two smaller collections and sort these separately. This will produce two sorted lists which must then be merged to produce the final result.
We express the divide and conquer approach to sorting a list in pseudo-code on the next slide.

Sorting by Divide and Conquer 2
static List sort(List l)
{ if (l is small)
  { sort l by simple method
    return sorted list
  }
  else
  { split l into two smaller lists l1 and l2
    List s1 = sort(l1);
    List s2 = sort(l2);
    merge s1 and s2
    return merged list
  }
}

Sorting by Divide and Conquer 3
A number of different sorting algorithms can be obtained by using different techniques for splitting the list into two smaller lists.
In the easy-split hard-merge approach we make the splitting as simple as possible – most of the work will then be done in merging the sorted lists.
In the hard-split easy-merge approach we split the list into small elements and large elements so that every object in the sorted list s1 will be smaller than any object in s2 – merging then simply involves the concatenation of the two sorted lists.

Sorting by Divide and Conquer 4
In both approaches we can choose to split a list containing n elements into two smaller lists of (approximately) equal size or one list of size n-1 and one list of size 1.
The former approach will give O(n log n) time as long as the splitting and merging can be done in O(n) time.
The latter approach has the advantage that the sorting of the list of size 1 is trivial so only one recursive call is needed (and it will be possible to rewrite the programs using a loop instead of recursion).

Insertion Sort 1
The first algorithm we consider uses the easy-split hard-merge approach with smaller lists of size 1 and n-1. After sorting the list of size n-1, the merging involves inserting the single element from the list of size 1 into the sorted list in the correct position.
We present the code in a recursive style as a generic method with an argument of type List<T> – the type T must implement the Comparable interface, so we need to use a bounded type parameter.
If the list to be sorted is large it would be more efficient to create a list of appropriate capacity for the result before calling the method, and supply this as an extra argument.

Insertion Sort 2
public static <T extends Comparable<T>>
List<T> iSort(List<T> l)
{ if (l.size()<2)
    return new ArrayList<T>(l);
  else
  { List<T> s = iSort(l.subList(1, l.size()));
    int n = 0;
    while (n<s.size() && l.get(0).compareTo(s.get(n))>0)
      n++;
    s.add(n, l.get(0)); // add at location n
    return s;
  }
}
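Since the generic parameters above were mangled in the conversion of these slides, here is the same method assembled into a self-contained compilable class with a small usage example (the class name and the main method are ours, added for illustration):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class InsertionSortDemo {
    // Recursive insertion sort: sort the tail of the list, then insert
    // the head element at its correct position in the sorted tail.
    public static <T extends Comparable<T>> List<T> iSort(List<T> l) {
        if (l.size() < 2) return new ArrayList<T>(l);
        List<T> s = iSort(l.subList(1, l.size()));
        int n = 0;
        // skip past every element smaller than the head element
        while (n < s.size() && l.get(0).compareTo(s.get(n)) > 0) n++;
        s.add(n, l.get(0));   // add at location n
        return s;
    }

    public static void main(String[] args) {
        System.out.println(iSort(Arrays.asList(5, 1, 4, 2, 3)));
        // prints [1, 2, 3, 4, 5]
    }
}
```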

Insertion Sort 3
Note that although we decided to return an object of type ArrayList, the argument type had to be List since the argument for the recursive call is the result of a call to the subList method and the return type of this method is List.
We could have chosen to write a method that returns a linked list; however since the time complexity of the get method for linked lists is O(n) rather than O(1), simply replacing ArrayList by LinkedList would result in the time for the while loop being O(n²) instead of O(n). Additionally the subList method is more efficient when applied to an ArrayList object.

Insertion Sort 4
The average number of iterations for the while loop for a list of size n will be about n/2 and in the worst case the number will be n. Since the time for the loop body is O(1) we can conclude that the time for the loop is O(n). The addition of the element to the list will also take O(n) time in the average and worst cases.
We hence obtain the recurrence relation T(n) ≤ T1 + T(n-1) where T1 is O(n), so we can conclude that the average and worst case running times for insertion sort are O(n²).
With the code presented here the time in the best case will also be O(n²), but with a linked list version (efficiently coded using list cells instead of the LinkedList class) if all insertions were near the beginning of the list we would get O(n) performance.
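A minimal sketch of such a cell-based version is shown below; the Cell class and all names here are our own illustration, not code from the module. With reverse-sorted input every insertion happens at the front of the list, giving the O(n) behaviour described above.

```java
public class CellInsertionSort {
    // A hand-rolled singly linked list cell (hypothetical helper class).
    static class Cell {
        int value;
        Cell next;
        Cell(int value, Cell next) { this.value = value; this.next = next; }
    }

    // Insert v into an already-sorted cell list; this is O(1) whenever
    // v belongs at or near the front of the list.
    static Cell insert(int v, Cell sorted) {
        if (sorted == null || v <= sorted.value) return new Cell(v, sorted);
        sorted.next = insert(v, sorted.next);
        return sorted;
    }

    // Build the sorted list by inserting the elements one at a time.
    static Cell iSort(Cell l) {
        Cell sorted = null;
        for (Cell c = l; c != null; c = c.next)
            sorted = insert(c.value, sorted);
        return sorted;
    }
}
```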

Mergesort 1
The second sorting algorithm also uses an easy-split hard-merge approach, but splits the list into two sub-lists of equal size (with sizes differing by 1 if the number of items in the list is odd). The merging becomes more complicated since we have to iterate through two sorted lists simultaneously, comparing the next element in each list and appending the smaller to the result list. When all the elements from one of the lists have been added it is still necessary to add the remaining elements from the other list.
We present an implementation using list iterators to traverse the sorted lists, making use of the previous method to allow the element not copied after each comparison to be re-examined.

Mergesort 2
public static <T extends Comparable<T>>
List<T> mSort(List<T> l)
{ if (l.size()<2)
    return new ArrayList<T>(l);
  else
  { int n = l.size();
    List<T> s1 = mSort(l.subList(0, n/2));
    List<T> s2 = mSort(l.subList(n/2, n));
    List<T> result = new ArrayList<T>(n);
      // allocate enough space for result
    ListIterator<T> it1 = s1.listIterator();
    ListIterator<T> it2 = s2.listIterator();
    // continued on next slide
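For convenience, the mSort method spread over these slides is assembled below into one self-contained compilable class (a reconstruction: the generic parameters lost in conversion are restored, and the trailing do-while copying loops from the later slide are written as plain while loops, which behave the same way):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.ListIterator;

public class MergesortDemo {
    // Mergesort: split the list in half, sort each half recursively,
    // then merge the two sorted halves into the result list.
    public static <T extends Comparable<T>> List<T> mSort(List<T> l) {
        if (l.size() < 2) return new ArrayList<T>(l);
        int n = l.size();
        List<T> s1 = mSort(l.subList(0, n / 2));
        List<T> s2 = mSort(l.subList(n / 2, n));
        List<T> result = new ArrayList<T>(n);   // enough space for result
        ListIterator<T> it1 = s1.listIterator();
        ListIterator<T> it2 = s2.listIterator();
        while (it1.hasNext() && it2.hasNext()) {
            T t1 = it1.next();
            T t2 = it2.next();
            // append the smaller element; step the other iterator back
            // so that its element is re-examined on the next iteration
            if (t1.compareTo(t2) < 0) { result.add(t1); it2.previous(); }
            else { result.add(t2); it1.previous(); }
        }
        // one of the two lists still has elements left to copy
        while (it1.hasNext()) result.add(it1.next());
        while (it2.hasNext()) result.add(it2.next());
        return result;
    }

    public static void main(String[] args) {
        System.out.println(mSort(Arrays.asList(4, 1, 5, 2, 3)));
        // prints [1, 2, 3, 4, 5]
    }
}
```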

Mergesort 3
// mSort method continued
while (it1.hasNext() && it2.hasNext())
{ T t1 = it1.next();
  T t2 = it2.next();
  if (t1.compareTo(t2)<0)
  { result.add(t1); // adds to end
    it2.previous();
  }
  else
  { result.add(t2); // adds to end
    it1.previous();
  }
}
// continued on next slide

Mergesort 4
// mSort method continued
// have now copied all of one list to result
// but there will still be at least one
// element to copy from the other list
if (it1.hasNext())
  do result.add(it1.next()); while (it1.hasNext());
else
  do result.add(it2.next()); while (it2.hasNext());
return result;
} }

Mergesort 5
Each iteration around a loop during merging will add an element to the result list, so the total number of iterations for the two loops will be n. As adding to the end of a list with sufficient capacity is O(1), the loop bodies take O(1) time, so the total time for merging is O(n) in all cases. The time for splitting will be O(1) or O(n) depending on the type of the list supplied as an argument.
In all cases we get a recurrence relation of the form T(n) ≤ T1 + T(⌊n/2⌋) + T(⌈n/2⌉) for odd n and T(n) ≤ T1 + 2T(n/2) for even n, where T1 is O(n). Hence we can conclude that the running time for mergesort is O(n log n) in all cases.
Hence for large lists mergesort will perform more efficiently than insertion sort in most cases.

Selection Sort 1
We now consider algorithms that use the hard-split easy-merge approach. The first algorithm splits a list of size n into smaller lists of size 1 and n-1. The list of size 1 must contain either the smallest or largest element in the collection, so this must be located by searching the list.
To avoid having to remove items from the middle of a list, the normal approach when implementing this algorithm is to swap the smallest (or largest) element with the first element so that the sub-list starting at location 1 will contain all of the remaining items. This swap can be performed most efficiently if an ArrayList or array is used.
Selection Sort 2
For efficient performance it is preferable to modify the contents of the list or array supplied as an argument instead of creating a new list for the result. If the smallest item has been placed in location 0 of the array or list there is nothing more to do after sorting the rest, so merging takes no time.
We present an implementation using an array; to avoid having to copy part of the array to supply as an argument to the recursive call we use a second argument that indicates the starting position of the portion of the array to be sorted.

Selection Sort 3
static <T extends Comparable<T>>
void sSort(T[] l, int start)
{ if (start<l.length-1)
  { int min = start;
    for (int i = start+1; i<l.length; i++)
      if (l[i].compareTo(l[min])<0)
        min = i;
    T temp = l[start]; l[start] = l[min]; l[min] = temp;
    sSort(l, start+1);
  }
}

Selection Sort 4
public static <T extends Comparable<T>>
void selSort(T[] l)
{ if (l.length>1)
    sSort(l, 0);
}
For an array portion of size n the loop body, which is O(1), will be executed n-1 times, so the time for the loop is O(n). We obtain a recurrence relation of the form T(n) ≤ T1 + T(n-1) where T1 is O(n) and can conclude that selection sort takes O(n²) time in all cases.

Selection Sort 5
The selection sort algorithm as presented uses tail recursion – the recursive call is the last statement in the method body. Any tail-recursive method can be easily converted into a non-recursive method that uses a while loop.
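The transformation can be illustrated with a smaller example (this code is ours, not from the module): the method's parameters become local variables, the termination test becomes the loop condition, and the tail call becomes an update of those variables before the next iteration.

```java
public class TailRecursionDemo {
    // Tail-recursive sum of 1..n: the recursive call is the last action.
    static int sumRec(int n, int acc) {
        if (n == 0) return acc;
        return sumRec(n - 1, acc + n);   // tail call
    }

    // The same computation with the recursion replaced by a while loop.
    static int sumLoop(int n) {
        int acc = 0;
        while (n != 0) {
            acc = acc + n;   // what the recursive call passed on
            n = n - 1;
        }
        return acc;
    }
}
```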
On the next slide we present a non-recursive version of selection sort; it is not difficult to see that the time for this version is also O(n²).

Selection Sort 6
public static <T extends Comparable<T>>
void sSort(T[] l)
{ int start = 0;
  while (start<l.length-1)
  { int min = start;
    for (int i = start+1; i<l.length; i++)
      if (l[i].compareTo(l[min])<0)
        min = i;
    T temp = l[start]; l[start] = l[min]; l[min] = temp;
    start++;
  }
}

Quicksort 4
public static void qSort(int[] l)
{ if (l.length>1)
    qSort(l, 0, l.length-1);
}

Quicksort 5
public static void qSort(
  int[] l, int start, int end)
{ if (end>start)
  { int pivot = // select appropriate pivot
    int lo = start, hi = end;
    do
    { while (l[lo]<=pivot) lo++;
      while (l[hi]>pivot) hi--;
      if (lo
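The code above is incomplete; a complete quicksort along the same partition-exchange lines is sketched below. The middle-element pivot and the exact loop conditions here are our assumptions (a standard Hoare-style scheme), not necessarily the slide's own choices.

```java
import java.util.Arrays;

public class QuicksortDemo {
    // Partition-exchange sort: partition around a pivot, then sort
    // the two parts recursively, scanning from both ends (Hoare-style).
    public static void qSort(int[] l, int start, int end) {
        if (end > start) {
            int pivot = l[(start + end) / 2];   // assumed pivot choice
            int lo = start, hi = end;
            while (lo <= hi) {
                while (l[lo] < pivot) lo++;     // find an element to move right
                while (l[hi] > pivot) hi--;     // find an element to move left
                if (lo <= hi) {
                    int temp = l[lo]; l[lo] = l[hi]; l[hi] = temp;
                    lo++; hi--;
                }
            }
            qSort(l, start, hi);
            qSort(l, lo, end);
        }
    }

    // Wrapper supplying the full array bounds.
    public static void qSort(int[] l) {
        if (l.length > 1) qSort(l, 0, l.length - 1);
    }

    public static void main(String[] args) {
        int[] a = {3, 7, 1, 4, 1, 5, 9, 2, 6};
        qSort(a);
        System.out.println(Arrays.toString(a));
        // prints [1, 1, 2, 3, 4, 5, 6, 7, 9]
    }
}
```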