
CPSC 320: Divide and Conquer Solutions*

Invented by Tony Hoare in the late 1950s, the Quicksort algorithm was a breakthrough in sorting methods. Variants of Hoare's original algorithm are still a sorting method of choice today. In this worksheet, you will gain experience with the divide-and-conquer algorithmic design approach, as well as the analysis of recurrence relations, that led Hoare to this breakthrough. You will also apply this algorithm to the problem of finding the median of a list of values.

Here is a basic version of Quicksort, when the array A[1..n] to be sorted has n distinct elements:
function Quicksort(A[1..n])                    ◃ returns the sorted array A of n distinct numbers
    if n > 1 then                                                     ◃ Θ(1)
        Choose pivot element p = A[1]                                 ◃ Θ(1)
        Let Lesser be an array of all elements from A less than p     ◃ Θ(n)
        Let Greater be an array of all elements from A greater than p ◃ Θ(n)
        LesserSorted ← QuickSort(Lesser)                              ◃ T_Q(⌈n/4⌉ − 1)
        GreaterSorted ← QuickSort(Greater)                            ◃ T_Q(⌊3n/4⌋)
        return the concatenation of LesserSorted, [p], and GreaterSorted  ◃ Θ(n)
    else
        return A                                                      ◃ Θ(1)
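The pseudocode above translates almost line-for-line into Python. This is a minimal sketch (the function name quicksort is ours, and like the pseudocode it assumes distinct elements):

```python
def quicksort(a):
    # base case: arrays of length 0 or 1 are already sorted
    if len(a) <= 1:
        return a
    p = a[0]                            # choose the first element as the pivot
    lesser = [x for x in a if x < p]    # all elements less than p
    greater = [x for x in a if x > p]   # all elements greater than p
    # recursively sort each side, then concatenate around the pivot
    return quicksort(lesser) + [p] + quicksort(greater)

print(quicksort([10, 3, 5, 18, 1000, 2, 100, 11, 14]))
# → [2, 3, 5, 10, 11, 14, 18, 100, 1000]
```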
1 Quicksort Runtime Analysis

1. Suppose that QuickSort happens to always select the ⌈n/4⌉-th smallest element as its pivot. Give a recurrence relation for the runtime of QuickSort.
SOLUTION: Let TQ(n) be the runtime (number of steps) of QuickSort on an array of length n. For these solutions, we’ve annotated the code above with the runtime of each step. This leads us to the following recurrence. It’s convenient in recurrences to replace big-O or Θ terms with some constant, and it’s ok to use the same constant everywhere.
T_Q(n) = { c,                                   if n = 0 or n = 1
         { T_Q(⌈n/4⌉ − 1) + T_Q(⌊3n/4⌋) + cn,  otherwise.

Ignoring floors, ceilings, and constants we get a slightly simpler recurrence:

T_Q(n) = { c,                         if n = 0 or n = 1
         { T_Q(n/4) + T_Q(3n/4) + cn, otherwise.
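To sanity-check that this recurrence really grows like n lg n, we can evaluate it directly. This is a quick numerical sketch; the constant c = 1 is an arbitrary choice, and integer division stands in for the floors:

```python
def t_q(n, c=1):
    # directly evaluate T_Q(n) = T_Q(n/4) + T_Q(3n/4) + c*n
    if n <= 1:
        return c
    return t_q(n // 4, c) + t_q(3 * n // 4, c) + c * n

# if T_Q(n) = Theta(n log n), then T_Q(4n)/T_Q(n) should sit a bit above 4
print(t_q(4096) / t_q(1024))
```

A purely linear function would give a ratio of exactly 4, and a quadratic one a ratio of 16; the observed value lands just above 4, consistent with n log n growth.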
* Copyright Notice: UBC retains the rights to this document. You may not distribute this document without permission.

2. Using the recurrence, draw a recursion tree for QuickSort. Label each node by the number of elements in the array at that node’s call (the root is labeled n) and the amount of time taken by that node but not its children. Also, label the total work (time) for each “level” of calls. Show the root at level 0 and the next two levels below the root, and also the node at the leftmost and rightmost branches of level i.
SOLUTION: We show the work done at each node (and not its children) in black, and the array size at that node in blue.
3. Find the following two quantities.
(a) The number of levels in the tree down to the shallowest leaf. Hint: Is the shallowest leaf on the leftmost side of the tree, the rightmost side, or somewhere else? If you’ve already described the problem size of the leftmost and rightmost nodes at level i as a function of i, then set that equal to the problem size you expect at the leaves and solve for i.
SOLUTION: The n/4^i branch will reach the base case fastest. If we set that equal to 1, we get n/4^i = 1, or n = 4^i. Taking logs on both sides, log_4 n = i. Also log_4 n = (log_2 n)/2 (recall that, more generally, log_b n = log_c n / log_c b). So the number of levels to the shallowest leaf is lg n / 2.
(b) The number of levels in the tree down to the deepest leaf.
SOLUTION: Similarly, the (3/4)^i · n branch reaches the base case slowest, and by a similar analysis, we find i = log_{4/3} n. That looks nasty but is only a constant factor away from log n. (Since 4/3 > 1, this really is a nice, normal log.)
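Both depth formulas are easy to check numerically. This sketch follows the leftmost branch (size n ↦ ⌈n/4⌉ − 1) and the rightmost branch (size n ↦ ⌊3n/4⌋) until the base case:

```python
import math

def depth(n, step):
    # count recursion levels until the subproblem size drops to 1 or below
    d = 0
    while n > 1:
        n = step(n)
        d += 1
    return d

n = 2 ** 20
left = depth(n, lambda m: math.ceil(m / 4) - 1)   # shallowest leaf
right = depth(n, lambda m: 3 * m // 4)            # deepest leaf
print(left, math.log2(n) / 2)                     # left matches lg n / 2
print(right, math.log(n) / math.log(4 / 3))       # right is close to log_{4/3} n
```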

4. Use the work from the previous parts to find asymptotic upper and lower bounds for the solution of your recurrence.
SOLUTION: For the lower bound, we perform cn work at each level up to the level of the first leaf, which has depth Ω(log n), so the total work done is Ω(n log n). For the upper bound, we perform at most cn work at each level, including levels after that of the first leaf. There are O(log n) levels in total, so the total work done is O(n log n). Putting the upper and lower bounds together, we see that the total work done is Θ(n log n).
5. Now, we will relax our assumption that Quicksort always selects the ⌈n/4⌉-th smallest element as its pivot. Instead, consider a weaker assumption: the rank of the pivot is always in the range between ⌈n/4⌉ and ⌊3n/4⌋. The rank of an element is k if the element is the kth smallest in the array. What can you say about the runtime of Quicksort in this case?
SOLUTION: If the pivot always lies in the range between ⌈n/4⌉ and ⌊3n/4⌋, the shallowest leaf has depth at least lg n / 2, and the deepest leaf has depth at most log_{4/3} n. Also, the total work per level is cn up to the level of the shallowest leaf, and at most cn for the remaining levels. So our asymptotic upper and lower bounds still hold, and the running time is still Θ(n log n).
6. Draw the recursion tree generated by QuickSort([10, 3, 5, 18, 1000, 2, 100, 11, 14]). Assume that QuickSort: (1) selects the first element as pivot and (2) maintains elements' relative order when producing Lesser and Greater.
SOLUTION: Here it is with each node containing the array passed into that node’s call, the pivot in bold, and the number 11 in blue.
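Since the drawing cannot be reproduced here, the same tree can be generated programmatically. This sketch (the helper name tree_nodes is ours) collects every call's array together with its depth and prints the tree with indentation:

```python
def tree_nodes(a, depth=0, nodes=None):
    # collect (depth, array) pairs for every call QuickSort makes
    if nodes is None:
        nodes = []
    nodes.append((depth, a))
    if len(a) > 1:
        p = a[0]  # first element as pivot; comprehensions keep relative order
        tree_nodes([x for x in a if x < p], depth + 1, nodes)   # Lesser
        tree_nodes([x for x in a if x > p], depth + 1, nodes)   # Greater
    return nodes

for d, arr in tree_nodes([10, 3, 5, 18, 1000, 2, 100, 11, 14]):
    print("    " * d, arr)
```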

2 An Algorithm for Finding the Median
Suppose that you want to find the median number in an array of length n. The algorithm below can be generalized to find the element of rank k, i.e., the kth smallest element in an array, for any 1 ≤ k ≤ n. (Note: the larger the rank, the larger the element.) The median is the element of rank ⌈n/2⌉. For example, the median in the array [10, 3, 5, 18, 1000, 2, 100, 11, 14] is the element of rank 5 (or 5th smallest element), namely 11.
1. In your specific recursion tree above, mark the nodes in which the median (11) appears. (The first of these is the root.)
SOLUTION: These are the ones with the blue 11 in them above.
2. Look at the third recursive call you marked. What is the rank of 11 in this array? How does this
relate to 11’s rank in the second recursive call, and why?
SOLUTION: Note that we’re looking at the node containing [11, 14]. This is the Lesser array of its parent, and the rank of 11 is 1. The rank did not change because to obtain the Lesser array from its parent array, only elements larger than 11 were removed, namely the pivot and the two elements in Greater.
3. Now, look at the second recursive call you marked: the one below the root. 11 is not the median of the array in that recursive call!
SOLUTION:
(a) In this array, what is the median? 18
(b) In this array, what is the rank of the median? The element 18 has rank 3, i.e., 18 is the 3rd smallest element.
(c) In this array, what is the rank of 11? The element 11 has rank 1; it is the smallest element.
(d) How does the rank of 11 in this array relate to the rank of 11 in the original array (at the root)? Why does this relationship hold?
Note that we're looking at the node containing [18, 1000, 100, 11, 14]. The rank of 11 has changed: it is 1 here, but it was 5 at the root. Why? 11 ended up on the right side (in Greater). So, everything that was smaller than 11 in the original array (at the root) is no longer in the array [18, 1000, 100, 11, 14]. We discarded four elements (3, 5, 2, and the pivot 10) from the array at the root of the tree, so 11's rank has gone down by four.
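This rank bookkeeping can be checked directly. In this sketch, 11's rank drop from the root to the Greater child equals the number of smaller elements discarded:

```python
a = [10, 3, 5, 18, 1000, 2, 100, 11, 14]
p = a[0]                                    # pivot at the root
greater = [x for x in a if x > p]           # [18, 1000, 100, 11, 14]

rank_at_root = sorted(a).index(11) + 1      # rank of 11 in the original array
rank_in_greater = sorted(greater).index(11) + 1
discarded_smaller = sum(1 for x in a if x < 11) - sum(1 for x in greater if x < 11)

print(rank_at_root, rank_in_greater, discarded_smaller)  # → 5 1 4
```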
4. If you’re looking for the element of rank 42 (i.e., the 42nd smallest element) in an array of 100 elements, and Lesser has 41 elements, where is the element you’re looking for?
SOLUTION: If Lesser has 41 elements, then there are 41 elements smaller than the pivot. That makes the pivot the 42nd smallest element. In other words, if the size of Lesser is k − 1, then the pivot has rank k. (This is what happened, with much smaller numbers, in the node where 11 is also the pivot.)
5. How could you determine before making QuickSort’s recursive calls whether the element of rank k is the pivot or appears in Lesser or Greater?
SOLUTION: Putting together what we saw above:
• If |Lesser| is equal to k − 1, then the k-th smallest element is the pivot.

• If |Lesser| is larger than k − 1, then the pivot is larger than the k-th smallest element, which puts that element in the Lesser group (and the left recursive call).
• If |Lesser| is smaller than k − 1, then the k-th smallest element is in Greater (and the right recursive call).
6. Modify the QuickSort algorithm so that it finds the element of rank k. Just cross out or change parts of the code below as you see fit. Change the function's name! Add a parameter! Feel the power!
SOLUTION: The key difference is that we no longer need the recursive calls on both sides, only on the side with the element we seek.
function QuickSelect(A[1..n], k)   // returns the element of rank k in an array A of n distinct numbers, where 1 ≤ k ≤ n
    if n > 1 then
        Choose pivot element p = A[1]
        Let Lesser be an array of all elements from A less than p
        Let Greater be an array of all elements from A greater than p
        if |Lesser| = k − 1 then
            return p
        else if |Lesser| > k − 1 then   // all smaller elts are in Lesser; k is unchanged
            return QuickSelect(Lesser, k)
        else                            // |Lesser| < k − 1; subtract from k the number of smaller elts removed
            return QuickSelect(Greater, k − |Lesser| − 1)
    else
        return A[1]

7. Once again, suppose that the rank of the pivot in your median-finding algorithm on a problem of size n is always in the range between ⌈n/4⌉ and ⌊3n/4⌋. Draw the recursion tree that corresponds to the worst-case running time of the algorithm, and give a tight big-O bound on the algorithm's running time. Also, provide an asymptotic lower bound on the algorithm's running time.

SOLUTION: Here's the recursion tree (well, stick):

Summing the work at each level, we get: Σ_{i=0}^{?} (3/4)^i · cn = cn · Σ_{i=0}^{?} (3/4)^i. We've left the top of that sum as ? because it turns out not to be critical. This is a converging sum. If we let the sum go to infinity (which is fine for an upper bound, since we're not making the sum any smaller by adding positive terms!), it still approaches a constant: 1/(1 − 3/4) = 1/(1/4) = 4. So the whole quantity approaches 4cn ∈ O(n).

Unlike the Quicksort algorithm, there's no need to analyze the recursion tree structure to conclude that the algorithm takes Ω(n) time. Since the algorithm partitions the n elements of A before any recursive call, it must take Ω(n) time, and thus Θ(n) time.

In case you're curious, here's one way to analyse the above sum. Let S = Σ_{i=0}^{∞} (3/4)^i. Then:

S = Σ_{i=0}^{∞} (3/4)^i
  = 1 + Σ_{i=1}^{∞} (3/4)^i
  = 1 + Σ_{i=0}^{∞} (3/4)^{i+1}
  = 1 + (3/4) · Σ_{i=0}^{∞} (3/4)^i
  = 1 + (3/4) · S

Solving S = 1 + (3/4)S for S, we get (1/4)S = 1, and S = 4.

While recursion trees provide a reliable way to solve pretty much all recurrences that we'll see in this class, it's good to know about alternative methods too. One popular method is the Master theorem, provided in a separate handout.
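As a numeric check of that converging sum, partial sums of Σ (3/4)^i approach 4 quickly:

```python
def partial_sum(m):
    # sum of (3/4)^i for i = 0 .. m
    return sum((3 / 4) ** i for i in range(m + 1))

for m in (5, 20, 80):
    print(m, partial_sum(m))
```

Even the first few terms get most of the way to 4, which is why cutting the sum off at the (unknown) depth of the tree doesn't matter for the O(n) bound.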
Again, if we assume that we "get lucky" with the choice of pivot at every recursive call, so that the size of the subproblem is at most 3n/4, we can express the QuickSelect runtime using the following recurrence (ignoring floors and ceilings):

T_QS(n) = { 1,                if n = 0 or n = 1
          { T_QS(3n/4) + cn,  otherwise.

This has the form of the Master theorem, where a = 1, b = 4/3, and f(n) = cn. Since log_b a = log_{4/3} 1 = 0 (the log of 1 is 0, no matter what the base is), we see that f(n) = cn = Ω(n^{log_b a + ε}), where we can choose ε to be any positive number that is at most 1. So case 3 of the Master theorem applies, and we can conclude that T_QS(n) = Θ(n).
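Finally, the QuickSelect pseudocode from question 6 can be sketched in Python (the name quickselect is ours; as in the worksheet, elements are assumed distinct and 1 ≤ k ≤ n):

```python
def quickselect(a, k):
    # returns the element of rank k in a list of distinct numbers
    if len(a) == 1:
        return a[0]
    p = a[0]                            # first element as pivot
    lesser = [x for x in a if x < p]
    greater = [x for x in a if x > p]
    if len(lesser) == k - 1:
        return p                        # pivot is exactly the k-th smallest
    elif len(lesser) > k - 1:
        return quickselect(lesser, k)   # all smaller elements kept; k unchanged
    else:
        # the pivot and all of Lesser are smaller than the target, so
        # subtract the |Lesser| + 1 discarded smaller elements from k
        return quickselect(greater, k - len(lesser) - 1)

# the median of the worksheet's array is the element of rank 5
print(quickselect([10, 3, 5, 18, 1000, 2, 100, 11, 14], 5))  # → 11
```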