
CSCE 221 Cover Page
Homework Assignment #3, due Oct. 5 at midnight to CSNet

First Name: Daniel
Last Name: Whatley
UIN: 122004818
User Name: dwhatley
E-mail address: dwhatley@tamu.edu
Please list all sources in the table below, including web pages you used to solve or implement the current homework. If you fail to cite sources, you can receive a lower number of points or even zero. Please read more on the Aggie Honor System Office webpage at: http://aggiehonor.tamu.edu/

Type of sources:
  People: Instructors, TAs, other students
  Web pages (provide URL): http://piazza.com/, Course Website, Wikipedia
  Printed material: Course Textbook; Introduction to Algorithms, CLRS
  Other Sources: Dev-C++ IDE

I certify that I have listed all the sources that I used to develop the solutions/codes to the submitted work. On my honor as an Aggie, I have neither given nor received any unauthorized help on this academic work.

Your Name: Daniel Whatley    Date: 9/27/2012


Homework 3
1. Here is the code:

int length(ListNode *ln) {
    if (ln == NULL)
        return 0;                     // empty list has length 0
    else
        return 1 + length(ln->next); // count this node, then recurse on the rest
}
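For reference, here is a self-contained driver; the ListNode layout is an assumption, since the assignment's actual definition is not shown:

#include <cstddef>
#include <iostream>

// Assumed node layout; the assignment's actual ListNode may differ.
struct ListNode {
    int value;
    ListNode *next;
};

int length(ListNode *ln) {
    if (ln == NULL) return 0;
    return 1 + length(ln->next);
}

int main() {
    // Build the list 1 -> 2 -> 3 on the stack.
    ListNode c = {3, NULL};
    ListNode b = {2, &c};
    ListNode a = {1, &b};
    std::cout << length(&a) << std::endl;  // prints 3
    return 0;
}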

Since we perform a constant number of operations on every pass through the recursive function (at most one comparison, one addition, one access to next, and one return statement), and there are n passes through the recursive function (one for each node), the running time of this algorithm is O(n).

2. Here is the code:

void popAll(int start) {
    if (start == -1)
        return;             // base case: the stack is empty
    else {
        pop();              // remove the top element
        popAll(start - 1);  // recurse on the remaining elements
    }
}
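A runnable sketch under stated assumptions: the global stack s and the helper pop() below are hypothetical stand-ins for whatever stack interface the assignment provides.

#include <iostream>
#include <stack>

std::stack<int> s;     // hypothetical global stack used by popAll

void pop() {           // assumed constant-time pop on the global stack
    s.pop();
}

void popAll(int start) {
    if (start == -1) return;  // the stack is empty
    pop();
    popAll(start - 1);
}

int main() {
    for (int i = 0; i < 5; ++i) s.push(i);
    popAll(static_cast<int>(s.size()) - 1);  // the popAll(size-1) call described below
    std::cout << s.size() << std::endl;      // prints 0
    return 0;
}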

Calling popAll(size - 1) starts the function execution. Let T(n) be the running time of the popAll function for input n. We have that

    T(n) = c1               if n = 0,
    T(n) = T(n-1) + c2      if n > 0,

assuming that the pop operation and the return statement run in constant time; c2 denotes the running time of the pop operation. Using the substitution method, we guess that the function runs in O(n) time. Since the former case is obviously bounded above by O(n), we only need to consider the latter case. We start by hypothesizing that this bound holds for n-1, or T(n-1) = O(n-1). Substituting gives us T(n) = O(n-1) + c2, which holds if c2 = O(1), which is true (the pop function does indeed run in constant time). Now we use the recursion tree method. At the very top we have T(n) expressed as c2 + T(n-1), so we place the c2 on top and put T(n-1) as the node branching out from it. Then we place c2 in that spot and keep going, and our tree looks something like this:
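    c2
     \
      c2
       \
        c2
         \
          ...
           \
            T(0) = c1

Each of the n levels contributes the constant c2, and the single leaf contributes c1, so the total is T(n) = n·c2 + c1.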


And the function is obviously O(n).

3.

4. The three main steps of quicksort:

Divide: This process takes O(n) time, because we must compare every element in the array with the pivot point.

Sort: This process takes 2T(n/2), since we recursively call quick-sort on two sub-arrays of size n/2.

Merge: No work is required to merge the two sub-arrays together, as they are already sorted in place.

Best case: The best case occurs when the pivot point is exactly the middle of the array (n/2 elements on one side, n/2 elements on the other). There is no specific input order that guarantees this. So, our running time T(n) is:

    T(n) = c1                if n = 0,
    T(n) = 2T(n/2) + c2·n    if n > 0.

We first use the recursion-tree method. Each parent node, which costs c2·n, branches into two child nodes, which are T(n/2). So the top level costs c2·n, and the second level also costs c2·n (because there are 2 nodes whose cost is c2·n/2 each), and so on, until the last level costs c1·n. Since there are log2(n) + 1 levels, we have Θ(n)·(log n + 1) = Θ(n log n), which is the best-case running time. Finally, using the master method with the template T(n) = a·T(n/b) + f(n), we see that a = 2 and b = 2. Since log_b a = 1 and f(n) = Θ(n^1 log^0 n), the running time is Θ(n^(log_b a) log^1 n) = Θ(n log n). Thus, the best case runs in Θ(n log n) time.

Worst case: The worst case occurs when the pivot point is exactly at one end of the array (n-1 elements on one side, 0 elements on the other), which happens when the array is sorted or sorted in reverse order. So, our running time T(n) is:

    T(n) = c1                if n = 0,
    T(n) = T(n-1) + c2·n     if n > 0.
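Unrolling this recurrence shows the quadratic growth directly:

    T(n) = c2·n + c2·(n-1) + ... + c2·2 + c2·1 + c1
         = c2 · n(n+1)/2 + c1
         = O(n^2).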

Equivalently, this is almost the same recurrence as in problem 2, except we have T(n-1) + O(n) instead of T(n-1) + O(1), so the answer grows by a factor of n. So, quick-sort runs in O(n^2) time in the worst case.
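For concreteness, here is a minimal in-place quicksort sketch; the choice of the last element as the pivot is an assumption, since the write-up does not fix a pivot rule.

#include <algorithm>
#include <iostream>
#include <vector>

// Divide: place the pivot (here, the last element, an assumed rule) into
// its final position, with smaller elements to its left. Costs O(n).
int partition(std::vector<int>& a, int lo, int hi) {
    int pivot = a[hi];
    int i = lo;
    for (int j = lo; j < hi; ++j)
        if (a[j] < pivot)
            std::swap(a[i++], a[j]);
    std::swap(a[i], a[hi]);
    return i;
}

// Sort: two recursive calls, 2T(n/2) in the best case.
// Merge: nothing to do; the array is sorted in place.
void quicksort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;  // 0 or 1 elements: already sorted
    int p = partition(a, lo, hi);
    quicksort(a, lo, p - 1);
    quicksort(a, p + 1, hi);
}

int main() {
    std::vector<int> v = {5, 2, 8, 1, 9, 3};
    quicksort(v, 0, static_cast<int>(v.size()) - 1);
    for (int x : v) std::cout << x << ' ';  // prints: 1 2 3 5 8 9
    std::cout << std::endl;
    return 0;
}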


5. Merge sort has three operations: divide the array into two sub-arrays, sort the sub-arrays, and merge the sorted sub-arrays. Let the running time on an array of size n be T(n). There is no best case or worst case for merge sort, because the split point (the point where we divide into the two sub-arrays) does not change; it is always the middle of the array.

Divide: This process takes O(1) time; the function just takes a start parameter and an end parameter, so computing the middle index of the array suffices.

Sort: This process takes 2T(n/2), since we recursively call merge-sort on two sub-arrays of size n/2.

Merge: This process takes O(n) time.

So, our function is

    T(n) = c1                if n = 1,
    T(n) = 2T(n/2) + c2·n    if n > 1,

assuming that solving a problem of size 1 takes constant time (which it should, as it is just a single comparison during merging). Note that this is almost exactly the same recurrence as the best case of quicksort, so merge sort should run in O(n log n) time in every case. We first use the recursion-tree method. Each parent node, which costs c2·n, branches into two child nodes, which are T(n/2). So the top level costs c2·n, and the second level also costs c2·n (because there are 2 nodes whose cost is c2·n/2 each), and so on, until the last level costs c1·n. Since there are log2(n) + 1 levels, we have O(n)·(log n + 1) = O(n log n) in all cases. Finally, using the master method with the template T(n) = a·T(n/b) + f(n), we see that a = 2 and b = 2. Since log_b a = 1 and f(n) = O(n^(log_b a) log^0 n) = O(n), the running time is O(n^(log_b a) log^1 n) = O(n log n). Thus, in every case, merge sort runs in O(n log n) time.
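A minimal top-down merge-sort sketch matching this analysis; the half-open [lo, hi) range convention is an assumption made here for clarity.

#include <algorithm>
#include <iostream>
#include <vector>

// Sorts a[lo, hi) (half-open range, an assumed convention).
void mergeSort(std::vector<int>& a, int lo, int hi) {
    if (hi - lo <= 1) return;          // size 0 or 1: constant time (c1)
    int mid = lo + (hi - lo) / 2;      // Divide: O(1), just compute the middle
    mergeSort(a, lo, mid);             // Sort: two recursive calls, 2T(n/2)
    mergeSort(a, mid, hi);
    std::vector<int> merged;           // Merge: O(n)
    merged.reserve(hi - lo);
    int i = lo, j = mid;
    while (i < mid && j < hi)
        merged.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
    while (i < mid) merged.push_back(a[i++]);
    while (j < hi) merged.push_back(a[j++]);
    std::copy(merged.begin(), merged.end(), a.begin() + lo);
}

int main() {
    std::vector<int> v = {5, 2, 8, 1, 9, 3};
    mergeSort(v, 0, static_cast<int>(v.size()));
    for (int x : v) std::cout << x << ' ';  // prints: 1 2 3 5 8 9
    std::cout << std::endl;
    return 0;
}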
