Friday, June 17, 2011

Something about Quicksort

  1. Why is the average time complexity O(N log N)?
  • Think of the whole process as a tree-like structure: at each level of recursion, quicksort divides the problem roughly in half, so the depth of the tree is about log N. At each depth level, the partitions across all subproblems together do about N comparisons. Therefore the average time complexity is O(N log N). (A runnable sketch of this recursion appears after this list.)
  • Use the Master Theorem, or simply unroll the recurrence T(N) = 2*T(N/2) + N. Since T(N/2) = 2*T(N/4) + N/2, we can derive T(N) = 4*T(N/4) + 2N = 8*T(N/8) + 3N = ... = 2^k * T(N/2^k) + k*N. The recursion bottoms out when 2^k = N, i.e. k = log N, so T(N) = N*T(1) + N*log N = O(N log N).
  2. When is the worst case O(N^2)?
  • If in each partition step we happen to select the smallest (or biggest, depending on how you do quicksort) element as the pivot, we can't effectively break the input array into two sub-arrays: from N elements we only split off the pivot itself, leaving a sub-array of size 1 and a sub-array of size N-1. The partition costs then add up to N + (N-1) + ... + 1, which is O(N^2). In particular, if you try to sort an already-sorted list and always choose the element at the lowest index as the pivot, you will always hit this O(N^2) case. (The second sketch below demonstrates this.)
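
Here is a minimal sketch of the recursion described in point 1, written in Python. It is my own illustration, not code from any particular library: a simple non-in-place quicksort with a random pivot, which is what keeps the split roughly balanced on average.

    import random

    def quicksort(arr):
        """Minimal (non-in-place) quicksort sketch.

        Each call partitions the input around a pivot and recurses on
        the two sides. When the pivot splits the array roughly in half,
        the recursion is about log N levels deep and each level does
        O(N) work in total, giving the O(N log N) average.
        """
        if len(arr) <= 1:
            return arr
        pivot = random.choice(arr)            # random pivot: balanced split on average
        smaller = [x for x in arr if x < pivot]
        equal   = [x for x in arr if x == pivot]
        larger  = [x for x in arr if x > pivot]
        return quicksort(smaller) + equal + quicksort(larger)

    print(quicksort([5, 2, 9, 1, 7, 3]))      # [1, 2, 3, 5, 7, 9]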
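To see the worst case from point 2 concretely, here is a hypothetical counting experiment (the helper quicksort_count and the pivot rules are my own, chosen just for illustration). It charges len(arr) "partition cost" units per call and compares a first-element pivot against a random pivot on an already-sorted input.

    import random

    def quicksort_count(arr, pick_pivot):
        """Quicksort that also returns the total partition cost
        (len of each sub-array it partitions), so pivot rules can be compared."""
        if len(arr) <= 1:
            return arr, 0
        pivot = pick_pivot(arr)
        smaller = [x for x in arr if x < pivot]
        equal   = [x for x in arr if x == pivot]
        larger  = [x for x in arr if x > pivot]
        left,  c1 = quicksort_count(smaller, pick_pivot)
        right, c2 = quicksort_count(larger, pick_pivot)
        return left + equal + right, c1 + c2 + len(arr)

    data = list(range(300))                                # already sorted input
    _, first_cost = quicksort_count(data, lambda a: a[0])  # lowest-index pivot
    _, rand_cost  = quicksort_count(data, random.choice)   # random pivot
    print(first_cost, rand_cost)  # roughly 45,000 vs a few thousand cost units

With the lowest-index pivot on sorted data, every call peels off only one element, so the costs form the sum N + (N-1) + ... + 1, i.e. the O(N^2) behavior described above; the random pivot stays near O(N log N).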
