Quicksort and Mergesort for Objects in JavaScript

Most, if not all, of the examples you find of Quicksort and Mergesort algorithms in JavaScript sort simple types.  I decided to write both algorithms so that they sort JSON objects.  Please note that if you are sorting large collections of objects, especially ones with deep hierarchical object graphs, you should seriously consider sorting on the server in C#, Java, PHP, etc.  I did this for fun and for my own edification.  I hope you find it interesting and useful!

Mergesort is a divide-and-conquer sorting algorithm.  It starts by dividing the original input into two halves, sorts each half by calling itself recursively, and finally merges the two sorted halves back together.
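
The article's own listings are not reproduced here, but a minimal sketch of that structure in JavaScript might look like the following.  It assumes the objects are compared with a caller-supplied comparator function (for example, comparing on a lastName property); the function and property names are only illustrative, and the merge helper is sketched after the next paragraph.

    // Recursively split the array in half, sort each half, then merge the halves.
    // "compare" is a caller-supplied function such as
    //     function (a, b) { return a.lastName.localeCompare(b.lastName); }
    function mergeSort(items, compare) {
        if (items.length <= 1) {
            return items;                              // 0 or 1 items: already sorted
        }
        var middle = Math.floor(items.length / 2);
        var left = mergeSort(items.slice(0, middle), compare);
        var right = mergeSort(items.slice(middle), compare);
        return merge(left, right, compare);            // merge is sketched below
    }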

The step that merges the sorted halves back together requires the most effort.  If the total number of items in the two lists to be merged is N, the merge step requires at most N - 1 comparisons.  There are also N copy operations from the original array to a temporary array and N copies from the temporary array back to the original.  Because of this, each merge step requires at most 3N - 1 major operations.
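
A merge helper consistent with that count might look like the sketch below.  Note that this version builds and returns a new array rather than copying into a temporary array and back, so it performs N copies rather than 2N, but it makes the same at most N - 1 comparisons.

    // Merge two already-sorted arrays into one sorted array.
    // With N total items, the first while loop makes at most N - 1 comparisons,
    // and every item is pushed (copied) into the result exactly once.
    function merge(left, right, compare) {
        var result = [];
        var i = 0;
        var j = 0;
        while (i < left.length && j < right.length) {
            if (compare(left[i], right[j]) <= 0) {
                result.push(left[i++]);
            } else {
                result.push(right[j++]);
            }
        }
        // One list is exhausted; copy the remainder of the other.
        while (i < left.length)  { result.push(left[i++]); }
        while (j < right.length) { result.push(right[j++]); }
        return result;
    }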

Because Mergesort halves the array and invokes itself recursively on both halves, if N is a power of 2, that is N = 2^k, then the recursion goes k = (log base 2) of N levels deep.  If N is not a power of two, there is just one extra level, so k = 1 + the floor of (log base 2) of N.

Consider the first call to Mergesort, and then consider the two recursive calls that result from that very first call.  Each of those two calls merges N/2 items.  Because merging N items takes 3N - 1 major operations, each of the two calls requires 3 * (N/2) - 1 operations, so together they require 2 * (3 * (N/2) - 1) = 3N - 2 operations.

Now let's consider all of the recursive calls required for the algorithm to run to completion.  At level m of the recursion tree, 2^m calls to merge occur, and each of those calls merges N / 2^m items.  So, just as each call at level 1 requires 3 * (N/2) - 1 operations, each call at level m requires 3 * (N / 2^m) - 1 operations.

Hence the 2^m calls at level m together require 2^m * [ 3 * (N / 2^m) - 1 ] = 3N - 2^m operations.  This is why Mergesort has been found to have O(N * logN) efficiency.
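
To finish the count, note that the per-level total of 3N - 2^m applies at every level where merging actually happens.  Counting the very first call as level 0 and its two recursive calls as level 1, merges occur at levels 0 through k - 1, since at level k the sublists hold a single item and are simply returned.  Summing over those levels (with N = 2^k):

    sum for m = 0 to k - 1 of (3N - 2^m)  =  3N * k - (2^k - 1)
                                          =  3N * log2(N) - N + 1

The dominant term is 3N * log2(N), which is where the O(N * logN) bound comes from.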

Quicksort is a conquer-then-divide sorting algorithm.  All of the work is performed before the recursive calls, unlike the divide-and-conquer Mergesort, where all of the work is done after the recursive calls.  In Quicksort, the most effort is spent on the partitioning step.
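
Again as a rough sketch rather than the article's original listing, a Quicksort over objects built around that partitioning step might look like this, using the loeCollection / gtCollection names referred to below and the same caller-supplied comparator idea; the data in the usage example is hypothetical.

    // Partition the items around a pivot, then recurse on each partition.
    // loeCollection holds items less than or equal to the pivot,
    // gtCollection holds items greater than the pivot.
    function quickSort(items, compare) {
        if (items.length <= 1) {
            return items;
        }
        var pivot = items[0];
        var loeCollection = [];
        var gtCollection = [];
        for (var i = 1; i < items.length; i++) {       // partitioning: N - 1 comparisons
            if (compare(items[i], pivot) <= 0) {
                loeCollection.push(items[i]);
            } else {
                gtCollection.push(items[i]);
            }
        }
        return quickSort(loeCollection, compare)
            .concat([pivot], quickSort(gtCollection, compare));
    }

    // Example usage on JSON objects:
    var people = [
        { firstName: "Carl", lastName: "Zimmer" },
        { firstName: "Ada",  lastName: "Byron" }
    ];
    var byLastName = function (a, b) { return a.lastName.localeCompare(b.lastName); };
    var sorted = quickSort(people, byLastName);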

If the pivot value is the smallest item in the list, loeCollection remains empty.  This is the worst case, because gtCollection shrinks by only one item at each recursive call.  In this worst case, the first call to Quicksort makes N - 1 comparisons, the second call makes N - 2, and so on, so the total is 1 + 2 + ... + (N - 1) = N * (N - 1) / 2 comparisons.
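
As a concrete example of that worst case, suppose N = 5 and the pivot chosen at each step happens to be the smallest remaining item: the successive partitioning passes make 4 + 3 + 2 + 1 = 10 comparisons, which matches N * (N - 1) / 2 = 5 * 4 / 2 = 10.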

In addition, N * (N - 1) / 2 copy operations are required.  This is why the worst-case efficiency of Quicksort is O(N^2).  For the average case, when loeCollection and gtCollection contain nearly the same number of items at each step, fewer recursive calls occur, and as in the analysis of Mergesort you can conclude that the average-case efficiency is O(N * logN).