The algorithms behind the Sort function in JavaScript

Have you ever wondered what happens when you call Array.prototype.sort()? What did the creators of JavaScript use to sort items in the language? Today we’re going to answer that with our continuing series on common sorting algorithms. But first, the answers to last article’s Daily Problem.

This article is part of the Data Structures and Algorithms Series following the course The Analysis of Algorithms. If you missed the previous article, check that out first as well as a copy of The Algorithm Design Manual.

Daily Problem

You are given a collection of n bolts of different widths, and n corresponding nuts. You can test whether a given nut and bolt fit together, from which you learn whether the nut is too large, too small, or an exact match for the bolt. The differences in size between pairs of nuts or bolts can be too small to see by eye, so you cannot rely on comparing the sizes of two nuts or two bolts directly. You are to match each bolt to each nut.

Basically this question is asking “does every bolt have a matching nut?”

  1. Give an O(n^2) algorithm to solve the above problem.

The obvious play here seems to be to test every bolt with every nut, which would look like this:

function match(nuts, bolts) {
  const pairs = [];
  // Test every nut against every bolt: n nuts × n bolts = O(n^2) tests.
  for (const nut of nuts) {
    for (const bolt of bolts) {
      if (nut.fits(bolt) === "exact match") pairs.push([nut, bolt]);
    }
  }
  return pairs;
}

There are n nuts in the outer loop, each checked against all n bolts in the nested loop, so from our previous discussion on quickly calculating Big O we can easily determine that this algorithm will run in O(n^2) time.

  2. Suppose that instead of matching all of the nuts and bolts, you wish to find the smallest bolt and its corresponding nut. Show that this can be done in only 2n − 2 comparisons.

We start out with n nuts and n bolts. Start by comparing a random nut and a random bolt. Most likely they will not be the same size, and we will get feedback on which is smaller. If one is too small for the other, we keep the smaller item and toss the larger one, since the larger one cannot belong to the smallest pair. So if the nut was too small for the bolt, throw away the bolt, grab another bolt, and do another comparison. Essentially you will be ping-ponging between nuts and bolts, throwing away everything but the last nut and the last bolt. Since there are 2n total items and each comparison eliminates one of them, eliminating everything except the final nut (1) and the final bolt (1) takes 2n − 2 comparisons.

What do we do if they are an exact match? Set one of the pair (the nut or the bolt) aside and do another comparison with a freshly drawn item. If the new nut/bolt is smaller than both, we have a new minimum candidate and we can throw away the matched pair. Either way, the new comparison gives us a signal that eliminates either the kept item or the newly drawn one.

  3. Match the nuts and bolts in expected O(n log n) time.

This was one of those Daily Problems that would benefit from reading ahead a bit. In fact, the answer to these 2 parts will become very clear once we get through a few of the algorithms from this article.

Mergesort: The divide-and-conquer algorithm behind Firefox array sorting

As I mentioned earlier, there is an algorithm that is so good it is used to power the web browser you may be viewing this blog post on. This algorithm is called mergesort, and it was the preferred algorithm for array sorting in Firefox’s JavaScript engine over heapsort.

Mergesort works by taking advantage of an essential problem-solving technique called divide-and-conquer. Practically speaking, divide-and-conquer involves breaking a problem up into smaller subproblems that are easier to solve.

The way we implement divide-and-conquer with mergesort is to use the power of recursion to halve our problem space until all that’s left is a simple two item comparison. Once we compare and sort the subarray with two items, we build those lists back up, merging these subarrays together until we have unified the whole array again, sorting our items as we build the list back up.

function merge(leftArr, rightArr) {
  const mergedArr = [];

  // While both subarrays have items, take the smaller front item each time.
  while (leftArr.length && rightArr.length) {
    if (leftArr[0] <= rightArr[0]) {
      mergedArr.push(leftArr.shift());
    } else {
      mergedArr.push(rightArr.shift());
    }
  }

  // One subarray is now empty; append whatever remains of the other.
  while (leftArr.length) mergedArr.push(leftArr.shift());
  while (rightArr.length) mergedArr.push(rightArr.shift());

  return mergedArr;
}

function mergesort(arr) {
  // Base case: an empty or single-item array is already sorted.
  if (arr.length < 2) return arr;

  const middle = Math.floor(arr.length / 2);

  return merge(
    mergesort(arr.slice(0, middle)),
    mergesort(arr.slice(middle))
  );
}
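
For example, a quick sanity check of the two functions above:

mergesort([38, 27, 43, 3, 9, 82, 10]); // → [3, 9, 10, 27, 38, 43, 82]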

Why use mergesort over heapsort

Truth be told, mergesort is a very good general sorting algorithm. Like heapsort, mergesort runs in O(n log n) time. If you have the space to copy the array into the sorted array (which you usually do if you aren’t in constrained environments like graphics programming or IoT devices), mergesort is great. So if it has the same worst-case runtime as heapsort, why would you choose it over heapsort?

Mergesort is also excellent for linked lists. The merge() operation of mergesort can be implemented without requiring extra space.
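
Here is a minimal sketch of that idea, assuming a simple { value, next } node shape (an assumption on our part, not code from the article). Nodes are relinked in place, so no new array or nodes need to be allocated:

function mergeLists(left, right) {
  const head = { value: null, next: null }; // dummy head node
  let tail = head;

  // Relink the smaller front node onto the merged list each time.
  while (left && right) {
    if (left.value <= right.value) {
      tail.next = left;
      left = left.next;
    } else {
      tail.next = right;
      right = right.next;
    }
    tail = tail.next;
  }

  // Attach whichever list still has nodes left.
  tail.next = left || right;
  return head.next;
}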

Mergesort is stable. Equal items keep their relative order, because elements are only reordered during the merge step and ties are always taken from the left subarray first. Heapsort is not stable: since we’re using a priority queue, the structure is constantly being rearranged to keep the maximum/minimum value at the root of the heap, which scrambles the original order of equal items.

Quicksort: The randomization algorithm powering Chrome’s V8 and IE’s Chakra

It turns out there are other excellent algorithms for sorting. One such algorithm uses the power of randomization to arbitrarily find a pivot point to sort against. We can use that pivot point recursively just like we do with mergesort to continuously sort smaller and smaller problems until the entire list of items has been sorted. This algorithm is called quicksort, and it’s used by both Google and Microsoft for sorting.

Quicksort and mergesort have quite a bit in common. They both build a recursion tree that reaches all the way down to the individual items in the list, and each level of that tree takes linear time to process. That means they both generally run in O(nh), where h is equal to the height of the recursion tree that is generated.

Where they differ lies in the randomization of the pivot point. Whereas with mergesort we always subdivide our arrays evenly, in quicksort that point is never known until the function is run.

In the best case, we pivot on the median element, halving the array each time; the tree has height log n, giving a runtime of O(n log n), on par with heapsort and mergesort. In the worst case, we pivot on the largest or smallest element every time, producing a completely unbalanced tree that only peels off one item per level, giving a runtime of O(n^2), on par with selection sort.

So if the worst case is so bad, why would we bother using quicksort? The answer lies in the average case and random sampling.

With a deterministic pivot (always picking, say, the first or last element), the best case only happens when that element is the exact median of the list. If the data happens to be arranged badly (already sorted, for example), you could be generating a badly unbalanced recursion tree.

Similarly, with adversarially constructed data, we know we can find a configuration that produces either the best- or worst-case scenario. But since unsorted data can arrive in any order, how does quicksort help us?

With quicksort, any pivot in the middle half of the list (e.g. for [1,2,3,4], that’s 2 or 3) is good enough to avoid the worst-case runtime. The reason is that a pivot somewhere in the middle still generates a recursion tree of logarithmic height (specifically, about 2n ln n comparisons on average), the kind of speed we’re looking for.

To solidify an O(n log n) runtime, we grab a random permutation of our initial list, which takes O(n) time. And since randomly ordered data sorts in O(n log n) time on average, adding in this randomization step gives us an expected Θ(n log n) runtime on any input.

function partition(arr, pivotIndex, lower, higher) {
  const pivotPoint = arr[pivotIndex];
  let partitionIndex = lower;

  // Move everything smaller than the pivot to the front of the range.
  for (let i = lower; i < higher; i++) {
    if (arr[i] < pivotPoint) {
      [arr[i], arr[partitionIndex]] = [arr[partitionIndex], arr[i]];
      partitionIndex++;
    }
  }

  // Swap the pivot into its final sorted position.
  [arr[pivotIndex], arr[partitionIndex]] = [arr[partitionIndex], arr[pivotIndex]];

  return partitionIndex;
}

function quicksort(arr, lower = 0, higher = arr.length - 1) {
  // Base case: a range of zero or one items is already sorted.
  if (lower >= higher) return arr;

  const pivotIndex = higher;
  const partitionIndex = partition(arr, pivotIndex, lower, higher);

  quicksort(arr, lower, partitionIndex - 1);
  quicksort(arr, partitionIndex + 1, higher);

  return arr;
}
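
Here is a minimal sketch of the randomization step described above: a Fisher–Yates shuffle applied before sorting, so the input order can’t force the worst case. The shuffle and randomizedQuicksort names are our own, not part of the original code.

function shuffle(arr) {
  // Fisher–Yates shuffle: O(n) time to produce a random permutation.
  for (let i = arr.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [arr[i], arr[j]] = [arr[j], arr[i]];
  }
  return arr;
}

function randomizedQuicksort(arr) {
  return quicksort(shuffle(arr), 0, arr.length - 1);
}

randomizedQuicksort([5, 3, 8, 4, 9, 1, 6, 2, 7]); // → [1, 2, 3, 4, 5, 6, 7, 8, 9]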

Why use quicksort

Now we have three algorithms with an expected runtime of O(n log n). Why would we choose quicksort?

Quicksort performs in-place sorting. Like heapsort, quicksort requires no extra space.

Mergesort requires an additional array to copy your merged items into. If you have space constraints, quicksort beats mergesort.

Quicksort beats mergesort for arrays.

Since quicksort takes advantage of random access, we already know that the constant-time access of arrays by index makes it very easy to use a random number and retrieve an item quickly. With linked lists, the random access no longer pays off, since all items must be accessed from either the front or the back of the list.

Quicksort is great for sorting arrays with hundreds of thousands of unique keys.

In fact, when benchmarked against 100,000 unique keys or more, quicksort outperformed mergesort and heapsort by 1.5-2x.

Revisiting the Daily Problem

With these algorithms at our disposal, the Daily Problem becomes feasible to answer beyond the naive first part.

Match the nuts and bolts in expected O(n log n) time.

With our randomized quicksort algorithm in place, the answer falls out of the same idea. We can’t compare nuts to nuts or bolts to bolts directly, so instead of sorting each pile on its own, we pick a random bolt as a pivot and partition the nuts around it, finding the one nut that matches it exactly along the way. That matching nut then partitions the bolts, and we recurse on the smaller and larger halves. Each call matches one pair (the pivot bolt and its nut), and because the pivot is random, the whole process runs in expected O(n log n) time, just like randomized quicksort. If this still doesn’t make sense, check out an in-depth analysis of randomized quicksort.
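
As a rough sketch of that idea, using the same hypothetical nut.fits(bolt) test as the first code sample, assumed here to return "too small", "too large", or "exact match" for the nut:

function matchNutsAndBolts(nuts, bolts) {
  if (nuts.length === 0) return [];

  // Pick a random bolt as the pivot.
  const pivotBolt = bolts[Math.floor(Math.random() * bolts.length)];

  // Partition the nuts against the pivot bolt, finding its exact match.
  const smallerNuts = [], largerNuts = [];
  let pivotNut = null;
  for (const nut of nuts) {
    const result = nut.fits(pivotBolt);
    if (result === "exact match") pivotNut = nut;
    else if (result === "too small") smallerNuts.push(nut);
    else largerNuts.push(nut);
  }

  // Use the matching nut to partition the remaining bolts the same way.
  const smallerBolts = [], largerBolts = [];
  for (const bolt of bolts) {
    if (bolt === pivotBolt) continue;
    // If the pivot nut is too large for this bolt, the bolt is smaller than the pivot.
    if (pivotNut.fits(bolt) === "too large") smallerBolts.push(bolt);
    else largerBolts.push(bolt);
  }

  return [
    [pivotNut, pivotBolt],
    ...matchNutsAndBolts(smallerNuts, smallerBolts),
    ...matchNutsAndBolts(largerNuts, largerBolts),
  ];
}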

Bucketsort: Sorting via distribution

With three sorting algorithms in our optimal band of O(n log n) already, how could we possibly get any better? Consider the following example:

Rather than a list of numbers, you now have a list of names, like in your iPhone. How do you find a name in your phone? If you scroll really quickly, you’ll see a letter highlighting roughly which letter of the alphabet you’re in. From there you’ll pretty much go through the whole alphabet starting with the second letter of the name until you find the person you’re looking for. This is exactly how bucketsort works (also known as distribution sort or binsort).

Bucketsort is great for evenly-distributed chunks of similar data. If we used bucketsort in the above example, we would first partition the whole phone book into 26 subarrays (well maybe a few less because letters like X and Z have far fewer names than M or S).

I hope you’re seeing a theme here. Once again we are subdividing the problem; instead of dividing equally in halves or by a random pivot, we split a list of n items into k logical partitions based on how the data is arranged. Now, obviously we can’t implement a fully generic bucketsort in JavaScript because the bucketing step depends heavily on the dataset. Bucketsort for last names will look very different from bucketsort on tabular accounting data for a business.
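
That said, here is a minimal sketch of what bucketsort could look like for the phone-book example above, bucketing names by their first letter and then sorting each bucket. It is specific to this dataset, not a general-purpose bucketsort, and the bucketsortNames name is our own.

function bucketsortNames(names) {
  const buckets = new Map();

  // Distribute each name into the bucket for its first letter.
  for (const name of names) {
    const letter = name[0].toUpperCase();
    if (!buckets.has(letter)) buckets.set(letter, []);
    buckets.get(letter).push(name);
  }

  // Sort each (small) bucket and concatenate them in letter order.
  let sorted = [];
  for (const letter of [...buckets.keys()].sort()) {
    sorted = sorted.concat(buckets.get(letter).sort());
  }
  return sorted;
}

bucketsortNames(["Skiena", "Adams", "Zhang", "Baker", "Avery"]);
// → ["Adams", "Avery", "Baker", "Skiena", "Zhang"]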

Binary search: The algorithm that powers BSTs

Now that we have a gaggle of sorting algorithms to choose from, what can we do with this sorted data to ensure we’re still operating efficiently? Binary search is the O(log n) technique for searching sorted data.

All we have to do is start with the middle element. If the middle element isn’t our target, we ask “is our target item less than or greater than the middle element?” We then choose a side and repeat the process until our target element is found:

function binarySearch(arr, target, lower = 0, higher = arr.length - 1) {
  // Base case: the search range is empty, so the target isn't in the array.
  if (lower > higher) return -1;

  const middle = Math.floor((lower + higher) / 2);

  if (arr[middle] === target) return middle;
  if (arr[middle] < target) {
    return binarySearch(arr, target, middle + 1, higher);
  } else {
    return binarySearch(arr, target, lower, middle - 1);
  }
}
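
For example:

binarySearch([1, 3, 5, 7, 9, 11], 7); // → 3 (the index of 7)
binarySearch([1, 3, 5, 7, 9, 11], 4); // → -1 (not found)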

What happens if you have an unbounded sorted array? For example, we can create an unbounded array using ES6 generators. The Fibonacci sequence is unbounded and can be lazily evaluated step by step using a generator. How would we find an item within such an array?

Meta binary search, or one-sided binary search, to the rescue! This search is at most twice as slow because it requires two log n phases. The first phase finds bounds that contain our element: we start by checking a single element, then two elements out, then four, and so on, doubling the size of our bounded subarray until we find a boundary that contains our element. Once we have that boundary, we can proceed with a normal binary search inside it.
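
Here is a rough sketch of that idea over a lazily generated Fibonacci sequence. The fibonacci generator, the valueAt cache, and the unboundedSearch name are our own, not from the article.

function* fibonacci() {
  let [a, b] = [1, 1];
  while (true) {
    yield a;
    [a, b] = [b, a + b];
  }
}

function unboundedSearch(gen, target) {
  const cache = [];
  // Pull values from the generator lazily and memoize them by index.
  const valueAt = (i) => {
    while (cache.length <= i) cache.push(gen.next().value);
    return cache[i];
  };

  // First log n phase: double the upper bound until it reaches the target.
  let bound = 1;
  while (valueAt(bound) < target) bound *= 2;

  // Second log n phase: ordinary binary search inside [bound / 2, bound].
  let lower = Math.floor(bound / 2);
  let higher = bound;
  while (lower <= higher) {
    const middle = Math.floor((lower + higher) / 2);
    if (valueAt(middle) === target) return middle;
    if (valueAt(middle) < target) lower = middle + 1;
    else higher = middle - 1;
  }
  return -1; // target isn't in the sequence
}

unboundedSearch(fibonacci(), 55); // → 9 (the index of 55 in the sequence)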

Bisection method / numerical analysis

Do you have to divide things in half? The bisection method (which you can read more about here) is a way of finding the root of a function by repeatedly bisecting an interval until the target is found. In other words, it is a more generalized form of binary search, and it can be used to compute the square root, cube root, or any other root of a number.

Numerical analysis is deeply rooted in mathematics. If you’re doing graphics programming or machine learning, you may find yourself utilizing the bisection method.
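
As a minimal sketch, here is the bisection method used to approximate the square root of 2 by finding the root of f(x) = x² − 2 on the interval [1, 2]. The bisect helper is our own example, not from the article.

function bisect(f, lower, higher, tolerance = 1e-10) {
  while (higher - lower > tolerance) {
    const middle = (lower + higher) / 2;
    // Keep whichever half still contains a sign change (and therefore a root).
    if (f(lower) * f(middle) <= 0) higher = middle;
    else lower = middle;
  }
  return (lower + higher) / 2;
}

bisect((x) => x * x - 2, 1, 2); // ≈ 1.4142135623...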

Bonus sorts: Shellsort & Radix Sort

I wanted to cover two other sorting algorithms really quickly because they’re super interesting and are great advanced algorithms for specific situations. They aren’t really covered in The Algorithm Design Manual, but they do have practical applications.

Shellsort is like an optimized version of either bubble sort (where you continuously swap smaller elements with larger elements) or insertion sort (where you insert unsorted elements into the sorted portion of an array), depending on how you look at it. Shellsort sorts in intervals known as gaps and iteratively reduces the gap size until the whole array is sorted.

For example, for an array containing a random ordering of the natural numbers (integers greater than 0) up to 100, shellsort might first sort the elements that sit 5 positions apart (positions 5, 10, 15, 20, … 100), then those 3 positions apart (positions 3, 6, 9, … 99), and that gap interval would keep shrinking until it reached 1. But by the time it was sorting every element, the elements 5 apart were already in order, as were the elements 3 apart, leaving only a fraction of the actual work left to do.

Like bucketsort, shellsort requires a custom gap sequence depending on the application of your sorting algorithm. You can also choose either insertion sort or bubble sort as the inner function to perform on your gap subset. You can find an example here, but like I said, it’s difficult to really nail in the general sense since it is highly dependent on the data you are sorting.
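
Still, here is a minimal sketch of shellsort using the simple “halve the gap each pass” sequence with an insertion-sort inner loop (other gap sequences, like Knuth’s or Ciura’s, generally perform better in practice):

function shellsort(arr) {
  for (let gap = Math.floor(arr.length / 2); gap > 0; gap = Math.floor(gap / 2)) {
    // Insertion sort, but comparing elements that are `gap` positions apart.
    for (let i = gap; i < arr.length; i++) {
      const current = arr[i];
      let j = i;
      while (j >= gap && arr[j - gap] > current) {
        arr[j] = arr[j - gap];
        j -= gap;
      }
      arr[j] = current;
    }
  }
  return arr;
}

shellsort([23, 12, 1, 8, 34, 54, 2, 3]); // → [1, 2, 3, 8, 12, 23, 34, 54]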

Radix sort (of which there are actually two) is arguably one of the fastest sorting algorithms known. If you really dove into the benchmarking link I posted above, it outperforms every other sorting algorithm we’ve mentioned here so far. Why is that the case?

At the end of the day everything is binary. Integers, strings, floating-point numbers, and everything else on a computer is represented as binary numbers. Because of that, sorting objects by their binary representation is a really convenient way to sort them. It’s also why radix sort is really two algorithms: you can start sorting from the least-significant digit (LSD) or the most-significant digit (MSD) of the object, now represented as a number.
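
Here is a minimal sketch of the LSD variant for non-negative integers, bucketing by one decimal digit per pass (a real implementation would typically work on binary digits or bytes instead):

function radixSort(arr) {
  const maxValue = Math.max(...arr, 0);

  // One pass per decimal digit, starting from the least-significant digit.
  for (let digit = 1; digit <= maxValue; digit *= 10) {
    const buckets = Array.from({ length: 10 }, () => []);
    for (const num of arr) {
      buckets[Math.floor(num / digit) % 10].push(num);
    }
    arr = buckets.flat();
  }
  return arr;
}

radixSort([170, 45, 75, 90, 802, 24, 2, 66]); // → [2, 24, 45, 66, 75, 90, 170, 802]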

One great example of using radix sort (specifically the MSD version) would be to find the occurrence of a word using tries to construct a word dictionary and then depth-first search to find the word you are looking for, starting with the first (or most significant) letter.

There are lots of other sorting algorithms that you can explore. As we move on from simple data structures and sorting to graphs and advanced tree searching, we’ll demystify what exactly things like tries and depth-first search are. But this section was exhausting, so let’s end with the Daily Problem and call it a day for now.

Even more practice

If you just can’t get enough of sorting problems, here are some more problems from the book to flex your sorting muscles:

  1. 4-20
  2. 4-22
  3. 4-24

