Intermediate Sorting Algorithms

Objectives

By the end of this chapter you should be able to:

  • Define the limitations of simpler sorting algorithms like bubble, insertion, and selection sort
  • Understand and implement merge sort
  • Understand and implement quick sort
  • Compare and contrast faster sorting algorithms like merge and quick sort

Introduction

In the previous chapter, we learned about three algorithms for sorting an array of numbers in JavaScript: bubble sort, insertion sort, and selection sort. These algorithms are relatively beginner-friendly, but they're also not the most efficient: all of them typically run in O(n²) time.

The algorithms we'll learn about in this chapter are more efficient. However, this efficiency comes at a cost: the algorithms are more complicated and tend to take more time to understand. But with time and after working out some examples, you should be able to explain merge sort and quick sort, in addition to the algorithms you already know!

Merge Sort

A more complex algorithm for sorting an array of elements is merge sort. Before we see the algorithm, let's visualize how it works:

[merge sort GIF]

As you can see, the algorithm involves splitting the array into smaller subarrays. To be more precise, the algorithm is as follows:

  • Break up the array into halves repeatedly until you're left with subarrays containing at most one element
  • Once you have these small sorted subarrays, merge them back together in pairs until you are back at the full length of the array
  • Once the array has been merged back together, return the merged (and sorted!) array

Here is a nice GIF of the algorithm in action; you can also see an excellent visualization here.

[merge sort GIF]

So how does the algorithm perform in terms of its time complexity? Once the array has been broken down into one-element subarrays, it takes O(n) comparisons to get two-element merged subarrays. From there, it takes O(n) comparisons to get four-element merged subarrays, and so on. In total it takes O(log(n)) sets of O(n) comparisons, since log(n) (base 2) roughly counts how many times you can divide n by 2 before reaching 1. Therefore, the time complexity for merge sort is O(n log(n)), which is significantly better than the complexity of bubble, insertion, and selection sort!
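
For example, an 8-element array can be halved three times, so merge sort performs 3 rounds of roughly 8 comparisons each: eight 1-element subarrays merge into four 2-element subarrays, those merge into two 4-element subarrays, and those merge into the final sorted 8-element array, for about 3 × 8 = 24 comparisons overall.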

Even in the best case, merge sort is O(n log(n)). In the worst case, it's O(n log(n)) too. Basically, whenever you think about merge sort, you should think O(n log(n)).

When trying to implement merge sort, it's helpful to first write a function that takes two sorted arrays and merges them together. Merge sort then works by splitting the array in half, recursively calling itself on each half, and then using the merge helper to merge the two sorted halves back together.
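
To make this concrete, here's one possible sketch in JavaScript. The helper name merge and the exact structure are just one way of organizing the code, not the only valid implementation:

// Merges two arrays that are each already sorted into a single sorted array.
function merge(left, right) {
  var result = [];
  var i = 0;
  var j = 0;
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) {
      result.push(left[i]);
      i++;
    } else {
      result.push(right[j]);
      j++;
    }
  }
  // At most one of the two arrays still has leftover elements; copy them over.
  while (i < left.length) {
    result.push(left[i]);
    i++;
  }
  while (j < right.length) {
    result.push(right[j]);
    j++;
  }
  return result;
}

function mergeSort(arr) {
  // Base case: arrays with zero or one element are already sorted.
  if (arr.length <= 1) return arr;
  var mid = Math.floor(arr.length / 2);
  var left = mergeSort(arr.slice(0, mid));
  var right = mergeSort(arr.slice(mid));
  return merge(left, right);
}

mergeSort([10, 24, 76, 73, 72, 1, 9]); // [1, 9, 10, 24, 72, 73, 76]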

Quick Sort

Another more complex algorithm to use for sorting is quicksort. Unfortunately, quicksort is not the most intuitive of algorithms and has a wide range of implementations. Before continuing, check out this great video from Computerphile for a quick introduction to how quicksort works.

The algorithm is as follows:

  • Pick an element in the array and designate it as the "pivot". There are quite a few options for choosing the pivot; we'll keep things simple to start and choose the first element. This is not an ideal choice, but it makes the algorithm easier to understand for now.
  • Next, compare every other element in the array to the pivot.
    • If it's less than the pivot value, move it to the left of the pivot.
    • If it's greater, move it to the right.
  • Once you have finished comparing, the pivot will be in its correct, final position.
  • Next, recursively apply quicksort to the subarrays to the left and to the right of the pivot until the entire array is sorted.

Here is a nice GIF of the algorithm in action; you can also see an excellent visualization here.

[quicksort GIF]

Like merge sort, quicksort typically runs at O(n log(n)), and even in the best case is O(n log(n)). But in the worst case (for instance, if the pivot is always chosen to be the leftmost element and the array is already sorted), the runtime degrades to O(n²).
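
To see why, suppose the array is already sorted and we always pick the first element as the pivot. Every partition step leaves all of the remaining elements on one side of the pivot, so each recursive call only shrinks the problem by a single element. That works out to roughly n + (n - 1) + (n - 2) + ... + 1 comparisons, which is about n²/2, or O(n²).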

Also like merge sort, it's easiest to implement quicksort with the aid of a helper function. This function is responsible for taking an array, setting the pivot value, and mutating the array so that all values less than the pivot wind up to the left of it, and all values greater than the pivot wind up to the right of it. It's also helpful if this helper returns the index of where the pivot value winds up.

For example, if we call this function pivot, and if we decide the pivot will always be the first element in the array, it should behave in the following way:

var arr = [4, 2, 5, 3, 6];
pivot(arr); // 2
arr; // [3, 2, 4, 5, 6];

In this code, the specifics of how the arr variable gets mutated are not important. All that matters is that 4 winds up at index 2, with 3 and 2 to the left of it (in any order), and with 5 and 6 to the right of it (in any order).
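
With that behavior in mind, here is one possible sketch of pivot and quickSort in JavaScript. This is just one reasonable implementation (it swaps smaller values toward the front, then swaps the pivot into place); it happens to reproduce the example above, but other valid versions could arrange the left and right values differently:

// Rearranges the elements between start and end around arr[start] (the pivot)
// and returns the index where the pivot ends up.
function pivot(arr, start = 0, end = arr.length - 1) {
  var pivotValue = arr[start];
  var swapIndex = start;
  for (var i = start + 1; i <= end; i++) {
    if (arr[i] < pivotValue) {
      // Swap the smaller value toward the front, just after the last small value.
      swapIndex++;
      var temp = arr[swapIndex];
      arr[swapIndex] = arr[i];
      arr[i] = temp;
    }
  }
  // Move the pivot from the start into its final position.
  var pivotTemp = arr[start];
  arr[start] = arr[swapIndex];
  arr[swapIndex] = pivotTemp;
  return swapIndex;
}

function quickSort(arr, left = 0, right = arr.length - 1) {
  if (left < right) {
    var pivotIndex = pivot(arr, left, right);
    quickSort(arr, left, pivotIndex - 1);   // sort the values smaller than the pivot
    quickSort(arr, pivotIndex + 1, right);  // sort the values larger than the pivot
  }
  return arr;
}

quickSort([4, 2, 5, 3, 6]); // [2, 3, 4, 5, 6]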

Exercises

As with the other sorting algorithms we've studied, the best way to try to understand merge sort and quicksort is to implement them yourself. Complete Part II of the exercises here.

When you're ready, move on to Arrays Revisited.

