Algorithms

Learn about the efficiency and trade-offs of the different algorithms that you have available.

Time complexity

This should be a whole lesson on its own, but we'll try to keep it brief. In simple terms, the time complexity represents how long a specific algorithm will take. It is expressed as a function of "n", where n is the number of elements to be processed by the algorithm.

How the running time depends on the data size is the most important part of the time complexity: depending on it, doubling the data might make the algorithm take 10x longer, or exactly the same time.

Let's see it with a couple of easy examples. If you want to find out how many times the number "2" appears in this array, you can just check each element to see if it's a 2:

[7, 3, 7, 9, 4, 2, 5, 7, 2, 3, 9, 3, 0, 2];
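
As a minimal sketch of that counting pass (the `list` name is ours), you could write:

const list = [7, 3, 7, 9, 4, 2, 5, 7, 2, 3, 9, 3, 0, 2];

// Visit every element exactly once and keep the ones equal to 2
const count = list.filter(num => num === 2).length;

alert(count); // 3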

That means that the complexity of this search algorithm is \(O(n)\), where the "O" (called Big O) is the standard way of representing the complexity. Now let's say that we want to find out whether the number "2" is present at least once in the same array as above.

After reaching the first "2" you can stop searching, as the condition is satisfied. The average time for finding a number in a unique random array (the previous one is not unique) with this method would be n/2, since there's a 50% probability it's in the first half and a 50% probability it's in the second half.

However, the time complexity is always about the worst case. The number could be in the last position or not present at all, and in that situation you'd have to check all n elements. So we can conclude that the time complexity for this operation is \(O(n)\) as well.
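
A minimal sketch of that early-exit search (the built-in `some` stops at the first match; an explicit loop with a `break` would behave the same):

const list = [7, 3, 7, 9, 4, 2, 5, 7, 2, 3, 9, 3, 0, 2];

// Stops as soon as a 2 is found; in the worst case it still visits all n elements
const hasTwo = list.some(num => num === 2);

alert(hasTwo); // true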

Let's see a different example. We want to know the value of a specific item in an array or an object, so we'll access it like this:

const arr = [7, 5, 9, 1, 3, 4, 7, 8, 4, 5, 2];
const key = 3;
const value = arr[key];

This is a really fast operation. In fact, it's \(O(1)\), which is basically the fastest you can get. It is useful for workloads with many reads and few writes; you can see how something like this might back a database's key indexes, and it's how key-value stores like Redis work.
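
As a toy sketch of that index idea (the names are ours, and a real database is of course far more involved), you pay some memory up front to get \(O(1)\) reads later:

// Build a lookup object once: this costs extra memory...
const users = [
  { id: 'a1', name: 'Ana' },
  { id: 'b2', name: 'Bob' }
];

const byId = {};
users.forEach(user => {
  byId[user.id] = user;
});

// ...but every later read by id is O(1) instead of scanning the array
alert(byId['b2'].name); // "Bob"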

Trade-offs

In many situations there is a trade-off of speed vs memory size (RAM or disk). In the previous DB index example, those indexes must be stored somewhere and might sometimes grow to occupy a significant portion of the memory.

When Google's V8 engine processes JavaScript with its JIT compiler, it has to choose between optimizing for a quick startup time or for a jitter-less experience. Hopefully after you finish this lesson you'll be able to help JS engines with optimized code.

One final but really important trade-off: you should only optimize code that is actually slow. Programming speed and code legibility are features too, so before optimizing anything, first analyze where the bottleneck is.
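
For instance, a quick way of timing a suspect piece of code (this sketch is ours; `console.time` is a standard browser and Node API):

// Label a timer, run the code under suspicion, and print the elapsed time
console.time('search');
const found = [7, 3, 7, 9, 4, 2].includes(2); // stand-in for the code being measured
console.timeEnd('search'); // logs something like "search: 0.05ms"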

Sorting algorithms

We will be sorting an array of numbers from lowest to highest in all of our examples.

Selection sort

It is one of the simplest and most intuitive sorting algorithms. For this sort you repeatedly find the minimum of the unsorted array and put it in the last available position of the sorted array (the first minimum goes at the beginning).

// Initially
const unsorted = [3, 9, 1, 6];
const sorted = [];

// 1st step
const unsorted = [3, 9, 6];
const sorted = [1];

// 2nd step
const unsorted = [9, 6];
const sorted = [1, 3];

// 3rd step
const unsorted = [9];
const sorted = [1, 3, 6];

// 4th step
const unsorted = [];
const sorted = [1, 3, 6, 9];

This has, however, a big trade-off. You have to go through the whole unsorted list each time to find the smallest number, so the time complexity is \(O(n^2)\). The reason is that we have n steps, and for each step we perform up to n operations.

Let's see this example in JavaScript:

const selectionSort = unsorted => {
  const sorted = [];

  // Keep going until unsorted is empty
  while (unsorted.length) {
    // Find the index of the smallest remaining element
    let minIndex = 0;
    unsorted.forEach((num, i) => {
      if (num < unsorted[minIndex]) {
        minIndex = i;
      }
    });

    // Remove that element from unsorted and append it to sorted;
    // splice() returns an array, so take its single element
    sorted.push(unsorted.splice(minIndex, 1)[0]);
  }

  return sorted;
};

alert(selectionSort([3, 9, 1, 6]));

Bubble sort

Now for one of the most infamous ones: you might be asked to perform a bubble sort on a whiteboard in a Silicon Valley interview.

Let's get real though. Bubble sort goes through each pair of adjacent elements and swaps them if they are out of order, bubbling the largest one to the right as it goes. Let's see it as before:

// Initially
const list = [3, 9, 1, 6];

// 1st step, compare 3 and 9 and do not swap them
const list = [3, 9, 1, 6];

// 2nd step, compare 9 and 1 and swap them
const list = [3, 1, 9, 6];

// 3rd step, compare 9 and 6 and swap them
const list = [3, 1, 6, 9];

// 4th step (start of the second pass), compare 3 and 1 and swap them
const list = [1, 3, 6, 9];

This algorithm also has \(O(n^2)\) complexity. However, two algorithms with the same Big O notation might behave quite differently in practice. Let's go for the code:

const bubbleSort = list => {
  // One pass per element; after each pass the largest remaining
  // value has bubbled to its final position on the right
  for (let j = 0; j < list.length; j++) {
    for (let i = 0; i < list.length - 1; i++) {
      // Swap adjacent elements that are out of order
      if (list[i] > list[i + 1]) {
        let temp = list[i];
        list[i] = list[i + 1];
        list[i + 1] = temp;
      }
    }
  }
  return list;
};

alert(bubbleSort([3, 9, 1, 6]));
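
To illustrate how two \(O(n^2)\) algorithms can still differ, here is one common refinement (our sketch, not part of the version above): stop as soon as a full pass makes no swaps, which makes an already-sorted list take only \(O(n)\):

const bubbleSortEarlyExit = list => {
  let swapped = true;
  while (swapped) {
    swapped = false;
    for (let i = 0; i < list.length - 1; i++) {
      if (list[i] > list[i + 1]) {
        // Swap adjacent elements that are out of order
        [list[i], list[i + 1]] = [list[i + 1], list[i]];
        swapped = true;
      }
    }
    // If no swap happened in this pass, the list is already sorted
  }
  return list;
};

alert(bubbleSortEarlyExit([1, 3, 6, 9])); // a single pass, no swaps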

Insertion sort