Algorithms

Learn about the efficiency and trade-offs of the different algorithms that you have available.

Time complexity

This should be a whole lesson on its own, but we'll try to keep it brief. In simple terms, the time complexity represents how long a specific algorithm takes as its input grows. It is expressed as a function of "n", where n is the number of elements to be processed by the algorithm.

This dependency on the data size is the most important part of the time complexity, because depending on it, doubling the data might make the algorithm take ten times longer or no longer at all. For example, doubling the input of an \(O(n)\) algorithm doubles its running time, doubling the input of an \(O(n^2)\) algorithm quadruples it, and an \(O(1)\) algorithm takes the same time regardless.

Let's see it with a couple of easy examples. If you want to find out how many times the number "2" appears in this array, you can just check each element to see if it's a 2:

[7, 3, 7, 9, 4, 2, 5, 7, 2, 3, 9, 3, 0, 2];
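
For instance, a linear count over that array might look like this (a minimal sketch; the names are illustrative):

const haystack = [7, 3, 7, 9, 4, 2, 5, 7, 2, 3, 9, 3, 0, 2];

// Check every element exactly once, so the work grows linearly with the length
let count = 0;
for (const item of haystack) {
  if (item === 2) count++;
}
console.log(count); // 3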

That means that the complexity of this counting algorithm is \(O(n)\), where the "O" (called Big O) is the standard way of representing complexity. Now let's say that we want to find out whether the number "2" is present at least once in the same array as above.

After reaching the first "2" you can stop, since the condition is satisfied. Following this method, the average time to find a number in a random array of unique elements (the previous one is not unique) would be n/2, since there's a 50% probability that it's in the first half and a 50% probability that it's in the second half.

However, the time complexity usually describes the worst case. The number could be in the last position or not present at all, and in that situation you'd have to check all n elements. So we can conclude that the time complexity of this operation is \(O(n)\) as well.
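
An early-exit version might look like this (a sketch; in practice Array.prototype.includes does the same job):

const haystack = [7, 3, 7, 9, 4, 2, 5, 7, 2, 3, 9, 3, 0, 2];

// Stop at the first match; in the worst case we still visit all n elements
let found = false;
for (const item of haystack) {
  if (item === 2) {
    found = true;
    break; // early exit: no need to keep scanning
  }
}
console.log(found); // true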

Let's see a different example. We want to know what is the value of a specific item in an array or an object, so we'll access it like this:

const arr = [7, 5, 9, 1, 3, 4, 7, 8, 4, 5, 2];
const key = 3;
const value = arr[key];

This is a really fast operation. In fact, it's \(O(1)\), which is basically the fastest you can get. Constant-time access is useful for workloads with many reads and few writes; you can see how something like this might back a database's key indexes, and it's how things like Redis work.
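
As a small illustration of that idea, a Map (or a plain object) can act as a tiny in-memory key index where each read by key is \(O(1)\); the data here is made up:

// Reads by key are O(1), just like arr[key] above
const index = new Map([
  ['alice', 7],
  ['bob', 3]
]);
console.log(index.get('bob')); // 3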

Trade-offs

In many situations there is a trade-off between speed and memory (RAM or disk). In the previous DB index example, those indexes must be stored somewhere and can sometimes grow to occupy a significant portion of the memory.

When Google's V8 engine processes JavaScript with its JIT compiler, it has to choose between optimizing for a quick boot-up time or a jitter-free experience. Hopefully, after you finish this lesson you'll be able to help JS engines by writing code they can optimize.

One final but really important trade-off: you should only optimize the code that is slow. Development speed and code legibility are features too, so before optimizing anything, first analyze where the bottleneck is.

Sorting algorithms

We will be sorting an array of numbers from lowest to highest in all of our examples.
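
For reference, JavaScript's built-in sort can already do this; note the numeric comparator, since the default comparison sorts values as strings:

const numbers = [3, 9, 1, 6];
numbers.sort((a, b) => a - b); // the default sort compares as strings, so a numeric comparator is needed
console.log(numbers); // [1, 3, 6, 9]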

Selection sort

It is one of the simplest and most intuitive sorting algorithms. For this sort you repeatedly find the lowest number in the unsorted array and append it to the end of the sorted array built so far (which starts out empty):

// Initially
const unsorted = [3, 9, 1, 6];
const sorted = [];

// 1st step
const unsorted = [3, 9, 6];
const sorted = [1];

// 2nd step
const unsorted = [9, 6];
const sorted = [1, 3];

// 3rd step
const unsorted = [9];
const sorted = [1, 3, 6];

// 4th step
const unsorted = [];
const sorted = [1, 3, 6, 9];

This has, however, a big trade-off: you have to go through the whole unsorted list each time to find the smallest number, so the time complexity is \(O(n^2)\). The reason is that we have n steps, and for each step we perform up to n operations (n + (n-1) + ... + 1 = n(n+1)/2 comparisons in total, which grows as \(n^2\)).
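
Putting the steps above into code might look like this (a minimal sketch; selectionSort is an illustrative name):

const selectionSort = list => {
  const unsorted = [...list]; // copy so we don't mutate the input
  const sorted = [];
  while (unsorted.length) {
    // Find the index of the smallest remaining element: an O(n) scan
    let lowest = 0;
    for (let i = 1; i < unsorted.length; i++) {
      if (unsorted[i] < unsorted[lowest]) lowest = i;
    }
    // Move it to the end of the sorted array so far
    sorted.push(unsorted.splice(lowest, 1)[0]);
  }
  return sorted;
};

console.log(selectionSort([3, 9, 1, 6])); // [1, 3, 6, 9]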

Bubble sort
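
Bubble sort repeatedly walks through the list comparing adjacent pairs and swapping them when they are out of order, so the largest remaining value "bubbles up" to the end on each pass. Like selection sort it is \(O(n^2)\). A minimal sketch:

const bubbleSort = list => {
  const arr = [...list]; // copy so we don't mutate the input
  for (let pass = 0; pass < arr.length - 1; pass++) {
    // After each pass, the largest unsorted value has bubbled to the end
    for (let i = 0; i < arr.length - 1 - pass; i++) {
      if (arr[i] > arr[i + 1]) {
        [arr[i], arr[i + 1]] = [arr[i + 1], arr[i]]; // swap the adjacent pair
      }
    }
  }
  return arr;
};

console.log(bubbleSort([3, 9, 1, 6])); // [1, 3, 6, 9]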

Insertion sort
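
Insertion sort builds the sorted array one element at a time: it takes the next unsorted element and shifts it left into its correct position among the already-sorted elements, much like sorting a hand of cards. It is also \(O(n^2)\) in the worst case, though it's fast on nearly-sorted data. A minimal sketch:

const insertionSort = list => {
  const arr = [...list]; // copy so we don't mutate the input
  for (let i = 1; i < arr.length; i++) {
    const current = arr[i];
    // Shift larger sorted elements one slot right to make room
    let j = i - 1;
    while (j >= 0 && arr[j] > current) {
      arr[j + 1] = arr[j];
      j--;
    }
    arr[j + 1] = current; // insert into its correct position
  }
  return arr;
};

console.log(insertionSort([3, 9, 1, 6])); // [1, 3, 6, 9]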