Basic Algorithms 101

When we start programming, the first things we learn are usually very simple algorithms, and although at that moment they may seem trivial, they are the foundation of everything that comes later.

Whether we are looking at how a button is displayed or at the latest trend in the framework of the day (we'll talk about frameworks later), algorithms are the kind of knowledge we must push to the limit.

"What differentiates the masters from the amateurs is the unrivalled mastery of the fundamentals".

If you want to join one of the big four companies (Google, Facebook, Amazon, Microsoft), they will ask you for deep knowledge of algorithms. It is not because they are fancy or because they like you to solve weird problems, but because their systems handle high traffic and demand high availability: they cannot afford a process that takes too long to run or consumes too many resources.

The problem is this: what happens when you have very specific requirements, or your feature is rejected because it adds 7 milliseconds per request, and all you find on the internet are nested loops and the like? People will try to help you with the best of intentions, but the only one responsible for winning that contest is you.

All you have to do to get through that challenge is remember your lessons with the old sage about algorithms.

If you don't know what an algorithm is, you're in the right place, because I'm going to explain it.

What is an algorithm?

An algorithm is a set of operations or instructions that allows you to solve a problem, whatever its nature: it could be making a cup of coffee, leaving your house, or sorting a set of integers from smallest to largest.
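To make that last case concrete, here is a minimal sketch of sorting integers from smallest to largest with insertion sort (the function name `insertionSort` is mine, not from the article):

```javascript
// Sorts an array of integers from smallest to largest (insertion sort).
// Each element is shifted to the left until it sits after a smaller one.
function insertionSort(arr) {
  for (var i = 1; i < arr.length; i++) {
    var current = arr[i]
    var j = i - 1
    // Shift larger elements one position to the right
    while (j >= 0 && arr[j] > current) {
      arr[j + 1] = arr[j]
      j--
    }
    arr[j + 1] = current
  }
  return arr
}

console.log(insertionSort([5, 2, 9, 1])) // [ 1, 2, 5, 9 ]
```

Every step here is explicit: compare, shift, insert. That is all an algorithm is.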

When we sit down to design an algorithm, it is our task to define the series of steps to be followed in order to accomplish our task.

So far so good; anyone can design an algorithm. Now let's add a brief dose of reality: the algorithm must not only solve the problem, it must solve it efficiently.

How do you know how efficient an algorithm is?

We have already written a program that solves something, now I want to know how to measure its efficiency:

  • Number of lines of code? NO.
  • Amount of time it takes to run? NO.

The number of lines of code has no effect on the performance of a program, and the execution time is closely tied to the computer running it: testing software on a PC from the 90's is not the same as testing it on a current one.

Normally the "Big O" notation is used, which treats machine-dependent factors (such as the number of machine instructions per operation, or how many instructions a specific computer executes per second) as constants, and describes only how the running time grows with the input size.

There is a lot of theory to digest about how to deduce the Order of a function and I am not going to wade into those murky waters, so for the moment I will just list common Orders.

– O(1) -> var a = arr[3]

– O(log n) -> Divide the problem in half until the solution is found

 for (var i = 100; i > 0; i = Math.floor(i / 2)) { console.log(i) }
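The classic real-world O(log n) case is binary search over a sorted array; a minimal sketch (the name `binarySearch` is mine):

```javascript
// Binary search: O(log n). Each step discards half of the
// remaining range, so a sorted array of n elements needs at
// most about log2(n) comparisons.
function binarySearch(arr, target) {
  var lo = 0
  var hi = arr.length - 1
  while (lo <= hi) {
    var mid = Math.floor((lo + hi) / 2)
    if (arr[mid] === target) return mid
    if (arr[mid] < target) lo = mid + 1 // discard the left half
    else hi = mid - 1                   // discard the right half
  }
  return -1 // not found
}

console.log(binarySearch([1, 3, 5, 7, 9], 7)) // 3
```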

 – O(n) -> Will execute at most n times

var a = 9
var n = 10

for (var i = 0; i < n; i++) {
  if (i === a) {
    console.log('YEAH, FOUND IT')
  }
}

– O(n log n) -> An O(log n) instruction inside an O(n) loop
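The textbook O(n log n) case is an efficient sort such as merge sort: the array is split in half O(log n) times, and each level does O(n) work merging the halves back together. A sketch (the name `mergeSort` is mine):

```javascript
// Merge sort: O(n log n). Recursively split, then merge the
// two sorted halves back into one sorted array.
function mergeSort(arr) {
  if (arr.length <= 1) return arr // base case: already sorted
  var mid = Math.floor(arr.length / 2)
  var left = mergeSort(arr.slice(0, mid))
  var right = mergeSort(arr.slice(mid))
  // Merge: repeatedly take the smaller front element
  var result = []
  var i = 0, j = 0
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) result.push(left[i++])
    else result.push(right[j++])
  }
  return result.concat(left.slice(i)).concat(right.slice(j))
}

console.log(mergeSort([8, 3, 5, 1])) // [ 1, 3, 5, 8 ]
```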

– O(n^2) -> An O(n) instruction inside another O(n) loop

var n = 100

for (var i = 0; i < n; i++) {
  for (var j = 0; j < n; j++) {
    console.log(i, j)
  }
}

And there are more extreme cases such as O(n^3), O(2^n), O(n^n) and O(n!), but I will set them aside as they are of very limited use.

There is one thing I would like to clarify: "If we know that there is going to be little data and that is NEVER GOING TO CHANGE, it doesn't matter what order of algorithm we use".

Paradigms

Brute force

It consists of listing all possible solutions to a problem.
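A minimal brute-force sketch: find every pair in an array that sums to a target by checking all combinations, with no shortcuts (the name `pairsThatSum` is mine):

```javascript
// Brute force: O(n^2). Enumerate every possible pair and keep
// the ones that satisfy the condition.
function pairsThatSum(arr, target) {
  var pairs = []
  for (var i = 0; i < arr.length; i++) {
    for (var j = i + 1; j < arr.length; j++) {
      if (arr[i] + arr[j] === target) pairs.push([arr[i], arr[j]])
    }
  }
  return pairs
}

console.log(pairsThatSum([1, 2, 3, 4], 5)) // [ [ 1, 4 ], [ 2, 3 ] ]
```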

Greedy

The locally optimal choice is made at each step, without considering future consequences.
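A classic greedy sketch is making change: always take the largest coin that still fits, never looking ahead. This happens to give the optimal answer for "canonical" coin systems like [25, 10, 5, 1] (the name `greedyChange` is mine):

```javascript
// Greedy change-making: at each step grab the biggest coin that
// does not exceed the remaining amount. Assumes coins are sorted
// from largest to smallest.
function greedyChange(amount, coins) {
  var result = []
  for (var i = 0; i < coins.length; i++) {
    while (amount >= coins[i]) {
      result.push(coins[i]) // locally optimal choice
      amount -= coins[i]
    }
  }
  return result
}

console.log(greedyChange(63, [25, 10, 5, 1])) // [ 25, 25, 10, 1, 1, 1 ]
```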

Divide and conquer

Dividing the problem into smaller parts until the solution is found.
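A small divide-and-conquer sketch: finding the maximum of an array by splitting it in half, solving each half, and combining the two partial answers (the name `maxOf` is mine):

```javascript
// Divide and conquer: the maximum of arr[lo..hi] is the larger of
// the maxima of its two halves.
function maxOf(arr, lo, hi) {
  if (lo === hi) return arr[lo] // base case: one element
  var mid = Math.floor((lo + hi) / 2)
  var left = maxOf(arr, lo, mid)       // conquer left half
  var right = maxOf(arr, mid + 1, hi)  // conquer right half
  return left > right ? left : right   // combine
}

console.log(maxOf([4, 9, 2, 7], 0, 3)) // 9
```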

Dynamic programming

The problem is divided into smaller subproblems, and the results of those subproblems are reused to solve the main problem; for example, a recursive function that remembers (memoizes) answers it has already computed.
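The usual first example is Fibonacci with memoization: each subproblem fib(k) is computed once and its result stored for reuse, turning an exponential recursion into a linear one (the names `fib` and `memo` are mine):

```javascript
// Dynamic programming: memoized Fibonacci. Without the memo this
// recursion is O(2^n); with it, each fib(k) is computed once.
function fib(n, memo) {
  memo = memo || {}
  if (n <= 1) return n
  if (memo[n] !== undefined) return memo[n] // reuse stored result
  memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
  return memo[n]
}

console.log(fib(40)) // 102334155
```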

Backtracking

It is similar to brute force: it tries to generate all possible solutions, but every time a partial solution is generated it checks whether it still satisfies the conditions. If it does, it keeps generating sub-solutions; if not, it goes back and tries a different path.
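A small backtracking sketch: find the subsets of an array of positive numbers that sum to a target, abandoning a branch (going back) as soon as the partial sum exceeds the target instead of generating every subset (the names `subsetSums` and `backtrack` are mine):

```javascript
// Backtracking: build subsets of `nums` (positive numbers) that
// sum exactly to `target`, pruning branches that overshoot.
function subsetSums(nums, target) {
  var solutions = []
  function backtrack(start, current, sum) {
    if (sum === target) {
      solutions.push(current.slice()) // record a valid solution
      return
    }
    for (var i = start; i < nums.length; i++) {
      if (sum + nums[i] > target) continue // condition violated: prune
      current.push(nums[i])
      backtrack(i + 1, current, sum + nums[i])
      current.pop() // go back and try a different path
    }
  }
  backtrack(0, [], 0)
  return solutions
}

console.log(subsetSums([2, 3, 5, 7], 10)) // [ [ 2, 3, 5 ], [ 3, 7 ] ]
```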

Branch and Bound

Similar to backtracking, except that when a bound shows the current branch cannot beat the best solution found so far, it stops going into the subnodes, goes back, and tries a different path.
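A sketch of the idea on the 0/1 knapsack problem (maximize value under a weight limit). The bound is the current value plus the value of every item not yet considered; if even that optimistic estimate cannot beat the best solution so far, the branch is cut. The names `knapsack` and `explore` are mine:

```javascript
// Branch and bound: 0/1 knapsack. At each item we branch on
// "take it" / "skip it", and prune any branch whose upper bound
// (current value + all remaining value) cannot beat the best.
function knapsack(items, capacity) {
  var best = 0
  function explore(i, weight, value, remainingValue) {
    if (value > best) best = value
    if (i === items.length) return
    if (value + remainingValue <= best) return // bound: cannot improve
    var item = items[i]
    // Branch 1: take the item, if it fits
    if (weight + item.w <= capacity) {
      explore(i + 1, weight + item.w, value + item.v, remainingValue - item.v)
    }
    // Branch 2: skip the item
    explore(i + 1, weight, value, remainingValue - item.v)
  }
  var total = items.reduce(function (s, it) { return s + it.v }, 0)
  explore(0, 0, 0, total)
  return best
}

var items = [{ w: 2, v: 3 }, { w: 3, v: 4 }, { w: 4, v: 5 }]
console.log(knapsack(items, 5)) // 7
```

The only difference from plain backtracking is the single bound check; on larger inputs it is what keeps the search from exploring all 2^n subsets.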

Knowledge of algorithms will give you the ability to solve whole classes of problems efficiently and will make you a better developer.

If you want me to publish more articles with examples, let me know.

The last one turns off the light.