Understand Big O notation in 7 minutes

Big O notation is something developers often ignore completely. Yet it is a fundamental concept, genuinely useful and simple to understand. Contrary to popular belief, you don’t need to be a math nerd to master it. I bet that in 7 minutes, you’ll understand everything.

What is the Big O notation?

Big O notation (also called algorithmic complexity) is a standard way to measure the performance of an algorithm. **It is a mathematical way of judging the effectiveness of your code.** I said the word “mathematics” and scared everyone away. Again, you don’t need a passion for math to understand and use this notation.

This notation lets you measure how the running time of your algorithm grows as the input data grows. **It describes the worst possible case for the performance of your code.** Today, we are not going to talk about space complexity, only about time complexity.
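
To make this concrete, here is a minimal sketch of two functions whose running time grows at very different rates with the size of the input. The code is in Python purely for illustration (the article itself shows no code), and the function names are my own:

```python
def contains(items, target):
    # O(n): in the worst case we look at every element exactly once.
    for item in items:
        if item == target:
            return True
    return False


def has_duplicate(items):
    # O(n^2): in the worst case every element is compared with every other one.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

Double the size of `items` and `contains` does roughly twice the work, while `has_duplicate` does roughly four times the work. That growth rate is exactly what Big O describes.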

And it’s not about putting a timer before and after a function to see how long it takes.


The problem is that the timer technique is anything but reliable or accurate. With a simple timer (like the sketch after this list), the measured performance of your algorithm will vary greatly depending on many factors:

  • Your machine and its processor
  • The language you use
  • The load on your machine when you run your test
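
For reference, the naive timer technique looks something like the sketch below (Python, hypothetical example). The number it prints will change from one machine and one run to the next, which is exactly the problem:

```python
import time


def has_duplicate(items):
    # O(n^2) duplicate check, used here only as something to time.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


# The "timer" technique: measure wall-clock time around the call.
data = list(range(5000))
start = time.perf_counter()
has_duplicate(data)
elapsed = time.perf_counter() - start

# This figure depends on the machine, the language runtime and the
# current load, so it says little about the algorithm itself.
print(f"took {elapsed:.4f} seconds")
```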

Big O notation solves all these problems and gives you a reliable measure of the efficiency of any code you produce. The “O” is short for “order of magnitude”.
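
“Order of magnitude” means we only keep the term that dominates growth and drop the constants. A quick illustration of my own (not from the article): a function that does roughly 3n + 2 elementary operations is still simply O(n).

```python
def sum_and_extremes(numbers):
    # Roughly 3n + 2 elementary operations: three passes over the data
    # plus a couple of constant-time steps.
    total = sum(numbers)   # ~n operations
    lo = min(numbers)      # ~n operations
    hi = max(numbers)      # ~n operations
    return total, lo, hi   # constant-time step

# 3n + 2 is still O(n): the constants disappear, because only the
# order of magnitude of the growth matters.
```

That is the whole idea: the exact number of operations doesn’t matter, only how it scales with the input.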
