Time Complexity Made Easy

Steve K
3 min read · Dec 21, 2020
Photo by Roman Mager on Unsplash

One thing that will become increasingly important the longer you develop is time complexity. It can be complicated at first glance, but like everything else, it gradually becomes easier the more you use it. I spent a great deal of time reading articles and watching videos to fully comprehend the purpose of this analysis. My intention is to simplify this concept as much as possible for anyone like me who needed some time for it to click.

So what exactly is time complexity? Simply put, it is an analysis we use in programming to estimate how long it takes to run an algorithm given a number of inputs (n), and how that running time grows as the input grows. The common notation for expressing it, as shown above, is Big-O notation. Big O is really a collection of common time complexities and their definitions.

O(1)

Working in realtime!

Generally viewed as the most efficient complexity of all. Here, it takes a fixed number of steps to complete a task no matter how large the input is. This is referred to as Constant Time: the idea is that it will always have the same execution time. An example would be accessing a value at an array index, or calculating how many minutes remain until the next hour.
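To make the array-index example concrete, here is a minimal Python sketch. The function name is my own; the point is that the lookup takes the same single step whether the list has three items or three million:

```python
def get_first(items):
    # Constant time: one index access, regardless of how long the list is
    return items[0]

print(get_first([7, 3, 9]))  # 7
```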

O(n)

Netflix!

This complexity is referred to as Linear Time. The idea here is that the time to execute is proportional to the input size (n). A good example would be scrolling through every show on Netflix looking for one to watch, where the library size is constantly changing: the bigger the library, the longer the search.
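The Netflix search above can be sketched as a simple linear scan in Python. The function and sample titles are illustrative, not a real API; in the worst case, every item is checked once, so the work grows in step with the list:

```python
def linear_search(shows, title):
    # Linear time: worst case, we look at every show exactly once
    for show in shows:
        if show == title:
            return True
    return False

library = ["Ozark", "Dark", "Mindhunter"]
print(linear_search(library, "Dark"))  # True
```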

O(log n)

Tootsie Pop!

Logarithmic Time can be a bit confusing at first, but at the base level, this algorithm's running time is proportional to the logarithm of the input size (n). A logarithm is the power to which a fixed number (the base) must be raised to produce the desired number. It is similar to that age-old commercial: "how many licks does it take to get to the center of a Tootsie Pop?" A programmer's example would be binary search, which is often used to search large sorted data sets.
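Here is a minimal sketch of the binary search mentioned above, assuming the input list is already sorted. Each comparison discards half of the remaining range, which is exactly why the step count grows with log n rather than n:

```python
def binary_search(sorted_items, target):
    # Logarithmic time: each pass halves the search range
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid          # found: return the index
        elif sorted_items[mid] < target:
            lo = mid + 1        # target is in the upper half
        else:
            hi = mid - 1        # target is in the lower half
    return -1                   # not present

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```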

O(n²)

How many duplicates?

Quadratic Time, as this is often referred to, means the running time is proportional to the square of the input size (n). This complexity is fairly common when dealing with nested iterations, such as nested for loops, and it can go even deeper, sometimes reaching O(n³), O(n⁴), etc. One example I always use here is counting the duplicates in a deck of cards by comparing every card against every other card. As for developer examples, popular implementations are bubble sort, selection sort, and insertion sort.
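The card-counting example above can be sketched with the nested loops the paragraph describes. The function name and tiny sample hand are my own; comparing every pair means roughly n² comparisons, which is what makes this quadratic:

```python
def count_duplicate_pairs(cards):
    # Quadratic time: the nested loops compare every pair of cards once
    count = 0
    for i in range(len(cards)):
        for j in range(i + 1, len(cards)):
            if cards[i] == cards[j]:
                count += 1
    return count

print(count_duplicate_pairs(["A", "K", "A", "Q", "A"]))  # 3 matching pairs of aces
```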

Conclusion

Finishing up, this was meant to make these complexity classes as simple as possible and break down the fundamentals of each analysis. This really only scratches the surface of what Big-O can describe, but it gives a great foundation for understanding the concept.
