
Big O is a notation for describing how an algorithm's running time grows with the size of its input; it's most often used to describe the worst case. This post will show some real-world examples of big O notation.

## O(1) – Constant Time

A constant-time algorithm takes the same amount of time to run no matter how large its input is.

**Example:**

Finding a book in a library.

**Why:**

There are many methods of finding a book in a library, but I’m going to stick with the simplest: ask the librarian. Assuming the librarian is familiar with their library, it should take about the same amount of time to find a book about mathematics as it would to find one about poetry.

For another example, see indexing an array.
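Indexing works the same way as asking the librarian: the position tells you exactly where to look, so the lookup takes one step no matter how much data there is. Here's a minimal sketch in Python (the list contents are made up for illustration):

```python
# Indexing a list is O(1): the position maps straight to a location,
# so the lookup takes the same time whether the list holds 4 items
# or 4 million.
shelf = ["mathematics", "poetry", "history", "biology"]

def find_at(position):
    # One step, regardless of the length of the shelf.
    return shelf[position]

print(find_at(2))  # history
```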

## O(n) – Linear Time

A linear-time algorithm does its work once for each of the n items in its input.

**Example:**

Finding a CD in a stack of CDs.

**Why:**

The simplest method of finding a CD in a stack of CDs is to just look at the one on top. If that CD is the one you’re looking for, you’ve found it. If it is not, look at the next one. Repeat until there are no CDs. If the CD we’re looking for is the last one in the stack, we had to look at every CD to find it.
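That top-of-the-stack method is a plain linear scan. Here's a sketch in Python (the CD titles are invented for illustration):

```python
def find_cd(stack, title):
    """Linear search: look at each CD from the top down until we find a match."""
    for position, cd in enumerate(stack):
        if cd == title:
            return position  # found it after looking at position + 1 CDs
    return -1  # looked at every CD and it wasn't there

stack = ["OK Computer", "Kid A", "In Rainbows"]
print(find_cd(stack, "In Rainbows"))  # 2 -- the last CD: we looked at all 3
```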

But what about all the times the CD you’re looking for is in the middle, or right on top? Big O notation describes the worst-case running time of your algorithm: it assumes every search is for the last CD (or for a CD that doesn’t exist). Big O is not a statistical model of how often a given case will occur!

## O(log n) – Logarithmic Time

A logarithmic algorithm runs log₂(n) times.

**Example:**

Let’s play a guessing game. I’m going to pick a number between 0 and 100, and you have to find it in as few guesses as possible. Every time you guess, I’ll tell you whether my number is higher or lower.

**Why:**

You could devise a number of strategies for playing this game, but one strategy is particularly well suited to it: binary search. Each time we guess, we pick the number right in the middle of the remaining range, halving the number of possibilities with every guess.

This game plays out the worst-case performance of an O(log n) algorithm like binary search. Instead of making up to 100 guesses, we make at most 7!
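That worst case is easy to check in code. Here's a sketch of the guessing strategy, brute-forcing the worst case over every possible secret number:

```python
def guess_number(secret, low=0, high=100):
    """Play the guessing game with binary search: always guess the middle."""
    guesses = 0
    while low <= high:
        guess = (low + high) // 2
        guesses += 1
        if guess == secret:
            return guesses
        elif guess < secret:
            low = guess + 1   # "my number is higher"
        else:
            high = guess - 1  # "my number is lower"

# Worst case over every possible secret from 0 to 100:
print(max(guess_number(n) for n in range(101)))  # 7
```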

## O(n²) – Quadratic Time

A quadratic-time algorithm runs n² times, or n times for each of the n items.

**Example:**

Check if there are any duplicates in a deck of cards.

**Why:**

The simplest way to check if there are any duplicate cards in a deck is to pick the first card from the deck and compare it to every other card in the deck. If there are no matches, take the second card from the deck and compare it to every other card. Continue until you have checked all the cards.
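That procedure is a nested loop: an outer pass over every card, and an inner pass comparing it to the rest of the deck. A sketch in Python (the card strings are just for illustration):

```python
def has_duplicates(deck):
    """Compare every card against every other card: n passes of n looks each."""
    for i in range(len(deck)):
        for j in range(len(deck)):
            if i != j and deck[i] == deck[j]:
                return True
    return False

print(has_duplicates(["A spades", "K hearts", "A spades"]))  # True
print(has_duplicates(["A spades", "K hearts", "Q diamonds"]))  # False
```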

By the time we are done we have compared each of the 52 cards against the deck, on the order of 52 × 52 = 2,704 comparisons. (Strictly, a card is never compared with itself, so it’s 52 × 51 = 2,652, but big O ignores constant differences like that.) There are ways to lessen this number, like not looking through cards that you have already compared. However, the most important thing to remember about big O notation is that it always describes the worst-case time, not the average.

Here’s what the process looks like if you don’t re-check cards you’ve already compared.
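As a sketch of that same idea in code (the original diagram isn’t reproduced here): the inner loop starts just past the card the outer loop is holding, so no pair is ever compared twice. For 52 cards that’s 52 × 51 ÷ 2 = 1,326 comparisons instead of 2,704.

```python
def has_duplicates_no_recheck(deck):
    """Skip pairs we've already compared: the inner loop starts past card i."""
    for i in range(len(deck)):
        for j in range(i + 1, len(deck)):
            if deck[i] == deck[j]:
                return True
    return False

print(has_duplicates_no_recheck(["A spades", "K hearts", "A spades"]))  # True
```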

I’ll come back later and add some more big O notations, so check back. For now I’ll leave you with a question to ponder. What is the big O notation of the diagram above?

## Links

**Sorting** (this will be the topic of another WITH article, but for now…)

Visual Representation of Sorting Algorithms in Javascript

Visualizing Sorting Algorithms

What different sorting algorithms sound like