Big-O Flashcards
What is “Big-O”?
The language and metric we use to describe the efficiency of algorithms.
Also called “Asymptotic Runtime”
Refers to how much time or space an algorithm will use, generally based on an input size “n”
It is a rough upper bound for worst cases and large inputs
Common Big-O Runtimes
- O(1) - Constant
- O( log N ) - Logarithmic
- O( N ) - Linear
- O( N log N ) - Log Linear
- O( N^2 ) - Quadratic
- O( N^3 ) - Cubic
- O( 2^N ) - Exponential
- O( N! ) - Factorial
Which has the higher complexity?
O(N)
or
O( 1 )
O( N )
Which has the higher complexity?
O(N)
or
O( log N )
O( N )
Which has the higher complexity?
O(N)
or
O( N log N )
O( N log N )
Which has the higher complexity?
O(N log N)
or
O( 1 )
O( N log N )
Which has the higher complexity?
O(N^2)
or
O( 1 )
O( N^2 )
Which has the higher complexity?
O( N^3 )
or
O( 1 )
O( N^3 )
Which has the higher complexity?
O( 1 )
or
O( 2^N )
O( 2^N )
Which has the higher complexity?
O( N! )
or
O( 1 )
O( N! )
Which has the higher complexity?
O( log N )
or
O( N log N )
O( N log N )
Which has the higher complexity?
O( N^2 )
or
O( log N )
O( N^2 )
Which has the higher complexity?
O( log N )
or
O( N^3 )
O( N^3 )
Which has the higher complexity?
O( 2^N )
or
O( log N )
O( 2^N )
Which has the higher complexity?
O( log N )
or
O( N! )
O( N! )
Which has the higher complexity?
O( N )
or
O( N^2 )
O( N^2 )
Which has the higher complexity?
O( N^3 )
or
O( N )
O( N^3 )
Which has the higher complexity?
O( N )
or
O( 2^N )
O( 2^N )
Which has the higher complexity?
O( N! )
or
O( N )
O( N! )
Which has the higher complexity?
O( N log N )
or
O( N^2 )
O( N^2 )
Which has the higher complexity?
O( N^3 )
or
O( N log N )
O( N^3 )
Which has the higher complexity?
O( N log N )
or
O( 2^N )
O( 2^N )
Which has the higher complexity?
O( N log N )
or
O( N! )
O( N! )
Which has the higher complexity?
O( N^3 )
or
O( N^2 )
O( N^3 )
Which has the higher complexity?
O( N^2 )
or
O( 2^N )
O( 2^N )
Which has the higher complexity?
O( N! )
or
O( N^2 )
O( N! )
Which has the higher complexity?
O( 2^N )
or
O( N^3 )
O( 2^N )
Which has the higher complexity?
O( N^3 )
or
O( N! )
O( N! )
Which has the higher complexity?
O( 2^N )
or
O( N! )
O( N! )
Three Major
Complexity Notations
Big O
- O( f(n) )
- Asymptotic upper bound
- The algorithm runs AT LEAST that fast: its runtime grows no faster than f(n)
Big Omega
- Ω( f(n) )
- Asymptotic lower bound
- The algorithm runs AT MOST that fast: its runtime grows at least as fast as f(n)
Big Theta
- 𝚯( f(n) )
- Combines O and Omega
- Indicates that algorithm is O( f(n) ) AND Ω( f(n) )
- Casual use of Big O often really means Big Theta
Three general
Input Cases
- Best Case
- Input is organized in a way that the algorithm runs at the fastest possible speed, with the lowest possible complexity
- Example: Already sorted array into sorting algorithm
- Worst Case
- Input organized so that algorithm runs at the slowest speed possible, with the highest complexity
- Example: Reverse sorted array into sorting algorithm
- Expected/General Case
- What the vast majority of real world input will look like
Two Types
of
Complexity
Time Complexity
Space Complexity
Space Complexity
- The space (typically memory) used by an algorithm
- Also described with Big O notation
Example:
- Array size n -> O(n)
- 2D Array size n x n -> O(n^2)
- Call Stack: n recursive calls → O(n)
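A minimal Java sketch of each case (buildArray, buildGrid, and countDown are illustrative names, not from the deck):

int[] buildArray(int n) {        // allocates n ints -> O(n) space
    return new int[n];
}
int[][] buildGrid(int n) {       // allocates n x n ints -> O(n^2) space
    return new int[n][n];
}
int countDown(int n) {           // n recursive frames on the stack -> O(n) space
    if (n <= 0) return 0;
    return 1 + countDown(n - 1);
}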
Space Complexity:
Recursion
vs
Iterative(loops)
Space Complexity only increases when memory is being used simultaneously.
- Recursive calls increase space complexity, because each call adds a frame to the stack
- Loops (iterative) do not, because memory scoped to the loop body is released each iteration
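A sketch of the difference, assuming two hypothetical Java sum methods over an int array:

// Recursive: n calls are live on the stack at once -> O(n) extra space
int sumRecursive(int[] a, int i) {
    if (i == a.length) return 0;
    return a[i] + sumRecursive(a, i + 1);
}

// Iterative: one accumulator and one loop variable, reused each pass -> O(1) extra space
int sumIterative(int[] a) {
    int total = 0;
    for (int x : a) total += x;
    return total;
}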
General Formulation
of
Big O
(steps)
- Describe the Worst Case Input
- Step through the algorithm, add or multiply runtime of sections
- Simplify by dropping constants:
- O(2n) = O(n)
- Further simplify by only keeping the most dominant (fastest growing) term:
- O(n^2 + n) = O(n^2)
- Consider each distinct type of operation separately, if the algorithm mixes them
- Simplify amortized cost where it applies
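A worked Java sketch of these steps (the method work is hypothetical):

int work(int[] a) {
    int count = 0;
    for (int x : a)              // single pass: O(n)
        count += x;
    for (int x : a)              // outer loop: O(n)
        for (int y : a)          //   inner loop: O(n) -> nested section: O(n^2)
            if (x == y) count++;
    return count;
}
// Add the sections: O(n + n^2); drop constants and the non-dominant term: O(n^2)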
Types of Code Sections
and their Complexity
Code Section Type → Complexity
- Sequential Instruction → O( 1 )
- Single Pass through Array/Set → O( n )
- Distinct Sequential Loops (e.g. two separate arrays) → O( A + B )
- Nested Loops → O( A * B )
- Elements Halved (e.g. Binary Search) → O( log n )
- Recursive Function with Multiple Calls → O( branches^depth )
Complexity of Code Section Types:
Sequential Instruction
O(1)
Constant
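For example (hypothetical snippet), a fixed number of statements takes the same time regardless of n:

int firstElement(int[] a) {
    return a[0];                 // one access, one return -> O(1)
}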
Complexity of Code Section Types:
Single Pass
through Array/Set
O( n )
Linear
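For example, a hypothetical single pass that finds the maximum:

int max(int[] a) {
    int best = a[0];
    for (int x : a)              // touches each of the n elements once -> O(n)
        if (x > best) best = x;
    return best;
}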
Complexity of Code Section Types:
Distinct Sequential Loops
O( A + B)
Where A and B are the number of iterations in two distinct loops
Complexity of Code Section Types:
Nested Loops
O( A * B)
Where A and B are the numbers of iterations of the outer and inner loops (B is nested inside A)
Complexity of Code Section Types:
Elements halved
(eg Binary Search)
O( log N)
Logarithmic
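A minimal sketch of the halving pattern (hypothetical helper):

int halvingSteps(int n) {
    int steps = 0;
    while (n > 1) {
        n /= 2;                  // the remaining problem size is cut in half
        steps++;
    }
    return steps;                // about log2(n) iterations -> O(log n)
}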
Complexity of Code Section Types:
Recursive Function
w/ multiple calls
O( branches^depth )
Ex: O( 2^N )
Multipart Algorithms:
What is the complexity of this code:
for (int a : A) { /* O(1) work */ }
for (int b : B) { /* O(1) work */ }
In this case, add the complexities:
O( A + B)
Multipart Algorithms:
What is the complexity of this code:
for (int a : A) {
    for (int b : B) { /* O(1) work */ }
}
Nested Loop: Multiply
O( A * B)
Special Case:
Amortized Time
It allows us to describe that a worst case happens every once in a while. But once it happens, it won’t happen again for so long that the cost is “amortized”.
A costly operation that happens infrequently can be averaged across the many cheap operations around it, especially if it becomes less frequent over time.
eg
A dynamic ArrayList doubles its capacity when the capacity is reached. Normal insertions are O(1), but an insertion that triggers growth is O(N).
The Amortized Time is O(1)
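A minimal sketch of the doubling strategy (illustrative class, not java.util.ArrayList):

class DynamicArray {
    private int[] data = new int[1];
    private int size = 0;

    void add(int value) {
        if (size == data.length) {                    // rare growth step: O(n) copy
            int[] bigger = new int[data.length * 2];
            System.arraycopy(data, 0, bigger, 0, size);
            data = bigger;
        }
        data[size++] = value;                         // usual append: O(1)
    }
}
// Doubling at sizes 1, 2, 4, ..., N costs about 2N copies in total,
// so N insertions take O(N) overall -> O(1) amortized per insertion.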
O( log N )
Runtimes:
Explanation
Log N comes from repeatedly halving an array or set.
One example is Binary Search.
The total runtime is then a matter of how many steps (dividing N by 2 each time) we can take until N becomes 1.
N → N/2 → N/4 → … → N/2^k = 1
At one element: 2^k = N,
and k is our number of steps
So, k = log₂(N)
In Big O, the base of the log doesn’t matter, since logs of different bases differ only by a constant factor, so it’s just O(log N)
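A standard binary search sketch in Java (assumes a sorted int array):

int binarySearch(int[] a, int target) {
    int lo = 0, hi = a.length - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;             // middle of the remaining range
        if (a[mid] == target) return mid;
        else if (a[mid] < target) lo = mid + 1;   // discard the left half
        else hi = mid - 1;                        // discard the right half
    }
    return -1;                                    // not found
}
// The range is halved each pass, so there are at most about log2(N) passes.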
Recursive Runtimes:
Explanation
The runtime is the sum of each level of recursion.
For a function:
f(n) { if (n <= 0) return 1; return f(n-1) + f(n-1); }
Each level branches twice:
2^n + 2^(n-1) + … + 2^1 + 2^0 = 2^(n+1) - 1
Applying reductions: O(2^(n+1) - 1) = O(2^n)
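To check that count empirically, a small Java sketch that counts calls (the counter and the base case are additions for illustration):

class CallCounter {
    static long calls = 0;

    static int f(int n) {
        calls++;                       // one count per stack frame
        if (n <= 0) return 1;          // base case, assumed so the recursion terminates
        return f(n - 1) + f(n - 1);    // two branches per level
    }

    public static void main(String[] args) {
        f(10);
        System.out.println(calls);     // prints 2047 = 2^(10+1) - 1
    }
}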
Recursive Runtimes:
Call Tree
A technique for modeling recursive functions;
each node in the tree represents a call (stack frame).
Used with a table to count the number of calls at each level.
Runtime is the sum of each level of recursion
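For example, a sketch of the call tree for a hypothetical fib(4), with call counts per level:

// Level 0: fib(4)                            -> 1 call
// Level 1: fib(3), fib(2)                    -> 2 calls
// Level 2: fib(2), fib(1), fib(1), fib(0)    -> 4 calls
// Level 3: fib(1), fib(0)                    -> 2 calls
// Total: 9 calls; summing the per-level counts gives the runtime.
int fib(int n) {
    if (n <= 1) return n;
    return fib(n - 1) + fib(n - 2);
}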
