BIG O NOTATION Flashcards
what is an assignment statement
any statement where a value is set or updated
- counting assignment statements is a simple way to gauge efficiency: fewer assignments generally means a more efficient algorithm
what is tractability
tractability refers to how feasible it is to solve a problem efficiently as the input size grows
- a problem is tractable if it can be solved in polynomial time complexity or better
best to worst functions
O(1) – Constant time
O(log n) – Logarithmic time
O(n) – Linear time
O(n log n) – Log-linear time
O(n^k) – Polynomial time
O(2ⁿ) – Exponential time
O(n!) – Factorial time
- last 2 are intractable
rules for big O notation
- Focus on the Largest Growing Term:
- For the function f(n) = 5n² + 3n + 10, the dominant term is 5n², so the Big O representation is O(n²).
- Ignore Constant Factors
- f(n)=3n and f(n)=10n are both O(n)
- Drop Lower-Order Terms
- For f(n)=n^3+5n^2+20n, the cubic term grows fastest, so the complexity is O(n³)
- Drop Non-Dominant Operations
- If an algorithm contains several steps, the step with the highest complexity determines the overall Big O; the cheaper steps are ignored.
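The last rule can be sketched in code. In this hypothetical function, a linear pass is followed by a nested-loop pass; the O(n²) step dominates, so the whole function is O(n²):

```python
def count_steps(data):
    # Step 1: single pass over the data -> O(n)
    total = 0
    for x in data:
        total += x
    # Step 2: nested loops over all pairs -> O(n^2), which dominates
    pairs = 0
    for i in range(len(data)):
        for j in range(i + 1, len(data)):
            pairs += 1
    return total, pairs
```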
what is big O notation
- a way to measure how the runtime or space requirements of an algorithm grow as the input size increases
- gives us an idea of how efficiently an algorithm performs when dealing with larger amounts of data.
why do we use big O
- To understand the scalability of algorithms.
- To compare different algorithms based on their performance.
- To ensure that programs can handle larger datasets without becoming too slow.
O(1) - Constant time complexity
- The algorithm’s runtime does not depend on the size of the input. It always takes the same amount of time.
- eg, Accessing an element from an array by its index.
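A minimal sketch of the array-index example: one lookup, no matter how long the list is:

```python
def get_item(arr, i):
    # O(1): a single index lookup, independent of len(arr)
    return arr[i]
```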
O(log n) - logarithmic time complexity
- The algorithm’s time grows logarithmically. As the input size increases, the time only increases slowly.
eg - Binary search in a sorted array.
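A sketch of binary search: each comparison halves the remaining search range, giving O(log n):

```python
def binary_search(sorted_list, target):
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # check the middle element
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:
            lo = mid + 1            # discard the lower half
        else:
            hi = mid - 1            # discard the upper half
    return -1                       # target not present
```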
O(n) - linear time complexity
- The runtime grows linearly with the size of the input.
- If the input doubles, the time taken also doubles.
eg - iterating from 1 to a certain number
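The iteration example as code: one pass over n values, so doubling n doubles the work:

```python
def sum_to(n):
    # O(n): the loop body runs exactly n times
    total = 0
    for i in range(1, n + 1):
        total += i
    return total
```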
O(n log n) - Linearithmic Time
- This is often seen in efficient sorting algorithms like merge sort (and quicksort in the average case).
- The time grows faster than linear but slower than quadratic.
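A sketch of merge sort showing where the n log n comes from: the list is halved log n times, and each level does O(n) merging work:

```python
def merge_sort(items):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # halving -> log n levels
    right = merge_sort(items[mid:])
    # Merge the two sorted halves: O(n) work per level
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```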
O(n²) - Quadratic Time
- The runtime grows quadratically.
- If the input size doubles, the time taken grows by four times.
- Common in algorithms with nested loops.
eg - A simple implementation of bubble sort.
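A simple bubble sort sketch: the nested loops over the input are what make it O(n²):

```python
def bubble_sort(items):
    arr = list(items)
    n = len(arr)
    for i in range(n):                  # outer loop: n passes
        for j in range(n - 1 - i):      # inner loop: nested -> O(n^2)
            if arr[j] > arr[j + 1]:
                # swap adjacent out-of-order elements
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr
```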
O(2^n) - Exponential Time
- The time grows exponentially
- The runtime roughly doubles with each additional input element
- Algorithms that involve recursive branching often have this complexity.
eg - Solving the Fibonacci sequence recursively.
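The naive recursive Fibonacci: each call branches into two more calls, giving roughly O(2^n) total calls:

```python
def fib(n):
    # Two recursive calls per invocation -> exponential blow-up
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)
```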
O(n!) - Factorial Time
- The runtime grows factorially.
- Extremely slow for even moderate input sizes
- Common in algorithms that involve generating permutations
eg - Solving the traveling salesman problem by brute force.
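A brute-force TSP sketch (the distance matrix `dist` is a made-up input): it tries every ordering of the cities after the start city, so the number of routes grows as (n-1)!:

```python
from itertools import permutations

def tsp_brute_force(dist):
    # dist[a][b] = distance from city a to city b
    n = len(dist)
    best = float('inf')
    # (n-1)! candidate tours -> factorial time
    for route in permutations(range(1, n)):
        tour = (0,) + route + (0,)      # start and end at city 0
        cost = sum(dist[tour[k]][tour[k + 1]] for k in range(n))
        best = min(best, cost)
    return best
```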
time complexity for a hash table
- Inserting, deleting, and searching: O(1) on average
- (O(n) in the worst case due to collisions)
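Python's dict is a hash table, so the average-case costs above can be shown directly:

```python
phone_book = {}
phone_book["alice"] = "555-0100"   # insert: O(1) average
found = "alice" in phone_book      # search: O(1) average
del phone_book["alice"]            # delete: O(1) average
```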
time complexity for an array
- Accessing an element by index: O(1)
- Searching for an element: O(n)
- Inserting an element at the end: O(1) (if there’s space)
- Inserting at the beginning: O(n)
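Using a Python list (a dynamic array) to illustrate each cost:

```python
arr = [10, 20, 30]
x = arr[1]           # access by index: O(1)
arr.append(40)       # insert at the end: amortized O(1)
arr.insert(0, 5)     # insert at the beginning: O(n), shifts every element
found = 30 in arr    # search for an element: O(n) linear scan
```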