BIG O NOTATION Flashcards

1
Q

what is an assignment statement

A

any statement where a variable's value is set or updated
- fewer assignment statements generally mean higher efficiency

2
Q

what is tractability

A

tractability refers to how feasible it is to solve a problem efficiently as the input size grows

  • an algorithm is tractable if it has polynomial time complexity or better
3
Q

best to worst functions

A

O(1) – Constant time
O(log n) – Logarithmic time
O(n) – Linear time
O(n log n) – Log-linear time
O(nᵏ) – Polynomial time
O(2ⁿ) – Exponential time
O(n!) – Factorial time

  • the last two (exponential and factorial) are intractable
4
Q

rules for big O notation

A
  1. Focus on the Largest Growing Term:
    • For the function f(n) = 5n² + 3n + 10, the dominant term is n², so the Big O representation is O(n²).
  2. Ignore Constant Factors
    • f(n) = 3n and f(n) = 10n are both O(n)
  3. Drop Lower-Order Terms
    • For f(n) = n³ + 5n² + 20n, the cubic term grows fastest, so the complexity is O(n³)
  4. Ignore Non-Dominant Operations
    • If an algorithm contains several steps, the step with the highest complexity determines the overall Big O.
5
Q

what is big O notation

A
  • a way to measure how the runtime or space requirements of an algorithm grow as the input size increases
  • gives us an idea of how efficiently an algorithm performs when dealing with larger amounts of data.
6
Q

why do we use big O

A
  • To understand the scalability of algorithms.
  • To compare different algorithms based on their performance.
  • To ensure that programs can handle larger datasets without becoming too slow.
7
Q

O(1) - Constant time complexity

A
  • The algorithm’s runtime does not depend on the size of the input. It always takes the same amount of time.
  • eg, Accessing an element from an array by its index.
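A minimal Python sketch of constant-time access (the list contents and function name here are illustrative):

```python
def get_third(items):
    # One direct lookup by index: O(1), regardless of the list's length
    return items[2]

data = [10, 20, 30, 40, 50]
print(get_third(data))  # 30
```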
8
Q

O(log n) - logarithmic time complexity

A
  • The algorithm’s time grows logarithmically. As the input size increases, the time only increases slowly.
    eg - Binary search in a sorted array.
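A simple sketch of binary search in Python, showing why the work grows logarithmically:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # each pass halves the search space
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```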
9
Q

O(n) - linear time complexity

A
  • The runtime grows linearly with the size of the input.
  • If the input doubles, the time taken also doubles.
    eg - iterating from 1 to a certain number
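A quick sketch of a linear scan (function name is illustrative):

```python
def total(items):
    # Visits every element exactly once, so the work grows
    # in direct proportion to the input size: O(n)
    running_sum = 0
    for value in items:
        running_sum += value
    return running_sum
```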
10
Q

O(n log n) - Linearithmic Time

A
  • This is often seen in efficient sorting algorithms like merge sort or quicksort.
  • The time grows faster than linear but slower than quadratic.
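A compact merge sort sketch in Python: the list is halved about log n times, and each level of merging does linear work, giving n log n in total.

```python
def merge_sort(items):
    # Divide: split the list until pieces are trivially sorted
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Conquer: merging two sorted halves is linear in their total length
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```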
11
Q

O(n²) - Quadratic Time

A
  • The runtime grows quadratically.
  • If the input size doubles, the time taken grows by four times.
  • Common in algorithms with nested loops.
    eg - a simple implementation of bubble sort.
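A simple bubble sort in Python, where the two nested loops over the input are what make it quadratic:

```python
def bubble_sort(items):
    # Nested loops: up to n passes, each comparing up to n pairs -> O(n^2)
    items = list(items)  # sort a copy, leave the input unchanged
    n = len(items)
    for i in range(n):
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                # Swap adjacent out-of-order elements
                items[j], items[j + 1] = items[j + 1], items[j]
    return items
```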
12
Q

O(2^n) - Exponential Time

A
  • The time grows exponentially
  • The runtime roughly doubles with each additional input element
  • Algorithms that involve recursive branching often have this complexity.
    eg - Solving the Fibonacci sequence recursively.
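The naive recursive Fibonacci is the classic example: each call branches into two more calls, so the call tree roughly doubles with every extra unit of n.

```python
def fib(n):
    # Two recursive calls per invocation -> exponential O(2^n) call tree
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55
```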
13
Q

O(n!) - Factorial Time

A
  • The runtime grows factorially.
  • Extremely slow for even moderate input sizes
  • Common in algorithms that involve generating permutations
    eg - Solving the traveling salesman problem by brute force.
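A brute-force TSP sketch in Python (the function name and distance-matrix format are illustrative): it tries every permutation of the cities, which is what makes it factorial time.

```python
from itertools import permutations

def shortest_tour(distances):
    """Brute-force TSP over a distance matrix.

    distances[i][j] is the distance from city i to city j; with n cities
    there are (n-1)! orderings to check, so this is factorial time.
    """
    n = len(distances)
    best = float("inf")
    for order in permutations(range(1, n)):
        # A full tour starts and ends at city 0
        route = (0,) + order + (0,)
        length = sum(distances[a][b] for a, b in zip(route, route[1:]))
        best = min(best, length)
    return best
```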
14
Q

time complexity for a hash table

A

Inserting, Deleting, and Searching: O(1)
- (O(n) in the worst case due to collisions)
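Python's dict is a hash table, so it illustrates these costs directly (keys and values here are illustrative):

```python
ages = {}
ages["alice"] = 30        # insert: O(1) on average
ages["bob"] = 25
print("alice" in ages)    # search: O(1) on average -> True
del ages["bob"]           # delete: O(1) on average
```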

15
Q

time complexity for an array

A
  • Accessing an element by index: O(1)
  • Searching for an element: O(n)
  • Inserting an element at the end: O(1) (if there’s space)
  • Inserting at the beginning: O(n)
16
Q

time complexity for queues and stacks

A
  • Push/Enqueue (insert) and Pop/Dequeue (remove): O(1)
  • Accessing the top (stack) or front (queue) element: O(1)
17
Q

linked list time complexity

A
  • Insertion and Deletion: O(1) (if we have a reference to the node)
  • Searching for an element: O(n)
18
Q

binary tree time complexity

A

Searching, Insertion, and Deletion (in a binary search tree): O(log n) on average, but O(n) in the worst case (if the tree becomes unbalanced)

19
Q

time complexity for sorting algorithms

A

Bubble Sort: O(n²)
Merge Sort: O(n log n)
Quick Sort: O(n log n) on average (but O(n²) in the worst case)

20
Q

why is big O important

A
  • Choosing Efficient Algorithms: Knowing the time complexity of an algorithm helps us choose the right one for large inputs.
    • eg : Merge Sort (O(n log n)) is more efficient for large datasets than Bubble Sort (O(n²)).
  • Scalability: Algorithms with lower time complexities can handle larger data efficiently.
    • If you need to handle huge datasets, linear or logarithmic algorithms are preferable over quadratic or exponential ones.


21
Q

what is algorithmic complexity

A

How much time or resources an algorithm needs as the size of the input increases.

22
Q

what are hardware constraints

A

when the physical hardware of a computer can limit how fast or how much data can be processed.

23
Q

what is a tractable algorithm?

what is an intractable problem

A

A problem is called tractable if it can be solved in polynomial time or better

A problem is called intractable if it cannot be solved in polynomial time.

24
Q

example of an intractable problem

A

The Traveling Salesman Problem (TSP) – Given a set of cities and the distances between them, the problem is to find the shortest route that visits each city and returns to the starting point. The time complexity is O(n!), making it an intractable problem.

25
Q

how do people solve intractable problems

A

heuristic approaches

26
Q

what are undecidable problems

A

problems that simply cannot be solved by any algorithm, regardless of how fast or powerful our hardware is

27
Q

example of an undecidable problem

A

The Halting Problem – This problem asks whether a given program will eventually stop running (halt) or continue forever for a given input. Turing proved that no general algorithm can decide this for every possible program and input.

28
Q

why is bubble sort o(n^2)

A

because it compares pairs of elements in nested loops, performing up to n comparisons for each of the n elements,
i.e. roughly n² comparisons in total.

29
Q

what is a heuristic approach

A

knowledge or rules of thumb used to find an approximate, though not necessarily optimal, solution to a problem in a reasonable amount of time

30
Q

why is merge sort n log n time complexity

A

because the list is divided in half log n times, and merging takes linear time, resulting in n log n operations.

31
Q

why is binary search log n

A

because each comparison halves the remaining search space, so at most about log₂ n comparisons are needed.

32
Q

why is iterating to a number n time complexity

A

because you have to check each element once in a linear scan.

33
Q

why is accessing an element 1 time complexity

A

because it’s a direct memory access based on index.