Computational Time Complexity Flashcards

1
Q

What is computational time complexity?

A

Time complexity is a measure of the amount of time an algorithm takes to run as a function of the input size.

It helps us analyze and compare the efficiency of different algorithms.

Big O notation is commonly used to represent time complexity.

Understanding time complexity is crucial for designing scalable and high-performance software.

2
Q

When considering the “worst” case scenario…

Why is it the focus of the analysis, and what does it tell us?

A

It is the most common (and useful) analysis in most cases.

It provides a guarantee that the algorithm will not perform worse than a certain level, regardless of the input.

3
Q

What is the “Big O” in time complexity?

A

It is used to describe how fast an algorithm runs in the worst case.

Big O notation is used because the growth rate of a function is also referred to as the order of the function.

This is crucial for critical applications where performance guarantees are essential.

4
Q

What are 4 aspects used to determine an algorithm's computational time complexity?

A
  1. The input size is a key factor in analyzing time complexity.
  2. Determine which parameter(s) affect the running time as the input size increases.
  3. Iterative Algorithms:
    - count the number of iterations
    - analyze how the number of iterations changes with the input size
    - the loop structure determines the time complexity
  4. Nested Loops (see the sketch after this list):
    - count the number of iterations of each loop
    - determine how the loop variables change and how they relate to the input size
    - multiply the iterations to obtain the overall time complexity
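
A minimal Python sketch of this counting approach (the helper name count_inner_iterations is my own, and the doubly nested loop is just an illustrative pattern):

def count_inner_iterations(items):
    """Count how often the body of a doubly nested loop runs."""
    n = len(items)
    count = 0
    for i in range(n):          # outer loop: n iterations
        for j in range(n):      # inner loop: n iterations per outer pass
            count += 1          # constant-time work in the body
    return count                # n * n runs, so the pattern is O(n^2)

For a list of 10 items the body runs 100 times, matching the n × n growth the multiplication rule predicts.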
5
Q

Why is there a focus on the dominant term in the time complexity of algorithms?

A

Only the highest-order (or dominant) term is considered because it provides the most significant information about the algorithm's scalability and efficiency as the input size increases.

As the input size increases, the impact of lower-order terms and constant factors becomes negligible compared to the dominant term.

The dominant term has the most significant influence on the algorithm's time complexity and is the primary factor that determines how the algorithm scales with larger inputs.
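
A small numeric illustration, using a hypothetical cost function T(n) = n² + 10n + 100, of why only the dominant term matters:

def t(n):
    # Hypothetical cost function: n^2 is the dominant term.
    return n**2 + 10 * n + 100

for n in (10, 100, 1000):
    dominant = n**2
    print(n, t(n), dominant / t(n))   # ratio climbs toward 1 as n grows

At n = 10 the dominant term is only about a third of the total cost, but at n = 1000 it already accounts for about 99% of it, so T(n) behaves like O(n²).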

6
Q

What does the time complexity of linear search look like?

A

Process: Starts at the beginning, compares each element with the target until found or end of list

Best Case: Target is first element → 1 comparison

Average Case: Target is anywhere → n/2 comparisons on average

Worst Case: Target not found or last element → n comparisons

LINEAR SEARCH TIME COMPLEXITY = O(n)
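
A minimal Python sketch of the process above (the function name is my own):

def linear_search(items, target):
    """Return the index of target, or -1 if it is not found."""
    for i, value in enumerate(items):    # up to n comparisons
        if value == target:
            return i                     # best case: first element, 1 comparison
    return -1                            # worst case: n comparisons, target absent

Example: linear_search([4, 2, 7, 9], 7) returns 2.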

7
Q

What does the time complexity of binary search look like?

A

Process: Repeatedly divide the sorted list in half, compare middle element with target

Each Step: Reduces search space by half

After k comparisons → n / 2ᵏ items left

Worst Case: Stops when only 1 item remains → n / 2ᵏ = 1

Solve for k: k = log₂(n)
BINARY SEARCH TIME COMPLEXITY = O(log n)

NOTE: List must be sorted
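
A minimal Python sketch of the halving process (iterative version, function name my own); it assumes the input list is already sorted:

def binary_search(sorted_items, target):
    """Return the index of target in a sorted list, or -1 if absent."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:                   # each pass halves the search space
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1                # discard the left half
        else:
            high = mid - 1               # discard the right half
    return -1                            # about log2(n) passes in the worst case

Example: binary_search([1, 3, 5, 7, 9], 7) returns 3.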

8
Q

What does the time complexity of bubble sort look like?

A

Process: Repeatedly compare and swap adjacent elements to “bubble” largest values to the end

Loops:
Outer loop: Runs n - 1 times (passes)
Inner loop: Compares adjacent elements, swaps if needed

Comparisons:
Total = n(n - 1)/2

Worst Case: Also up to n(n - 1)/2 swaps

BUBBLE SORT TIME COMPLEXITY: O(n²)

Best Case (sorted list): O(n) (with optimization)
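
A minimal in-place Python sketch of the passes described above (function name my own):

def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order pairs until the list is sorted."""
    n = len(items)
    for i in range(n - 1):               # outer loop: n - 1 passes
        for j in range(n - 1 - i):       # inner loop shrinks by one each pass
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items                         # total comparisons: n(n - 1)/2, so O(n²)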

9
Q

What does the time complexity of short bubble sort look like?

A

Optimization: Stops early if no swaps in a pass → list is sorted

Best Case (sorted list): O(n)

Average Case: Improved over basic Bubble Sort (varies)

Worst Case (fully unsorted): Still O(n²)

Key Benefit: Faster on nearly sorted data
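
A minimal Python sketch of the early-exit optimization (function name my own):

def short_bubble_sort(items):
    """Bubble sort that stops as soon as a full pass makes no swaps."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:                  # no swaps means the list is already sorted
            break                        # best case (sorted input): one pass, O(n)
    return items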

10
Q

What does the time complexity of selection sort look like?

A

Process: In each pass, find the minimum element and swap it to its correct position

Comparisons: n(n - 1)/2 → same as Bubble Sort

Swaps: Exactly n - 1 (one per pass)

Time Complexity:
Best, Average, Worst: O(n²) (no change based on input order)

Key Point: Fewer swaps than Bubble Sort, but same number of comparisons
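
A minimal Python sketch of the pass structure described above (function name my own):

def selection_sort(items):
    """Each pass finds the minimum of the unsorted tail and swaps it into place."""
    n = len(items)
    for i in range(n - 1):               # n - 1 passes, exactly one swap each
        min_index = i
        for j in range(i + 1, n):        # scan the unsorted portion
            if items[j] < items[min_index]:
                min_index = j
        items[i], items[min_index] = items[min_index], items[i]
    return items                         # comparisons: n(n - 1)/2, so O(n²)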
