LN6 Problems Flashcards

Median finding, Sorting

1
Q

What is the general form of the recurrence relation for analyzing divide-and-conquer algorithms?

A

T(n) = aT(n/b) + cn^k, where “a” is the number of subproblems, “b” is the factor by which the problem size is reduced, and “cn^k” represents the time complexity of the “conquer” phase.

2
Q

What are the three cases in the Master Theorem for analyzing recurrences?

A
  1. a > b^k, leading to T(n) = O(n^log_b(a))
  2. a = b^k, leading to T(n) = O(n^k * log(n))
  3. a < b^k, leading to T(n) = O(n^k).
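
As a quick sanity check, the three cases can be wrapped in a small Python function (an illustrative sketch only, not part of the notes; the name master_theorem and its string output are made up, and k is assumed to be a non-negative number):

    import math

    def master_theorem(a, b, k):
        # Classify T(n) = a*T(n/b) + c*n^k by comparing a with b^k.
        bk = b ** k
        if a > bk:
            return "O(n^%.3g)" % math.log(a, b)   # case 1: subproblems dominate
        elif a == bk:
            return "O(n^%g log n)" % k            # case 2: balanced levels
        else:
            return "O(n^%g)" % k                  # case 3: top-level work dominates

    print(master_theorem(2, 2, 1))   # Mergesort:     "O(n^1 log n)"
    print(master_theorem(1, 2, 0))   # Binary search: "O(n^0 log n)" = O(log n)
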
3
Q

How does the Master Theorem determine the time complexity based on “a” and “b^k”?

A

By comparing “a” (how quickly the number of subproblems multiplies) to “b^k” (how quickly the per-subproblem work shrinks):
If “a” is greater, the subproblems dominate and T(n) = O(n^log_b(a)).
If “a” equals “b^k”, every level of the recursion does roughly equal work, introducing a logarithmic factor: T(n) = O(n^k log n).
If “a” is less than “b^k”, the polynomial term cn^k at the top level dominates and T(n) = O(n^k).

4
Q

In sorting, what is the time complexity of Bubble Sort, and why is it inefficient for large datasets?

A

Bubble Sort has a time complexity of O(n^2) because it makes repeated passes over the array, swapping only adjacent out-of-order elements; an element can move just one position per pass, so up to n passes of up to n comparisons each may be needed, which makes it slow when many elements are far from their final positions.
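
A minimal Bubble Sort sketch (illustrative Python, not from the notes); the nested passes are what give the O(n^2) bound:

    def bubble_sort(a):
        n = len(a)
        for i in range(n - 1):             # up to n - 1 passes
            swapped = False
            for j in range(n - 1 - i):     # compare adjacent pairs
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]
                    swapped = True
            if not swapped:                # no swaps: already sorted
                break
        return a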

5
Q

Why does Insertion Sort achieve O(n^2) complexity despite O(n log n) comparisons?

A

Even when binary search is used to find each insertion point, reducing the comparisons to O(log n) per insertion (O(n log n) in total), every insertion may still require shifting up to O(n) elements to make room, so the overall time complexity remains O(n^2).
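
A sketch of the binary-search variant this card refers to (the helper binary_insertion_sort is illustrative): the comparisons drop to O(log n) per insertion, but the shifts do not.

    import bisect

    def binary_insertion_sort(a):
        for i in range(1, len(a)):
            x = a[i]
            pos = bisect.bisect_right(a, x, 0, i)   # O(log i) comparisons
            a[pos + 1:i + 1] = a[pos:i]             # up to O(i) element shifts
            a[pos] = x
        return a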

6
Q

Describe how Mergesort works and its time complexity.

A

Mergesort divides the array into two halves, recursively sorts them, and merges the sorted halves. It has a time complexity of O(n log n) due to the T(n) = 2T(n/2) + O(n) recurrence relation.
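
A compact, non-in-place Mergesort sketch for reference; the recursion mirrors T(n) = 2T(n/2) + O(n):

    def merge_sort(a):
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left = merge_sort(a[:mid])     # T(n/2)
        right = merge_sort(a[mid:])    # T(n/2)
        merged, i, j = [], 0, 0        # merge the sorted halves in O(n)
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]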

7
Q

What is the primary disadvantage of Mergesort?

A

Mergesort is not in-place: its merging phase requires O(n) additional memory for an auxiliary array, which makes it less space-efficient than in-place sorts such as Quicksort.

8
Q

How does Quicksort achieve O(n log n) on average?

A

Quicksort picks a pivot, partitions the elements around it in O(n) time, and recursively sorts the two sides. When the pivot splits the array reasonably evenly, the recursion depth is O(log n), giving O(n log n) work overall.
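
A short Quicksort sketch with a random pivot (illustrative; it builds new lists for clarity instead of partitioning in place):

    import random

    def quicksort(a):
        if len(a) <= 1:
            return a
        pivot = random.choice(a)                   # random pivot
        less    = [x for x in a if x < pivot]      # O(n) partitioning
        equal   = [x for x in a if x == pivot]
        greater = [x for x in a if x > pivot]
        return quicksort(less) + equal + quicksort(greater)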

9
Q

What is the worst-case time complexity of Quicksort, and how can it occur?

A

The worst-case time complexity is O(n^2), which occurs when the pivot is consistently the smallest or largest element, leading to unbalanced partitions.

10
Q

Why is choosing the median of three random elements beneficial in Quicksort?

A

It increases the likelihood of a balanced partition, reducing the chances of the worst-case O(n^2) time complexity and improving average performance.
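
One possible way to pick such a pivot, assuming the array has at least three elements (the helper name is made up for illustration):

    import random

    def median_of_three(a):
        # Sample three distinct positions and use the middle value as the pivot.
        i, j, k = random.sample(range(len(a)), 3)
        return sorted([a[i], a[j], a[k]])[1]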

11
Q

What problem does the Selection Algorithm solve, and what is its optimal time complexity?

A

The Selection Algorithm finds the kth smallest element in an unordered set. With random pivots (Quickselect) it runs in expected O(n) time, which is optimal since every element must be examined at least once.
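
A Quickselect sketch with random pivots (illustrative; k is taken to be 1-indexed, so k = 1 returns the minimum):

    import random

    def quickselect(a, k):
        # Expected O(n): recurse into only one side of the partition.
        pivot = random.choice(a)
        less    = [x for x in a if x < pivot]
        equal   = [x for x in a if x == pivot]
        greater = [x for x in a if x > pivot]
        if k <= len(less):
            return quickselect(less, k)
        if k <= len(less) + len(equal):
            return pivot
        return quickselect(greater, k - len(less) - len(equal))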

12
Q

Why is the Selection problem less complex than Sorting?

A

Selection requires finding a single element rather than ordering all elements, which allows for O(n) time complexity without full sorting.

13
Q

How does the Master Theorem apply to the Mergesort recurrence T(n) = 2T(n/2) + O(n)?

A

With a = 2, b = 2, and k = 1, we get a = b^k, leading to T(n) = O(n log n), which is the time complexity of Mergesort.

14
Q

Describe the “Counting Inversions” problem.

A

Given a sequence a_1, …, a_n, an inversion is a pair of positions (i, j) with i < j but a_i > a_j. Counting the inversions measures how far the sequence is from being sorted (its “unsortedness”) or how similar one ranking is to another.
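
Inversions can be counted as a by-product of Mergesort in O(n log n) time; a sketch, assuming the input is a plain Python list:

    def count_inversions(a):
        # Returns (sorted copy of a, number of inversions).
        if len(a) <= 1:
            return a, 0
        mid = len(a) // 2
        left, inv_left = count_inversions(a[:mid])
        right, inv_right = count_inversions(a[mid:])
        merged, i, j, inv = [], 0, 0, inv_left + inv_right
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
                inv += len(left) - i   # every remaining left element forms an inversion with right[j]
        return merged + left[i:] + right[j:], inv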

15
Q

What is the significance of log(n!) in the context of sorting lower bounds?

A

log(n!) is the minimum number of comparisons any comparison-based sort can guarantee: the sort must distinguish between the n! possible orderings of the input, and each comparison yields only one bit of information, so at least log2(n!) = Θ(n log n) comparisons are needed, giving the Ω(n log n) lower bound for comparison-based sorting.
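
For reference, a short derivation (using simple bounding rather than the full Stirling formula) of why log(n!) = Θ(n log n):

    \log_2(n!) = \sum_{i=1}^{n} \log_2 i \le n \log_2 n = O(n \log n)

    \log_2(n!) \ge \sum_{i=n/2}^{n} \log_2 i \ge \tfrac{n}{2} \log_2 \tfrac{n}{2} = \Omega(n \log n)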

16
Q

Explain the lower bound concept for sorting and searching algorithms.

A

The lower bound is based on information-theoretic arguments: each comparison yields only a binary answer, so distinguishing the n! possible input orderings requires Ω(n log n) comparisons for sorting, and by the same argument searching a sorted list of n elements requires Ω(log n) comparisons.

17
Q

How does Bucketsort achieve O(m + n) time complexity, and in what cases is it optimal?

A

Bucketsort distributes the n elements into m buckets covering a known range and then reads the buckets back in order, taking O(m + n) time. It is optimal when the keys come from a small known range, or are uniformly distributed so that each bucket stays small.
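
A minimal Bucketsort sketch, assuming non-negative integer keys known to lie in the range [0, m):

    def bucket_sort(a, m):
        # O(m + n): one pass to fill the m buckets, one pass to read them out.
        buckets = [[] for _ in range(m)]
        for x in a:                # O(n)
            buckets[x].append(x)
        result = []
        for b in buckets:          # O(m + n) overall
            result.extend(b)
        return result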

18
Q

What is the optimal time complexity for finding the median in the “Center of a Point Set on the Line” problem?

A

The optimal point minimizing the sum of distances to given points on a line is the median, achievable in O(n) time for selection without full sorting.
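
A sketch of the 1-D center computation; sorting is used here for brevity, though a linear-time selection (e.g. the Quickselect sketch above) would achieve the optimal O(n):

    def line_center(points):
        # The (lower) median minimizes the sum of absolute distances.
        s = sorted(points)
        return s[(len(s) - 1) // 2]

    # Example: line_center([1, 2, 9]) == 2, with total distance 1 + 0 + 7 = 8.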

19
Q

What are the main differences between in-place and non-in-place algorithms?

A

In-place algorithms (e.g., Quicksort) rearrange elements within the input array using only a small amount of extra memory (constant, plus the recursion stack), whereas non-in-place algorithms (e.g., Mergesort) need O(n) additional memory for auxiliary arrays.

20
Q

How does using random pivots in Quicksort help achieve expected O(n log n) complexity?

A

Random pivots make extremely unbalanced splits unlikely on every input, so the partitions are balanced on average; this gives expected O(n log n) running time and makes the O(n^2) worst case very improbable, though it is not strictly ruled out.