Sorting Flashcards

1
Q

Describe the Heap Sort

A

The input of heap sort is a list of values.
The heap is viewed as a nearly complete binary tree embedded in the array. In a max heap, the root contains the largest value, and each child's value is less than or equal to its parent's.

Heap sort consists of 3 routines: build_heap, heapify & heap_sort.

  1. Transform the input into a valid max heap (build_heap). With 0-based indices, positions n/2 … n-1 hold the leaf nodes, so the algorithm starts at node n/2 - 1. For each node from n/2 - 1 down to the root (index 0), the subtree is transformed so that the heap constraint is fulfilled: the node is exchanged with the larger of its 2 children if its value is smaller than that child's (heapify).
  2. After a swap, the heap constraint may be violated in the affected child subtree, so heapify is called recursively on that subtree for as long as swaps occur.
  3. Once the tree is heapified, the largest value is located at the root node. The root is exchanged with the last leaf node, and the tree is heapified again starting from the new root to restore the heap constraint; the moved maximum (the last leaf) is excluded from further heapify calls. This is repeated until the whole array is sorted.
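The three routines above can be sketched in Python as follows (a minimal sketch; the names build_heap, heapify and heap_sort follow the card's terminology, not any library API):

```python
def heapify(a, n, i):
    """Sift the value at index i down until the max-heap property
    holds for the subtree rooted at i, considering only a[0:n]."""
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and a[left] > a[largest]:
        largest = left
    if right < n and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]  # swap with the larger child
        heapify(a, n, largest)               # repair the affected subtree

def build_heap(a):
    n = len(a)
    # Indices n//2 .. n-1 are leaves; start at the last internal node.
    for i in range(n // 2 - 1, -1, -1):
        heapify(a, n, i)

def heap_sort(a):
    build_heap(a)
    for end in range(len(a) - 1, 0, -1):
        a[0], a[end] = a[end], a[0]  # move the current maximum to the back
        heapify(a, end, 0)           # re-heapify the remaining prefix
    return a
```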
2
Q

What is the advantage of the Heap Sort compared to Insertion Sort and Merge Sort?

A

Unlike insertion sort, its worst-case complexity is O(n log n) (vs. O(n²)).
Unlike merge sort, the procedure sorts in place (only a constant number of elements are stored outside the input array at any time).

3
Q

Time complexity of heap sort

A

Worst case: O(n log n)
Average case: O(n log n)
Best case: O(n log n) (distinct keys) or O(n) (equal keys)

4
Q

Name an exemplary application of a max heap other than heap sort

A

Priority queues. They maintain a collection of objects with assigned priorities for efficient retrieval and management of the highest-priority element.
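As an illustration, Python's standard heapq module provides a binary min-heap; a max-priority queue can be sketched on top of it by negating the priorities (the task names here are invented for the example):

```python
import heapq

# Max-priority queue sketch: heapq is a min-heap, so we store
# negated priorities and the smallest tuple pops first.
tasks = []
heapq.heappush(tasks, (-3, "write report"))
heapq.heappush(tasks, (-5, "fix outage"))
heapq.heappush(tasks, (-1, "water plants"))

priority, task = heapq.heappop(tasks)  # highest priority comes out first
# priority == -5, task == "fix outage"
```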

5
Q

Why is quicksort often a good choice despite its poor worst-case running time?

A

While the worst-case running time is O(n²), the average running time is O(n log n), and the constant factors hidden in that bound are quite small.

6
Q

Under which conditions will the quicksort algorithm run in its worst-case time complexity?

A

On maximally unbalanced partitions (e.g. the 1st partition contains n-1 items, the 2nd partition contains 0 items).
With the last element as pivot, this is the case on already sorted inputs.

7
Q

Describe the quicksort algorithm

A

Quicksort is a recursive divide-and-conquer algorithm for efficiently sorting inputs.
It consists of 2 parts: partitioning & recursively calling quicksort on both sub-partitions.

Partitioning: The current partition is divided into 3 parts: part 1 contains the values <= the pivot value, part 2 contains the values > the pivot value, and part 3 is the pivot element itself, which can be taken as the last value of the partition. Looping over the partition, each value is compared to the pivot. If it is smaller than or equal to the pivot, it is exchanged with the first value of the second part; otherwise it stays in place. As a last step, the pivot is also exchanged with the first value of the second part, so that the partition is now separated into 2 consecutive parts: values <= the pivot, followed by the pivot, followed by values > the pivot.

Recursive quicksort: Quicksort is then applied separately to the first part and to the second part, excluding the pivot used in the previous partitioning step.

This is repeated until all sub-partitions are sorted.
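The partitioning scheme described above (last element as pivot, one boundary index) is the Lomuto scheme; a minimal Python sketch:

```python
def partition(a, lo, hi):
    """Lomuto partition: a[hi] is the pivot; returns its final index."""
    pivot = a[hi]
    i = lo  # boundary: a[lo:i] holds the values <= pivot seen so far
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]  # grow the <= part
            i += 1
    a[i], a[hi] = a[hi], a[i]  # place the pivot between the two parts
    return i

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)   # values <= pivot
        quicksort(a, p + 1, hi)   # values > pivot
    return a
```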

8
Q

Time complexity of quicksort

A

Worst case: O(n²)
Average case: O(n log n)
Best case: O(n log n)

9
Q

How can quicksort be made more efficient on average than the default procedure of always choosing the last element of the partition as the pivot?

A

By choosing a random element from the partition as pivot element (random sampling), we expect the split of the input array to be reasonably well balanced on average, yielding better behaviour.
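A minimal Python sketch of a randomized Lomuto-style partition; the only change relative to a fixed last-element pivot is the initial random swap into the pivot slot:

```python
import random

def randomized_partition(a, lo, hi):
    # Move a uniformly random element into the pivot position, then
    # run a standard Lomuto partition with a[hi] as the pivot.
    r = random.randint(lo, hi)
    a[r], a[hi] = a[hi], a[r]
    pivot, i = a[hi], lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i  # final index of the pivot
```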

10
Q

Describe the counting sort

A

Counting sort is not a comparison sort like quicksort, merge sort or heap sort, so the Ω(n log n) lower bound for comparison sorts does not apply here.

The precondition for counting sort is that the input consists only of non-negative integer values from a known range.

In counting sort, 3 arrays are maintained: the input array of unsorted values (A), a temporary counting array (C), and the sorted output array (B).

First, C is initialized as an array of zeros with length(C) = max(A) + 1, so that there is one counter for each possible value 0 … max(A).

Then, for each value A(i) in A, the counter C(A(i)) is increased by one. Now C(i) represents the frequency of the value i in the input array A (e.g. C(3) = 2 means the value 3 occurs twice in A).

Next, an accumulated sum over C is calculated: C(i) = C(i) + C(i-1). C(i) now represents the number of values less than or equal to i in A.

Finally, the input A is iterated in reverse and B is filled by B(C(A(j))) = A(j), after which the counter C(A(j)) is decreased by 1 (with 1-based indexing into B). Processing A in reverse keeps the sort stable.
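A Python sketch of the steps above, using 0-based arrays (so the counter is decremented before the write into B, and C gets max(A) + 1 slots to cover the value 0):

```python
def counting_sort(A):
    """Counting sort following the A/B/C naming of the card;
    assumes non-negative integer keys."""
    k = max(A)
    C = [0] * (k + 1)          # one counter per possible key 0..k
    for value in A:
        C[value] += 1          # C[i] = number of occurrences of i
    for i in range(1, k + 1):
        C[i] += C[i - 1]       # C[i] = number of values <= i
    B = [0] * len(A)
    for value in reversed(A):  # reverse iteration keeps the sort stable
        C[value] -= 1          # 0-based: decrement first, then write
        B[C[value]] = value
    return B
```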

11
Q

Time complexity of counting sort

A

k = largest value in the input

Best case: O(n + k), which is O(n) when k = O(n)
Average case: O(n + k)
Worst case: O(n + k)

12
Q

Under which conditions does counting sort perform well?

A

In situations in which the variation in keys (k) is not significantly greater than the number of input items (n)

13
Q

Describe the radix sort

A

Like counting sort, radix sort is not a comparison-based sorting algorithm.

Radix sort uses an iterative approach that sorts the values on the least significant digit first. The iteration is repeated for each digit until the most significant digit is reached; for 4-digit numbers this requires 4 iterations.

The counting sort algorithm can be used as a subroutine to sort on each digit.

In order for radix sort to work correctly, the per-digit sorts must be stable.
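A Python sketch of least-significant-digit radix sort with an inlined stable counting sort per digit (base 10 is an assumption for illustration; any base works):

```python
def radix_sort(A, base=10):
    """LSD radix sort for non-negative integers, using a stable
    counting sort on the digit (value // exp) % base in each pass."""
    if not A:
        return A
    exp = 1
    while max(A) // exp > 0:
        C = [0] * base
        for v in A:
            C[(v // exp) % base] += 1      # count digit occurrences
        for i in range(1, base):
            C[i] += C[i - 1]               # prefix sums: positions
        B = [0] * len(A)
        for v in reversed(A):              # reverse pass keeps it stable
            d = (v // exp) % base
            C[d] -= 1
            B[C[d]] = v
        A = B                              # next pass sorts the next digit
        exp *= base
    return A
```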

14
Q

Time complexity of radix sort

A

O(w * n), where w is the key length (number of digits)
