Sorting Flashcards

1
Q

What is a heap?

A
  • an array visualized as a binary tree
  • a binary tree ordered by the heap property (each parent is ordered with respect to its children)
  • an implementation of a priority queue
  • it must be a complete binary tree, which means every level is completely filled before a new level is started (and the last level fills from left to right)

ex (a max heap):
5
3 | 4
1 2 | 0 -1
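A minimal Java sketch of how the example above lives in an array, read level by level, using the child-index formulas from the later cards (the HeapLayout class name is just illustrative):

class HeapLayout {
    public static void main(String[] args) {
        // The example heap above, stored level by level in an array:
        // index:    0  1  2  3  4  5   6
        int[] heap = {5, 3, 4, 1, 2, 0, -1};

        for (int parent = 0; parent < heap.length; parent++) {
            int left = 2 * parent + 1;   // left child index
            int right = 2 * parent + 2;  // right child index
            if (left < heap.length)
                System.out.println(heap[parent] + " -> left child " + heap[left]);
            if (right < heap.length)
                System.out.println(heap[parent] + " -> right child " + heap[right]);
        }
    }
}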

2
Q

What is a max heap?

A

All parent elements must be greater than or equal to both of their child elements, so the largest element sits at the root

3
Q

What does the build-max-heap method do in heaps?

A

builds a max heap from an unsorted array

4
Q

What does the maxheapify method do in heaps?

A
  • assumes the subtrees rooted at the node's children are already valid max heaps
  • corrects a single violation of the heap property at a subtree's root by sifting that value down
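A rough Java sketch of both methods, assuming a 0-indexed array; the MaxHeap class and method names are illustrative, not from any particular library:

class MaxHeap {
    // Corrects a single violation of the max-heap property at index i,
    // assuming the subtrees rooted at i's children are already max heaps.
    static void maxHeapify(int[] arr, int heapSize, int i) {
        int left = 2 * i + 1;
        int right = 2 * i + 2;
        int largest = i;
        if (left < heapSize && arr[left] > arr[largest]) largest = left;
        if (right < heapSize && arr[right] > arr[largest]) largest = right;
        if (largest != i) {
            int temp = arr[i];                   // swap the parent with its larger child
            arr[i] = arr[largest];
            arr[largest] = temp;
            maxHeapify(arr, heapSize, largest);  // keep sifting the value down
        }
    }

    // Builds a max heap from an unsorted array by calling maxHeapify
    // on every non-leaf node, from the bottom of the tree up to the root.
    static void buildMaxHeap(int[] arr) {
        for (int i = arr.length / 2 - 1; i >= 0; i--) {
            maxHeapify(arr, arr.length, i);
        }
    }
}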

5
Q

What is the time complexity of maxheapify?

A

lg(n)

6
Q

What is the time complexity of heapsort?

A

nlgn

7
Q

What are the steps for heapsort?

A
  1. build a max heap, so that the element with the greatest value is at the top.
  2. swap it with the last element of the heap and shrink the heap by one, so the max is locked into its final sorted position.
  3. max-heapify the new root, then repeat steps two and three until only one element remains in the heap.
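A short sketch of those steps in Java; it reuses the illustrative MaxHeap.buildMaxHeap and MaxHeap.maxHeapify helpers sketched under the max-heapify card above, so it is not standalone:

class HeapSortSketch {
    static void heapSort(int[] arr) {
        MaxHeap.buildMaxHeap(arr);            // step 1: the largest value is now at index 0
        for (int end = arr.length - 1; end > 0; end--) {
            int temp = arr[0];                // step 2: swap the max with the last heap element
            arr[0] = arr[end];
            arr[end] = temp;                  // the max now sits in its final sorted position
            MaxHeap.maxHeapify(arr, end, 0);  // restore the max-heap property on the smaller heap
        }                                     // step 3: repeat until only one element remains
    }
}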
8
Q

How do we find the left child of any parent element?

A

multiply the parent element’s index by 2 and add 1.

let left = 2 * indexOfParent + 1

9
Q

How do we find the right child of any parent element?

A

multiply the parent element’s index by 2 and add 2

let right = 2 * indexOfParent + 2

10
Q

What are the steps for quick sort?

A
  1. choose a pivot
  2. find itemFromLeft that is larger than pivot
  3. find itemFromRight that is smaller than pivot
  4. swap itemFromLeft with itemFromRight
  5. repeat the process
  6. stop when index of itemFromLeft > index of itemFromRight
  7. swap itemFromLeft with the pivot
11
Q

How do you choose a pivot for quick sort?

A

- choose the middle element, or
- use the median-of-three approach:
look at the first, middle, and last positions of the array, sort those three values in place, and choose the element that ends up in the middle position as the pivot
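A hedged Java sketch of the median-of-three idea (medianOfThree and swap are illustrative helper names for this sketch, not part of any library):

class MedianOfThreeSketch {
    // Sorts arr[first], arr[middle], arr[last] in place and returns the middle index;
    // the value left at that middle index is then used as the pivot.
    static int medianOfThree(int[] arr, int first, int last) {
        int middle = (first + last) / 2;
        if (arr[middle] < arr[first]) swap(arr, first, middle);
        if (arr[last] < arr[first]) swap(arr, first, last);
        if (arr[last] < arr[middle]) swap(arr, middle, last);
        return middle;  // arr[first] <= arr[middle] <= arr[last] now holds
    }

    static void swap(int[] arr, int i, int j) {
        int temp = arr[i];
        arr[i] = arr[j];
        arr[j] = temp;
    }
}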

12
Q

What is the time complexity of quick sort?

A

Worst case: n^2
average case: nlogn

this depends on how effectively you choose the pivot

13
Q

What advantage does quick sort have over merge sort?

A

it is an in-place sorting algorithm, which means it does not need any auxiliary data structures (merge sort needs an extra O(n) array for its merge step)

14
Q

Is quick sort a divide and conquer algorithm?

A

yes

15
Q

Implement quicksort

A
function QuickSort(arr, left = 0, right = arr.length - 1) {
  let len = arr.length,
      index

  if (len > 1) {
    index = partition(arr, left, right)

    // Recurse on the left partition if it has more than one element
    if (left < index - 1) {
      QuickSort(arr, left, index - 1)
    }

    // Recurse on the right partition if it has more than one element
    if (index < right) {
      QuickSort(arr, index, right)
    }
  }

  return arr
}

function partition(arr, left, right) {
  let middle = Math.floor((right + left) / 2),
      pivot = arr[middle],
      i = left,                 // Start pointer at the first item in the range
      j = right                 // Start pointer at the last item in the range

  while (i <= j) {
    // Move the left pointer right until it reaches a value
    // greater than or equal to the pivot value
    while (arr[i] < pivot) {
      i++
    }
    // Move the right pointer left until it reaches a value
    // less than or equal to the pivot value
    while (arr[j] > pivot) {
      j--
    }
    // If the left pointer is still less than or equal to the
    // right pointer, swap the values and step both pointers
    if (i <= j) {
      [arr[i], arr[j]] = [arr[j], arr[i]]  // ES6 destructuring swap
      i++
      j--
    }
  }

  return i
}

// Java program for implementation of QuickSort
class QuickSort
{
    /* This function takes last element as pivot,
       places the pivot element at its correct
       position in sorted array, and places all
       smaller (smaller than pivot) to left of
       pivot and all greater elements to right
       of pivot */
    int partition(int arr[], int low, int high)
    {
        int pivot = arr[high];
        int i = (low - 1); // index of smaller element
        for (int j = low; j < high; j++)
        {
            // If the current element is smaller than the pivot
            if (arr[j] < pivot)
            {
                i++;

                // swap arr[i] and arr[j]
                int temp = arr[i];
                arr[i] = arr[j];
                arr[j] = temp;
            }
        }

        // swap arr[i+1] and arr[high] (the pivot)
        int temp = arr[i + 1];
        arr[i + 1] = arr[high];
        arr[high] = temp;

        return i + 1;
    }

    /* The main function that implements QuickSort()
       arr[] --> Array to be sorted,
       low   --> Starting index,
       high  --> Ending index */
    void sort(int arr[], int low, int high)
    {
        if (low < high)
        {
            /* pi is partitioning index, arr[pi] is
               now at right place */
            int pi = partition(arr, low, high);

            // Recursively sort elements before
            // partition and after partition
            sort(arr, low, pi - 1);
            sort(arr, pi + 1, high);
        }
    }

    /* A utility function to print array of size n */
    static void printArray(int arr[])
    {
        int n = arr.length;
        for (int i = 0; i < n; ++i)
            System.out.print(arr[i] + " ");
        System.out.println();
    }
}
16
Q

What is the runtime of Insertion sort?

A

Average n^2

worst n^2

17
Q

What is the runtime of merge sort?

A

Worst nlgn

average nlgn

18
Q

How much space does merge sort take?

A

O(n)

the merge step copies elements into an auxiliary array of size n (plus O(lg n) stack space for the recursion)
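A minimal merge sort sketch in Java showing where that O(n) auxiliary space comes from; MergeSortSketch and its method names are illustrative:

class MergeSortSketch {
    static void mergeSort(int[] arr, int left, int right) {
        if (left >= right) return;
        int mid = (left + right) / 2;
        mergeSort(arr, left, mid);       // sort the left half
        mergeSort(arr, mid + 1, right);  // sort the right half
        merge(arr, left, mid, right);    // merge the two sorted halves
    }

    static void merge(int[] arr, int left, int mid, int right) {
        int[] temp = new int[right - left + 1];  // auxiliary array: the O(n) extra space
        int i = left, j = mid + 1, k = 0;
        while (i <= mid && j <= right)
            temp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
        while (i <= mid) temp[k++] = arr[i++];
        while (j <= right) temp[k++] = arr[j++];
        System.arraycopy(temp, 0, arr, left, temp.length);  // copy the merged run back
    }
}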

19
Q

How much space does insertion sort take?

A

O(1)

20
Q

How much space does quick sort take?

A

O(lg n)

Since quicksort calls itself on the order of log(n) times (in the average case, worst case number of calls is O(n)), at each recursive call a new stack frame of constant size must be allocated. Hence the O(log(n)) space complexity.

21
Q

When would you use bucket sorting?

A

-When you know the upper and lower bounds of the data.
Bucketing is a very effective idea whenever we are confident that the distribution
of data will be roughly uniform. It is the idea that underlies hash tables, kd-trees,
and a variety of other practical data structures. The downside of such techniques
is that the performance can be terrible when the data distribution is not what we
expected. Although data structures such as balanced binary trees offer guaranteed
worst-case behavior for any input distribution, no such promise exists for heuristic
data structures on unexpected input distributions.

Bucket sort with insertion sort in the buckets is only reasonable when we can expect that there will be only a few items in each bucket. When there are only a few items, insertion sort is fine.

In real life, this doesn’t happen too often. Usually it’s when we expect the data to be uniformly distributed because we’re sorting into hash order or something like that.

Bucket sort is most commonly used when it’s the entire sort – i.e., the buckets don’t need to be sorted at all and you can just append each item into the bucket list.

The performance of bucket sort degrades with clustering; if many values occur close together, they will all fall into a single bucket and be sorted slowly.

example of when bucket sorting is a benefit

  • say we are using bubble sort on a list of 10 elements
  • we use bucket sort to break the list into 2 lists of 5
  • then we run bubble sort on the 2 lists
  • O(n^2) on each list
  • new runtime = 5^2 + 5^2
  • the 5 comes from each list’s length; the sorted lists are then combined
  • we make 50 potential comparisons instead of 100 potential comparisons
22
Q

Describe bucket sort

A

The process of bucket sort can be understood as a scatter-gather approach. The elements are first scattered into buckets then the elements of buckets are sorted. Finally, the elements are gathered in order.

Bucket sort breaks down a big list into smaller lists to be sorted by another algorithm and then those lists get combined into one large sorted list.

example:
{1, 2, 4, 3, 8, 7, 5, 6}

{1,2,4,3}

{8, 7, 5, 6}

use insertion sort on the buckets to sort them
combine the two buckets into one list
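A rough Java sketch of the scatter-gather idea, assuming the value range is known up front (card 21’s precondition); BucketSortSketch, the bucket count, and the range parameters are all illustrative, and each bucket is sorted with a small insertion sort as described above:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

class BucketSortSketch {
    static void bucketSort(int[] arr, int min, int max, int numBuckets) {
        List<List<Integer>> buckets = new ArrayList<>();
        for (int b = 0; b < numBuckets; b++) buckets.add(new ArrayList<>());

        // Scatter: drop each value into the bucket that covers its sub-range
        double width = (double) (max - min + 1) / numBuckets;
        for (int value : arr) {
            buckets.get((int) ((value - min) / width)).add(value);
        }

        // Sort each bucket with insertion sort (fine while buckets stay small)
        for (List<Integer> bucket : buckets) {
            for (int i = 1; i < bucket.size(); i++) {
                int key = bucket.get(i), j = i - 1;
                while (j >= 0 && bucket.get(j) > key) {
                    bucket.set(j + 1, bucket.get(j));
                    j--;
                }
                bucket.set(j + 1, key);
            }
        }

        // Gather: concatenate the sorted buckets back into the original array
        int k = 0;
        for (List<Integer> bucket : buckets)
            for (int value : bucket) arr[k++] = value;
    }

    public static void main(String[] args) {
        int[] data = {1, 2, 4, 3, 8, 7, 5, 6};  // the example from this card
        bucketSort(data, 1, 8, 2);               // two buckets, as in the example
        System.out.println(Arrays.toString(data));
    }
}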

23
Q

What is the runtime of bucket sort?

A

worst case: O(n^2)
average case: O(n + k)
k is the number of buckets created

24
Q

What is the runtime of bubble sort?

A

worst: O(n^2)
average: O(n^2)

25
Q

What is the space complexity of bubble sort?

A

O(1)

26
Q

How do you implement bubble sort?

A
class Main {
  int[] arr = {5, 1, 7, 4};

  public static void main(String[] args) {
    Main m = new Main();
    m.bubbleSort();
    // Print the sorted array
    for (int value : m.arr) {
      System.out.print(value + " ");
    }
  }

  public void bubbleSort() {
    // Each pass bubbles the largest remaining value toward the end of the array
    for (int i = 1; i < arr.length; i++) {
      for (int j = 0; j < arr.length - 1; j++) {
        // Swap adjacent elements that are out of order
        if (arr[j] > arr[j + 1]) {
          int temp = arr[j];
          arr[j] = arr[j + 1];
          arr[j + 1] = temp;
        }
      }
    }
  }
}
27
Q

What is the runtime of selection sort?

A

worst: O(n^2)
average: O(n^2)

28
Q

How do you implement selection sort?

A
// Java program for implementation of Selection Sort 
class SelectionSort 
{ 
    void sort(int arr[]) 
    { 
        int n = arr.length; 
        // One by one move boundary of unsorted subarray 
        for (int i = 0; i < n-1; i++) 
        { 
            // Find the minimum element in unsorted array 
            int min_idx = i; 
            for (int j = i+1; j < n; j++) 
                if (arr[j] < arr[min_idx]) 
                    min_idx = j; 
            // Swap the found minimum element with the first 
            // element 
            int temp = arr[min_idx]; 
            arr[min_idx] = arr[i]; 
            arr[i] = temp; 
        } 
    } 
    // Prints the array 
    void printArray(int arr[]) 
    { 
        int n = arr.length; 
        for (int i = 0; i < n; ++i)
            System.out.print(arr[i] + " ");
        System.out.println();
    }
}
29
Q

Explain the steps of selection sort

A
  1. Starting from the first element, we search for the smallest element in the array and swap it with the element in the first position.
  2. We then move on to the second position and look for the smallest element in the subarray that runs from index 1 to the last index.
  3. We swap the element at the second position in the original array (the first position of that subarray) with that second-smallest element.
  4. This is repeated until the array is completely sorted.
30
Q

What is the space complexity of selection sort?

A

O(1)

31
Q

For merge sort, at each level j = 0, 1, 2, …, log2(n) of the recursion tree, there are ___ subproblems, each of size ___

A

2^j, n/2^j

why?
each node of the recursion tree splits into 2 children, so level j contains 2^j subproblems, and each subproblem at level j works on n/2^j elements

so every level does about 2^j * (n/2^j) = n work; that linear work per level, over the log2(n) + 1 levels, is what gives merge sort its n lg n runtime