Hansung Flashcards
What is a heap?
A binary tree with two constraints
- It is complete
- Each child has a value “greater than or equal to” its parent
How do you add to a heap?
- add the element to the next available space in the tree
- Percolate the value up the tree to maintain the correct ordering
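The two steps above can be sketched as follows, assuming a min-heap stored in a Python list (the helper name `heap_add` is illustrative, not from the flashcards):

```python
def heap_add(heap, value):
    """Append value, then percolate it up until its parent is <= it."""
    heap.append(value)               # next available space in the tree
    k = len(heap) - 1
    while k > 0:
        parent = (k - 1) // 2
        if heap[parent] <= heap[k]:  # ordering already correct
            break
        heap[k], heap[parent] = heap[parent], heap[k]
        k = parent

h = []
for v in [5, 3, 8, 1]:
    heap_add(h, v)
# h is now a valid min-heap with 1 at the root
```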
What data structures use heaps?
Priority Queues
How does removeMin work on priority queues?
- Pop the root
- Replace it with the last element in the heap
- Percolate this element down, choosing the minimum child at each step, until the heap ordering is restored
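A sketch of those steps on a list-based min-heap (the name `remove_min` is illustrative): pop the root, move the last element to the root, then percolate it down via the minimum child.

```python
def remove_min(heap):
    root = heap[0]
    last = heap.pop()                # last element in the heap
    if heap:
        heap[0] = last               # replace the root with it
        k, n = 0, len(heap)
        while True:
            child = 2 * k + 1        # left child
            if child >= n:
                break
            if child + 1 < n and heap[child + 1] < heap[child]:
                child += 1           # pick the minimum child
            if heap[k] <= heap[child]:
                break                # ordering restored
            heap[k], heap[child] = heap[child], heap[k]
            k = child
    return root

h = [1, 3, 2, 7, 5]                  # already a valid min-heap
m = remove_min(h)
```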
How do we navigate a heap?
- The root of the tree is at array location 0
- The last element is size()-1
- The parent of a node k is at floor((k-1)/2)
- The children of node k are at array locations 2k + 1 and 2k + 2
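A quick check of the navigation formulas above for an example node `k = 4`:

```python
k = 4
parent = (k - 1) // 2                # floor((k-1)/2)
left, right = 2 * k + 1, 2 * k + 2   # children's array locations
```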
What is the time complexity of a single percolation step in a heap?
Θ(1)
What is the height of a heap as a function of n?
Θ(log(n))
What is the time complexity of add/removeMin on a heap?
Θ(log(n)) in the worst case
Θ(1) normally
What is the worst case time complexity of Heap sort?
O(n log(n))
What is the space complexity of heap sort?
O(1)
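The heap sort cards above can be sketched in-place with a max-heap (so the array ends up ascending); helper names here are illustrative. Build the heap bottom-up, then repeatedly swap the maximum to the end and percolate the new root down:

```python
def sift_down(a, k, n):
    """Percolate a[k] down within a[0:n], choosing the maximum child."""
    while True:
        child = 2 * k + 1
        if child >= n:
            break
        if child + 1 < n and a[child + 1] > a[child]:
            child += 1
        if a[k] >= a[child]:
            break
        a[k], a[child] = a[child], a[k]
        k = child

def heap_sort(a):
    n = len(a)
    for k in range(n // 2 - 1, -1, -1):   # build a max-heap bottom-up
        sift_down(a, k, n)
    for end in range(n - 1, 0, -1):       # move each max to the end
        a[0], a[end] = a[end], a[0]
        sift_down(a, 0, end)

data = [5, 1, 4, 2, 3]
heap_sort(data)                           # sorts in O(1) extra space
```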
What is a B-tree?
Balanced multiway trees designed for fast search, insert, delete, minimum/maximum, and finding successors/predecessors
What are M-way trees?
Trees where each non-leaf node has M children
What is the access time of a M-way tree?
O(logm(n)), where logm(n) = log2(n)/log2(m)
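A quick numeric check of the identity above (the values n = 1,000,000 and m = 100 are illustrative):

```python
import math

n, m = 1_000_000, 100
access = math.log2(n) / math.log2(m)   # equals log base m of n, here 3.0
```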
What makes B+ trees different to B-Trees?
- Only leaf nodes contain data pointers, while internal nodes contain keys only
- Sequential access is possible like a linked list
- Searching is faster
- Insertion and Deletion are easier and faster
What makes B-Trees different to B+ Trees?
- All internal nodes and leaf nodes contain data pointers along with keys
- Sequential access of nodes is not possible
- Searching is slower
- Insertion and Deletion take more time
What values should you use for M in B+ trees?
Odd numbers
What is a Trie?
A multiway tree often used for storing large sets of words
How do tries work?
- They are trees with a possible branch for every letter of an alphabet
- All words end with a special letter “$”
- They compactify the edges in the tree in order to remove internal and external one-way branching
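A minimal dict-based trie sketch (without the edge compaction mentioned above), using "$" to mark the end of a word as in the card; the helper names are illustrative:

```python
def trie_add(root, word):
    """Follow/create a branch per letter, ending with the '$' marker."""
    node = root
    for ch in word + "$":
        node = node.setdefault(ch, {})

def trie_contains(root, word):
    """A word is present only if its full path ends at a '$' branch."""
    node = root
    for ch in word + "$":
        if ch not in node:
            return False
        node = node[ch]
    return True

t = {}
trie_add(t, "car")
trie_add(t, "cart")
```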
What are the uses of Tries?
- Way of implementing sets
- Provide quick insertion, deletion and find
- Good for spell checkers and auto-completion algorithms
What are the disadvantages of Tries?
- Waste large amounts of memory
How are Binary Tries different from Normal Tries?
They speed up finds but lose the multiway tree's advantage of reduced depth (a binary trie is deeper)
Why use Tries?
- Classic memory, time complexity trade-off
- Specialised for string keys
- Not too complicated to implement
What is the time complexity of a quick-find find?
O(1)
What is the time complexity of a quick-union find?
O(n) in the worst case
What is the time complexity of a quick find union?
O(n)
What is the time complexity of a quick-union union?
O(n) in the worst case
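Sketches of both representations behind the four cards above (class names are illustrative). Quick-find stores each element's set id directly, so find is O(1) but union must relabel ids in O(n); quick-union stores parent links, so both operations cost O(n) in the worst case (a degenerate tall tree):

```python
class QuickFind:
    def __init__(self, n):
        self.id = list(range(n))
    def find(self, p):                 # O(1): direct lookup
        return self.id[p]
    def union(self, p, q):             # O(n): relabel every element
        pid, qid = self.id[p], self.id[q]
        self.id = [qid if i == pid else i for i in self.id]

class QuickUnion:
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, p):                 # O(n) worst case: walk to root
        while self.parent[p] != p:
            p = self.parent[p]
        return p
    def union(self, p, q):             # O(n): dominated by the finds
        self.parent[self.find(p)] = self.find(q)

qf = QuickFind(5); qf.union(0, 1)
qu = QuickUnion(5); qu.union(0, 1)
```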
What is the time complexity of a Sequential (Linear) Search in each case?
Worse Case: Θ(n)
Best Case: Θ(1)
Average Case: Θ(n)
What is the time complexity of a binary search?
O(log2(n))
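A standard iterative binary search sketch over a sorted list: query the middle item and halve the range each step, giving the O(log2(n)) bound above.

```python
def binary_search(a, target):
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # query the middle item
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1           # discard the left half
        else:
            hi = mid - 1           # discard the right half
    return -1                      # not found

idx = binary_search([1, 3, 5, 7, 9], 7)
```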
What is the time complexity of an Insertion Sort?
Average Case: Θ(n^2)
Best Case: Θ(n)
What are the properties of Binary Search?
- Query the middle items
- Worst case number of queries = depth of the tree
- Balanced trees have O(log2(n)) levels
What are the properties of insertion sorts?
- It is stable (We only swap the ordering of two elements if one is strictly less than the other)
- It is in-place
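An insertion sort sketch illustrating both properties above: it shifts elements in-place, and it only moves an element past strictly greater ones, so equal keys keep their relative order (stability).

```python
def insertion_sort(a):
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:   # strict '>' preserves stability
            a[j + 1] = a[j]            # shift greater elements right
            j -= 1
        a[j + 1] = key                 # insert into its correct slot

xs = [3, 1, 2, 1]
insertion_sort(xs)
```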
What are the properties of Selection Sort?
- It is in-place
- It is not stable
- Always requires Θ(n^2) comparisons.
What is the time complexity of Selection Sort?
Θ(n^2)
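A selection sort sketch: it always scans the whole unsorted suffix for its minimum, hence Θ(n^2) comparisons regardless of input order, and the long-distance swap is what breaks stability.

```python
def selection_sort(a):
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):      # find the minimum of a[i:]
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]        # long-distance swap: not stable

xs = [4, 2, 5, 1]
selection_sort(xs)
```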
What are the properties of bubble sort?
- In-place
- Stable
- Space complexity of O(1)
What is the time complexity of bubble sort?
O(n^2)
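A bubble sort sketch matching the properties above: only adjacent swaps (in-place, O(1) extra space), and swapping only when strictly out of order keeps it stable.

```python
def bubble_sort(a):
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):     # largest item bubbles to the end
            if a[j] > a[j + 1]:        # strict '>' preserves stability
                a[j], a[j + 1] = a[j + 1], a[j]

xs = [3, 2, 1]
bubble_sort(xs)
```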
What is a loop invariant?
- A property of a loop that helps us understand why an algorithm is correct
What is a decision tree?
A way to visualise many algorithms
What does each case time complexity of a decision tree tell us?
Worst case: Depth of the deepest leaf
Best case: Depth of shallowest leaf
Average case: Average depth of leaves
What are the properties of Merge Sort?
- Stable
- Not in-place
- Merging sub-arrays is quick
- Worst case time complexity: O(nlog(n))
- Asymptotically optimal
How could you improve merge sort?
If sub-arrays fall below a certain threshold, we switch to insertion sort
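A sketch of that improvement (the threshold value is illustrative): below a small cutoff, sort the sub-array with insertion sort instead of recursing further.

```python
THRESHOLD = 8

def _insertion_sort(a):
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    if len(a) <= THRESHOLD:
        return _insertion_sort(list(a))    # cutoff: switch algorithms
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:            # '<=' keeps the merge stable
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

xs = merge_sort([9, 3, 7, 1, 8, 2, 6, 4, 5, 0])
```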
What is the time complexity of quicksort?
- Partitioning an array is Θ(n)
- Worst case: O(n^2)
- On average O(nlog(n))
- In practice much faster than merge sort, despite making ~39% more comparisons on average
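A quicksort sketch with Lomuto-style partitioning (each partition pass is Θ(n)); a random pivot makes the O(n^2) worst case unlikely on real inputs.

```python
import random

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    p = random.randint(lo, hi)         # random pivot choice
    a[p], a[hi] = a[hi], a[p]          # park the pivot at the end
    pivot, i = a[hi], lo
    for j in range(lo, hi):            # Θ(n) partition pass
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]          # pivot into its final position
    quicksort(a, lo, i - 1)
    quicksort(a, i + 1, hi)

xs = [5, 2, 9, 1, 7]
quicksort(xs)
```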
How does Knuth/Fisher Yates Shuffle work?
- Loop backwards through the array
- At every iteration, generate a random index in the unshuffled part of the array
- Exchange the item under that index with the current item
- Time complexity of Θ(n), is in-place
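The steps above sketched directly: loop backwards, draw a random index from the unshuffled prefix (inclusive of the current position), and swap.

```python
import random

def fisher_yates(a):
    for i in range(len(a) - 1, 0, -1):
        j = random.randint(0, i)       # index in the unshuffled part
        a[i], a[j] = a[j], a[i]        # exchange with the current item

xs = list(range(10))
fisher_yates(xs)                       # Θ(n) time, in-place
```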
What are the benefits of Multi-pivoting?
It improves performance as there are fewer cache misses
What are the properties of Counting Sort?
- No comparisons
- Stable
- Not in-place
- Time Complexity: Θ(n + k)
- Space Complexity: Θ(n + k)
- If n is much larger than k, it is a linear-time sorting algorithm
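A stable counting sort sketch for integer keys in range(k), matching the properties above: count occurrences, turn counts into starting positions via prefix sums, then place each item by scanning the input left to right (no comparisons anywhere).

```python
def counting_sort(a, k):
    count = [0] * k
    for x in a:                        # Θ(n): tally each key
        count[x] += 1
    total = 0
    for v in range(k):                 # Θ(k): prefix sums -> positions
        count[v], total = total, total + count[v]
    out = [0] * len(a)                 # Θ(n) extra space: not in-place
    for x in a:                        # left-to-right scan -> stable
        out[count[x]] = x
        count[x] += 1
    return out

ys = counting_sort([3, 1, 0, 3, 2], k=4)
```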
What are the properties of Radix Sort?
- Θ(d(n + k)), which is linear in n when the number of digits d is constant
- Stable
- Not in-place
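An LSD radix sort sketch for non-negative integers (the digit count is illustrative): one stable bucketing pass per base-10 digit, least significant digit first.

```python
def radix_sort(a, digits=4):
    for d in range(digits):            # least significant digit first
        buckets = [[] for _ in range(10)]
        for x in a:                    # appending preserves input order,
            buckets[(x // 10 ** d) % 10].append(x)   # so each pass is stable
        a = [x for b in buckets for x in b]
    return a

ys = radix_sort([170, 45, 75, 90, 802, 24, 2, 66])
```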