Sample Final Flashcards
Open hashing allows overflow into lists.
True.
A hash function is better if it produces few collisions.
True.
Hash functions cannot produce the same location for different keys.
False.
A hash function is better if each possible index is equally likely to occur given random keys.
True.
A process known as folding bits can reduce the impact of structure (a lack of randomness) on individual bits.
True.
Pseudo-random probing generates random numbers during probing to avoid collisions.
False.
Using the modulo operator (%) with even numbers is a way to spread out locations independently of array size.
False.
Hashing grows more efficient as the load factor increases, encouraging small hash arrays.
False.
Using buckets with a size that is a prime number reduces probing collisions in open hashing.
False.
Closed hashing implies perfect hashing, as there is no place for duplicate keys.
False.
A hash function is used to convert a key into a location, or index.
True.
Hashing can be used to provide fast access to data in large datasets.
True.
Quadratic probing avoids clustering in many cases, but generally, not all hash table slots can be accessed during probing.
True.
Perfect hashes ensure that there can be no collisions.
True.
Open hashing allows overflow into lists, removing the need for probing.
False.
Secondary clustering happens when different keys hash to the same spot and have the same probing sequence; this cannot be fixed by conventional probing.
True.
Closed hashing stores values that hash to the same space into buckets, then performs probing when those buckets are full.
True.
Probing functions generally ignore the key.
True.
Probing functions are not hashing functions.
True.
A hash function is better if it is cheap to compute.
True.
Hashing the key twice is called double hashing and it limits the need to perform probing.
False.
Hash functions can produce different locations for the same key.
False.
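A minimal sketch (not part of the cards) contrasting open hashing, where colliding keys overflow into lists, with closed hashing, where a probe function that ignores the key walks the array; the table size of 7 and the keys 12, 19, 26 are arbitrary choices that collide on purpose.

    # Open hashing (separate chaining): each slot holds a list of colliding keys.
    def chained_insert(table, key):
        slot = key % len(table)            # hash function: key -> index
        table[slot].append(key)            # collisions simply extend the list

    # Closed hashing (linear probing): collisions are resolved inside the array.
    def probed_insert(table, key):
        slot = key % len(table)
        while table[slot] is not None:     # probe until an empty slot is found
            slot = (slot + 1) % len(table) # the probe function ignores the key
        table[slot] = key

    chained = [[] for _ in range(7)]
    probed = [None] * 7
    for k in (12, 19, 26):                 # all three keys hash to slot 5 (mod 7)
        chained_insert(chained, k)
        probed_insert(probed, k)
    print(chained)                         # slot 5 holds [12, 19, 26]
    print(probed)                          # 12 at slot 5, 19 at 6, 26 wraps to 0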
Finding an element in a heap takes theta(n) operations.
True.
A heap in an array will in general always be full.
False.
The height of a heap in an array is known if we know how many elements are in it.
True.
A heap is a tree-based data structure.
True.
Building a heap from an array can be done in theta(n) operations.
True.
Finding an element in a heap takes theta(log n) operations.
False.
A heap can only be implemented using a linked list.
False.
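A minimal sketch of the bottom-up, theta(n) heap construction referenced above, written as a max heap over a small arbitrary array; the sift-down helper and the sample data are illustrative.

    def sift_down(a, i, n):
        # Push a[i] down until the max-heap property holds for its subtree.
        while 2 * i + 1 < n:                          # while i still has a left child
            child = 2 * i + 1
            if child + 1 < n and a[child + 1] > a[child]:
                child += 1                            # pick the larger child
            if a[i] >= a[child]:
                break
            a[i], a[child] = a[child], a[i]
            i = child

    def build_max_heap(a):
        # Start at the last non-leaf (index n//2 - 1) and work back to the root;
        # summing the sift-down costs over all levels gives theta(n) total work.
        for i in range(len(a) // 2 - 1, -1, -1):
            sift_down(a, i, len(a))

    data = [3, 9, 2, 1, 4, 5]
    build_max_heap(data)
    print(data)        # the root (index 0) now holds the maximum, 9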
The number of leaves in a non-empty full binary tree is one more than the number of internal nodes.
True.
If, in a binary tree of n nodes, you add one child to every node that has only one child and two children to every leaf node, you have added n + 1 nodes.
True.
Preorder is a traversal of a binary tree where the left child is processed before the parent node, and the right child is processed last.
False.
If, in a binary tree of n nodes, you add one child to every node that has only one child and two children to every leaf node, you have added n nodes.
False.
Binary trees must be traversed depth-first.
False.
Postorder is a traversal of a binary tree where children are processed before the parent node is processed.
True.
Preorder is a traversal of a binary tree where children are processed after the parent node is processed.
True.
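A minimal sketch of preorder and postorder traversal over a tiny hand-built binary tree; the Node class and the three-node example tree are assumptions made only for illustration.

    class Node:
        def __init__(self, value, left=None, right=None):
            self.value, self.left, self.right = value, left, right

    def preorder(node, out):
        # Parent first, then the left and right subtrees.
        if node:
            out.append(node.value)
            preorder(node.left, out)
            preorder(node.right, out)

    def postorder(node, out):
        # Both subtrees first, then the parent.
        if node:
            postorder(node.left, out)
            postorder(node.right, out)
            out.append(node.value)

    root = Node('A', Node('B'), Node('C'))
    pre, post = [], []
    preorder(root, pre)
    postorder(root, post)
    print(pre)     # ['A', 'B', 'C']: parent before children
    print(post)    # ['B', 'C', 'A']: children before parent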
Heap sort is always faster than merge sort for sorting arrays of integers.
False.
Heapsort has a space complexity of theta(n).
False.
Bottom-up heapsort has a better asymptotic complexity than top-down heapsort.
False.
Heap sort can be implemented in-place, meaning that it does not require extra memory allocation for its operation.
True.
Heapsort is faster than quicksort when finding the first 10 values of a large input.
True.
Heap sort has a worst case time complexity of theta(n log n), the same as its average case complexity.
True.
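A minimal sketch of why a heap wins when only the first few values are needed: heapify is theta(n) and each removal is theta(log n), so pulling k values costs theta(n + k log n) instead of a full theta(n log n) sort. Python's heapq (a min heap) is used here, and the sample data and k = 3 are arbitrary.

    import heapq

    def first_k_smallest(values, k):
        heap = list(values)
        heapq.heapify(heap)                 # theta(n) bottom-up heap construction
        return [heapq.heappop(heap)         # each pop re-heapifies in theta(log n)
                for _ in range(k)]

    print(first_k_smallest([9, 4, 7, 1, 8, 2, 6], 3))   # [1, 2, 4]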
Prim’s algorithm guarantees a minimum cost.
True.
MSTs are free trees with |V| - 1 edges.
True.
Kruskal’s algorithm performs quickly because processing the heap of edges is theta(|E| log |E|), but its constant-factor operational cost is lower than that of Prim’s algorithm.
False.
MSTs can have cycles if the weights on the edges are not distinct.
False.
Given a connected, undirected, weighted graph, its MST is still connected and has the minimum total cost, where cost is measured by summing the weights of the subset of edges kept from the original graph.
True.
Kruskal’s algorithm is generally more efficient than Prim’s algorithm, but cannot guarantee a minimum cost in some cases.
False.
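A minimal sketch of Prim's algorithm on a tiny adjacency-map graph; the vertices, weights, and representation are made up for illustration. The result is a free tree with |V| - 1 edges and minimum total cost.

    import heapq

    def prim_mst(graph, start):
        # graph: {vertex: [(weight, neighbor), ...]} for a connected, undirected graph.
        visited = {start}
        frontier = [(w, start, v) for w, v in graph[start]]
        heapq.heapify(frontier)                  # candidate edges ordered by weight
        mst = []
        while frontier and len(visited) < len(graph):
            w, u, v = heapq.heappop(frontier)
            if v in visited:
                continue                         # skip edges that would close a cycle
            visited.add(v)
            mst.append((u, v, w))                # cheapest edge crossing the cut is safe
            for w2, nxt in graph[v]:
                if nxt not in visited:
                    heapq.heappush(frontier, (w2, v, nxt))
        return mst                               # |V| - 1 edges for a connected graph

    graph = {
        'A': [(1, 'B'), (4, 'C')],
        'B': [(1, 'A'), (2, 'C'), (5, 'D')],
        'C': [(4, 'A'), (2, 'B'), (3, 'D')],
        'D': [(5, 'B'), (3, 'C')],
    }
    print(prim_mst(graph, 'A'))   # three edges, total cost 1 + 2 + 3 = 6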
(Quicksort) When available, tail recursion eliminates stack overflow caused by pathological pivot selection.
True.
A pathological case for Quicksort, when always selecting the rightmost value as the pivot, is sorting already sorted data.
True.
(Quicksort) Diverting to heap sort after a certain depth can prevent theta(n^2) complexity in the worst case.
True.
(Quicksort) When available, tail recursion can prevent theta(n^2) complexity in the worst case.
False.
With proper pivot selection, quicksort performs in theta(n) on its best-case input.
False.
Quicksort always performs more efficiently than merge sort.
False.
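A minimal sketch of the stack-depth fix mentioned above: recurse only on the smaller partition and loop on the larger one, so the call stack stays theta(log n) even with a pathological pivot. The rightmost-element pivot and Lomuto partition are simplifying assumptions; total work can still be theta(n^2) on already sorted input.

    def partition(a, lo, hi):
        pivot = a[hi]                        # rightmost element as pivot
        i = lo
        for j in range(lo, hi):
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        return i

    def quicksort(a, lo=0, hi=None):
        if hi is None:
            hi = len(a) - 1
        while lo < hi:
            p = partition(a, lo, hi)
            # Recurse on the smaller side, iterate on the larger side:
            # this bounds the stack depth, not the total running time.
            if p - lo < hi - p:
                quicksort(a, lo, p - 1)
                lo = p + 1
            else:
                quicksort(a, p + 1, hi)
                hi = p - 1

    data = [5, 3, 8, 1, 9, 2]
    quicksort(data)
    print(data)    # [1, 2, 3, 5, 8, 9]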
(Dictionary) Search keys are comparable.
True.
Asymptotic complexity of Dictionary operations depends on the underlying storage mechanism.
True.
Dictionaries cannot be used with multi-dimensional arrays, like points (e.g. x,y coordinates).
False.
Removing a record with an unsorted array-based list as the underlying storage mechanism is of complexity theta(n).
True.
All possible keys are considered when comparing two records for placement.
False.
Finding a record with a sorted array-based list as the underlying storage mechanism is of complexity theta(n).
False.
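A minimal sketch of a dictionary backed by an unsorted Python list, showing why find and remove are theta(n) with this storage choice; the (key, value) record layout and swap-with-last removal are illustrative assumptions.

    class UnsortedListDict:
        def __init__(self):
            self.records = []                    # (key, value) pairs in arrival order

        def insert(self, key, value):
            self.records.append((key, value))    # theta(1): append at the end

        def find(self, key):
            for k, v in self.records:            # theta(n): sequential search
                if k == key:
                    return v
            return None

        def remove(self, key):
            for i, (k, _) in enumerate(self.records):   # theta(n): locate the key first
                if k == key:
                    self.records[i] = self.records[-1]  # overwrite with the last record
                    self.records.pop()                  # then shrink the list by one
                    return True
            return False

    d = UnsortedListDict()
    d.insert('b', 2)
    d.insert('a', 1)
    print(d.find('a'), d.remove('b'), d.find('b'))   # 1 True None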
A max heap, where the key is the priority, is the optimal form of heap for a priority queue.
True.
Adding/removing nodes and retrieving the highest priority element all have the same complexity for a heap.
True.
theta(log n) is a realistic complexity to find the next item, for a well written priority queue.
True.
theta(n log n) is a realistic complexity to find the next item, for a well written priority queue.
False.
theta(n) is a realistic complexity to find the next item, for a well written priority queue.
False.
A min heap, where the key is the priority, is the optimal form of heap for a priority queue.
False.
A heap is the standard BST used in priority queues.
False.
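A minimal sketch of a max-heap priority queue built on Python's heapq; heapq is a min heap, so priorities are negated to get max-first behaviour. Insertion and removal of the next item are both theta(log n), matching the realistic-complexity cards above.

    import heapq

    class MaxPriorityQueue:
        def __init__(self):
            self._heap = []                                  # heapq is a min heap

        def insert(self, priority, item):
            heapq.heappush(self._heap, (-priority, item))    # theta(log n); negate for max-first

        def remove_max(self):
            return heapq.heappop(self._heap)[1]              # theta(log n) sift-down after removal

    pq = MaxPriorityQueue()
    for priority, task in [(2, 'low'), (9, 'urgent'), (5, 'medium')]:
        pq.insert(priority, task)
    print(pq.remove_max())    # 'urgent'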
(Heap sort) Heapifying, or putting the input into its initial heap state, is in the class theta(n) in all cases.
True.
Starting from the end of the array where a heap is stored, it takes theta(n) operations to find the first non-leaf node.
False.
Finding an element in a heap takes theta(n) operations.
True.
A heap in an array will in general always be complete.
True.
A heap is a linear data structure.
False.
A heap is useful for finding the median element in constant time.
False.
In the average case, there are 2 and 2/3 comparisons - when sorting 3 numbers.
True.
Space efficiency in the average case is theta(n^2) - when sorting 3 numbers.
False.
In the best case, there are only 2 comparisons - when sorting 3 numbers.
True.
In the average case, there are only 2 comparisons - when sorting 3 numbers.
False.
In all cases the algorithm is stable - when sorting 3 numbers.
False.
In the worst case, there are only 3 comparisons - when sorting 3 numbers.
True.
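A minimal sketch that sorts three values with at most 3 comparisons and counts them; averaging the count over all six input orderings reproduces the 2 2/3 figure from the cards above. The comparison order chosen here is one of several equivalent schemes.

    from itertools import permutations

    def sort3(a, b, c):
        comparisons = 1
        if a > b:                     # comparison 1: order a and b
            a, b = b, a
        comparisons += 1
        if c >= b:                    # comparison 2: is c the largest? (best case: done in 2)
            return (a, b, c), comparisons
        comparisons += 1
        if c >= a:                    # comparison 3: place c between a and b, or first
            return (a, c, b), comparisons
        return (c, a, b), comparisons

    counts = [sort3(*p)[1] for p in permutations((1, 2, 3))]
    print(sum(counts) / len(counts))  # 2.666... = 2 2/3 comparisons on average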
(Heap) Given the index of a node i, the right child can be found using 2*(i-1).
False.
(Heap) Given the index of a node i, the parent of a non-root node can be found using ((i+1) >> 1) - 1.
True.
(Heap) Given the index of a node i, the left child can be found using 2*(i + 1) - 1.
True.
(Heap) Given the index of a node i, the right child can be found using 2*(i + 1).
True.
(Heap) Given the index of a node i, the parent of a non-root node can be found using (i-1)/2.
False.
(Heap) Given the index of a node i, the left child can be found using 2*(i+1).
False.
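A quick check, as a sketch, of the 0-based array formulas marked true above: left child 2*(i + 1) - 1, right child 2*(i + 1), and parent ((i + 1) >> 1) - 1.

    def left(i):
        return 2 * (i + 1) - 1      # equivalently 2*i + 1

    def right(i):
        return 2 * (i + 1)          # equivalently 2*i + 2

    def parent(i):
        return ((i + 1) >> 1) - 1   # shift right by one, then subtract one

    for i in range(1, 8):
        # The parent formula should invert both child formulas for every node.
        assert parent(left(i)) == i and parent(right(i)) == i
    print(left(0), right(0), parent(1), parent(2))   # 1 2 0 0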