Quiz 3 Flashcards
What are the two main components of a graph data structure?
Vertices (V) and edges (E)
How is a sparse graph defined compared to a dense graph?
A sparse graph has E = O(V) edges, while a dense graph has E = Θ(V^2) edges (close to the maximum possible).
What is the runtime complexity of Kruskal’s algorithm to find the minimum spanning tree?
O(E log E), which is equivalent to O(E log V) since log E = Θ(log V).
What is Prim’s algorithm and how does it differ from Kruskal’s algorithm?
Prim’s grows a single MST outward from a root vertex, repeatedly adding the minimum-weight edge that connects the tree to a new vertex. Kruskal’s builds an MST by considering edges in increasing weight order and adding each edge that does not create a cycle.
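A minimal Python sketch of Prim’s algorithm using a binary heap; the adjacency-list format (a dict mapping each vertex to a list of (neighbor, weight) pairs) and the function name are assumptions for illustration.

```python
import heapq

def prim_mst(adj, root):
    """Grow an MST from `root` over an undirected graph given as
    adj: {vertex: [(neighbor, weight), ...]}. Returns (total_weight, edges)."""
    visited = {root}
    # Heap of candidate edges leaving the current tree: (weight, u, v)
    heap = [(w, root, v) for v, w in adj[root]]
    heapq.heapify(heap)
    total, mst_edges = 0, []
    while heap and len(visited) < len(adj):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue  # edge leads back into the tree; skip it
        visited.add(v)
        total += w
        mst_edges.append((u, v, w))
        for x, wx in adj[v]:
            if x not in visited:
                heapq.heappush(heap, (wx, v, x))
    return total, mst_edges
```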
What is the greedy method and what are its pros and cons?
It optimizes locally without regard for long-term consequences. Pros: simple, efficient. Cons: often suboptimal.
When can greedy algorithms lead to optimal solutions?
For problems exhibiting the greedy-choice property together with optimal substructure, such as minimum spanning trees and single-source shortest paths with nonnegative edge weights.
What is the optimal coin change problem an example of?
An optimization problem that the greedy method solves optimally for canonical coin systems (such as standard US denominations), though greedy can fail for arbitrary denominations.
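A minimal sketch of greedy coin change; the function name and default denominations are illustrative assumptions. Greedy is optimal for canonical coin systems such as US coins but not for arbitrary ones.

```python
def greedy_change(amount, denominations=(25, 10, 5, 1)):
    """Repeatedly take the largest coin that still fits."""
    coins = []
    for d in sorted(denominations, reverse=True):
        count, amount = divmod(amount, d)
        coins.extend([d] * count)
    return coins if amount == 0 else None  # None if exact change is impossible

# greedy_change(63) -> [25, 25, 10, 1, 1, 1]
```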
What is the goal of scheduling problems like minimizing time in system?
To schedule jobs/tasks to optimize an objective like average completion time.
What is the 0/1 knapsack problem?
Packing the most valuable subset of items without exceeding a weight capacity, where each item must be taken whole or left out entirely.
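A minimal dynamic-programming sketch for the 0/1 knapsack, shown because greedy selection is not guaranteed optimal for the 0/1 version; the function name and item format are assumptions.

```python
def knapsack_01(items, capacity):
    """items: list of (value, weight) pairs; returns the best achievable value.
    Classic O(n * capacity) table, keeping only one row at a time."""
    best = [0] * (capacity + 1)
    for value, weight in items:
        # Iterate weights downward so each item is used at most once.
        for w in range(capacity, weight - 1, -1):
            best[w] = max(best[w], best[w - weight] + value)
    return best[capacity]

# knapsack_01([(60, 10), (100, 20), (120, 30)], 50) -> 220
```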
How do you select objects for the knapsack problem using the greedy method?
In decreasing order of value per unit weight; this is optimal for the fractional knapsack but only a heuristic for the 0/1 version.
What is the purpose of backtracking and branch and bound algorithms?
To search a solution space represented as a tree.
How do backtracking and branch and bound differ in traversing a state space tree?
Backtracking traverses the tree depth-first, while branch and bound typically uses breadth-first or best-first (least-cost) search guided by bounding functions.
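A minimal backtracking sketch (depth-first search over a state space tree), using the N-queens puzzle as the example problem; the function name is illustrative.

```python
def n_queens(n):
    """Count placements of n non-attacking queens via depth-first backtracking."""
    count = 0
    cols, diag1, diag2 = set(), set(), set()

    def place(row):
        nonlocal count
        if row == n:
            count += 1
            return
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue  # square is attacked: prune this branch
            cols.add(col); diag1.add(row - col); diag2.add(row + col)
            place(row + 1)
            # Backtrack: undo the choice before trying the next column.
            cols.remove(col); diag1.remove(row - col); diag2.remove(row + col)

    place(0)
    return count

# n_queens(8) -> 92
```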
What is the optimal substructure property needed for dynamic programming?
Optimal solutions are built from optimal solutions to subproblems.
What is the divide and conquer algorithm strategy?
Break a problem into subproblems, solve subproblems, and combine solutions.
When is divide and conquer most useful?
For problems that can be recursively broken down into smaller instances, such as sorting and matrix multiplication.
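A minimal merge sort sketch as a divide-and-conquer example: split the input, recursively sort each half, and combine by merging.

```python
def merge_sort(a):
    """Divide: split in half. Conquer: sort each half. Combine: merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

# merge_sort([5, 2, 4, 6, 1, 3]) -> [1, 2, 3, 4, 5, 6]
```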
What is the principle of optimality for dynamic programming?
Optimal solutions to a problem contain optimal solutions to subproblems.
What is the specialized principle of optimality?
Whatever the initial decision, the remaining decisions of an optimal sequence must themselves form an optimal sequence for the subproblem that results from that first decision.
What is the problem with greedy method on the sequential storage problem?
Ordering by length alone fails when files have different access probabilities; the ordering must account for both length and probability.
How do you optimally order files for sequential storage?
In increasing order of the length (cost) to access-probability ratio.
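A minimal sketch of ordering files for sequential storage by the length-to-probability ratio and computing the resulting expected access cost; the function name and input format are assumptions.

```python
def order_files(files):
    """files: list of (length, access_probability) pairs.
    Store in increasing length/probability ratio to minimize expected access cost."""
    ordered = sorted(files, key=lambda f: f[0] / f[1])
    expected_cost, prefix = 0.0, 0.0
    for length, prob in ordered:
        prefix += length                 # reading this file reads everything before it too
        expected_cost += prob * prefix
    return ordered, expected_cost

# order_files([(50, 0.1), (10, 0.6), (30, 0.3)])
# -> files sorted by length/probability ratio, plus the expected sequential-access cost
```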
What is the goal of scheduling problems like maximizing profit?
To schedule jobs to maximize objectives like profit within constraints.
How do you select jobs using the greedy method for scheduling with deadlines?
Consider jobs in decreasing order of profit and add each job if it can still be completed by its deadline given the jobs already scheduled.
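A minimal sketch of greedy job sequencing with deadlines: take jobs by decreasing profit and slot each into the latest free unit-time slot at or before its deadline; the names and input format are assumptions.

```python
def schedule_jobs(jobs):
    """jobs: list of (profit, deadline) pairs, deadlines in whole time units.
    Returns (total_profit, slots) where slots[t] is the job run in unit t+1."""
    max_deadline = max(d for _, d in jobs)
    slots = [None] * max_deadline
    total = 0
    for profit, deadline in sorted(jobs, reverse=True):   # most profitable first
        # Find the latest free slot no later than the deadline.
        for t in range(min(deadline, max_deadline) - 1, -1, -1):
            if slots[t] is None:
                slots[t] = (profit, deadline)
                total += profit
                break                                      # placed; otherwise the job is dropped
    return total, slots

# schedule_jobs([(100, 2), (19, 1), (27, 2), (25, 1), (15, 3)]) -> total profit 142
```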
What is the fractional knapsack problem?
A knapsack problem where fractions of items can be taken.
How do you prove the greedy method optimally solves knapsack problems?
By an exchange argument: swapping any part of an optimal solution for the greedy choice (the item with the highest value-to-weight ratio) never decreases the total value, so the greedy solution to the fractional knapsack is optimal.
What are the different selection methods for the knapsack problem?
Maximum value, minimum weight, maximum value/weight ratio.
What is the runtime complexity to implement knapsack using greedy?
O(n log n) for sorting + O(n) for greedy selection.
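A minimal fractional-knapsack sketch matching that bound: an O(n log n) sort by ratio followed by an O(n) greedy scan; the function name and item format are assumptions.

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs. Sort by value/weight ratio,
    then greedily take whole items and a fraction of the last one that fits."""
    total = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)        # whole item if it fits, else a fraction
        total += value * (take / weight)
        capacity -= take
    return total

# fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50) -> 240.0
```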
What are the main operations for Kruskal’s algorithm?
Sort the edges by weight, then for each edge find the components of its endpoints and merge them if they differ (typically with a union-find structure).
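A minimal Kruskal sketch with a union-find (disjoint-set) structure providing the find and merge operations; the edge-list format and vertex numbering are assumptions.

```python
def kruskal_mst(num_vertices, edges):
    """edges: list of (weight, u, v) with vertices numbered 0..num_vertices-1.
    Sort edges, then add each edge whose endpoints lie in different components."""
    parent = list(range(num_vertices))

    def find(x):                      # find the component representative (path halving)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, mst_edges = 0, []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                  # different components: no cycle, so merge
            parent[ru] = rv
            total += w
            mst_edges.append((u, v, w))
    return total, mst_edges
```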
What is the A* search algorithm related to?
Branch and bound: A* is a least-cost (best-first) search that orders nodes by f = g + h, the cost so far plus a heuristic estimate of the remaining cost.
What is alpha-beta pruning in game trees?
Pruning subtrees that cannot improve the minimax value.
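A minimal minimax-with-alpha-beta sketch on a game tree represented as nested lists whose leaves are payoff values; the representation and function name are assumptions for illustration.

```python
def alphabeta(node, maximizing, alpha=float("-inf"), beta=float("inf")):
    """node: either a numeric leaf value or a list of child nodes.
    Prunes children that cannot change the minimax value."""
    if not isinstance(node, list):
        return node                     # leaf: return its payoff
    if maximizing:
        best = float("-inf")
        for child in node:
            best = max(best, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, best)
            if beta <= alpha:
                break                   # beta cutoff: the min player avoids this branch
        return best
    best = float("inf")
    for child in node:
        best = min(best, alphabeta(child, True, alpha, beta))
        beta = min(beta, best)
        if beta <= alpha:
            break                       # alpha cutoff
    return best

# alphabeta([[3, 5], [2, 9]], maximizing=True) -> 3
```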
What are heuristics and how are they used in backtracking and branch and bound?
Estimates of cost to guide tree search, like with A* algorithm.
What is the complexity class of problems like TSP that dynamic programming can solve?
NP-hard
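A minimal Held-Karp sketch: dynamic programming over vertex subsets solves TSP exactly in O(n^2 * 2^n) time, far better than factorial brute force but still exponential since TSP is NP-hard; the function name and matrix input are assumptions.

```python
from itertools import combinations

def held_karp(dist):
    """dist: n x n matrix of pairwise distances; the tour starts and ends at city 0.
    dp[(S, j)] = cheapest path from 0 through subset S of cities, ending at city j."""
    n = len(dist)
    dp = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for j in subset:
                dp[(S, j)] = min(dp[(S - {j}, k)] + dist[k][j]
                                 for k in subset if k != j)
    full = frozenset(range(1, n))
    return min(dp[(full, j)] + dist[j][0] for j in range(1, n))
```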
What are the differences between top down and bottom up dynamic programming?
Top down uses recursion with memoization; bottom up iteratively fills a table from the smallest subproblems upward.
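A minimal sketch contrasting the two styles on Fibonacci numbers: top-down recursion with memoization versus an iterative bottom-up computation.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_top_down(n):
    """Top down: recurse on subproblems, caching (memoizing) each result."""
    return n if n < 2 else fib_top_down(n - 1) + fib_top_down(n - 2)

def fib_bottom_up(n):
    """Bottom up: iterate from the smallest subproblems, building values in order."""
    prev, curr = 0, 1
    for _ in range(n):
        prev, curr = curr, prev + curr
    return prev

# fib_top_down(10) == fib_bottom_up(10) == 55
```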
What problems exhibit optimal substructure for dynamic programming?
Problems like shortest paths and edit distance (string editing).
What is memoization and how does it help dynamic programming?
Caching results of solved subproblems to avoid re-computation.
What are the differences between greedy and dynamic programming?
Greedy optimizes locally, DP globally by combining subsolutions.
What heuristic does Dijkstra’s algorithm use to find shortest paths?
It greedily selects the unvisited vertex with the smallest tentative distance from the source (equivalently, A* with a zero heuristic).
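A minimal Dijkstra sketch using a binary heap; the adjacency-list format (a dict of vertex to (neighbor, weight) pairs, nonnegative weights) is an assumption for illustration.

```python
import heapq

def dijkstra(adj, source):
    """adj: {vertex: [(neighbor, weight), ...]} with nonnegative weights.
    Repeatedly settle the unvisited vertex with the smallest tentative distance."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                    # stale heap entry; already settled closer
        for v, w in adj[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist
```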
What are some examples of problems where a greedy approach finds optimal solutions?
Minimum spanning trees, shortest paths, Huffman coding.
How does parallelism help overcome performance issues with dynamic programming?
Subproblems can be solved concurrently at each stage.
What is the runtime complexity of Floyd-Warshall algorithm for all pairs shortest paths?
O(V^3)
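A minimal Floyd-Warshall sketch: three nested loops over the vertices give the O(V^3) bound; the adjacency-matrix input with float('inf') for missing edges is an assumption.

```python
def floyd_warshall(matrix):
    """matrix: n x n distances, float('inf') where no edge, 0 on the diagonal.
    Returns all-pairs shortest-path distances in O(V^3) time."""
    n = len(matrix)
    dist = [row[:] for row in matrix]   # copy so the input is not modified
    for k in range(n):                  # allow vertex k as an intermediate stop
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```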