Algorithms Flashcards
A* computational complexity
In the worst case A* is exponential, O(b^d), where b is the branching factor and d is the depth of the shallowest solution. The better the heuristic, the fewer nodes are expanded; with h = 0 it reduces to Dijkstra's algorithm, whose complexity with a binary-heap priority queue is O((V + E) log V).
Explain A* algorithm
The working principle is the same as Dijkstra's, with the difference that each node is also assigned a heuristic cost. This can be, for example, the straight-line distance to the goal location. When we evaluate the cost of a given node k, we compute it as the normal cost in Dijkstra's algorithm plus the heuristic cost h(k). In other words, the cost is f(k) = g(k) + h(k), where g(k) is the cost to go from the start to k and h(k) is the heuristic cost of k.
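A minimal sketch of the idea, assuming a graph given as an adjacency dict and a user-supplied heuristic function (all names here are invented for the example):

```python
import heapq

def a_star(graph, h, start, goal):
    """A* on a dict graph {node: [(neighbor, cost), ...]}.
    h(node) is the heuristic estimate of the cost to the goal.
    Illustrative sketch, not a production implementation."""
    g = {start: 0}              # g(k): best known cost from start to k
    pq = [(h(start), start)]    # priority queue ordered by f = g + h
    visited = set()
    while pq:
        f, node = heapq.heappop(pq)
        if node == goal:
            return g[node]
        if node in visited:
            continue
        visited.add(node)
        for nbr, cost in graph.get(node, []):
            new_g = g[node] + cost
            if new_g < g.get(nbr, float("inf")):
                g[nbr] = new_g
                heapq.heappush(pq, (new_g + h(nbr), nbr))
    return None                 # goal unreachable
```

Note that returning as soon as the goal is popped is only guaranteed optimal when the heuristic is consistent (never overestimates along an edge).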
What is the amortized cost?
Given a sequence of n operations, the amortized cost is defined as Cost(n operations)/n. It provides a sort of average to understand the cost of operations. Think about push_back() in a dynamic array. It can be O(1) if the array has not reached its capacity, or O(n) if it is full, since we need to allocate a new array and copy all the values over. It is possible to prove that the amortized cost of push_back() for a dynamic array is O(1): the n operations cost O(n) in total, and O(n)/n is clearly O(1). The worst-case cost of a single push_back() is still O(n), but on average it is O(1).
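The doubling argument can be checked numerically. This sketch (the function name is made up) counts the element copies caused by n push_backs into an array whose capacity doubles when full:

```python
def simulate_push_backs(n):
    """Count element copies performed by n push_backs into a
    dynamic array that doubles its capacity when full."""
    capacity, size, copies = 1, 0, 0
    for _ in range(n):
        if size == capacity:
            copies += size      # reallocation: copy every existing element
            capacity *= 2
        size += 1
    return copies
```

For n = 1024 the total is 1 + 2 + 4 + ... + 512 = 1023 copies, i.e. fewer than 2n, so the per-operation (amortized) cost stays constant.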
What is the computational complexity of insertion sort?
It has average and worst-case performance O(N^2), but the best case is O(N), which occurs when the input is already (or nearly) sorted.
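A straightforward in-place version; on already-sorted input the inner loop never runs, which is where the O(N) best case comes from:

```python
def insertion_sort(a):
    """In-place insertion sort: O(N^2) average/worst case,
    O(N) best case on (nearly) sorted input."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]     # shift larger elements one slot right
            j -= 1
        a[j + 1] = key          # drop the key into its place
    return a
```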
What is the computational complexity of bubble sort?
It is pretty slow, since it has a runtime complexity of O(N^2). The main advantage is that it sorts in place, so we don't need any extra memory.
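A minimal in-place sketch; the early-exit flag is a common optional optimization that stops once a full pass makes no swaps:

```python
def bubble_sort(a):
    """In-place bubble sort: O(N^2) time, O(1) extra memory."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):      # last i elements already in place
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:
            break                       # no swaps: array is sorted
    return a
```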
How does quick sort work?
Quick sort is a very efficient sorting algorithm. It works as follows:
- Select an element of the array as the pivot.
- Conceptually split the array into two parts, one to the left of the pivot and one to the right.
- Compare the leftmost element of the left part with the rightmost element of the right part and, if the first is larger than the pivot and the second is smaller than the pivot, swap them.
- Advance the left pointer forward and the right pointer backward, compare the elements they point to, and swap them if necessary.
- Repeat until the two pointers cross.
- Run quick sort recursively on each of the two subarrays defined by the pivot.
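The steps above correspond to a Hoare-style two-pointer partition. A sketch, using the middle element as pivot (one common choice among several):

```python
def quick_sort(a, lo=0, hi=None):
    """In-place quick sort with a Hoare-style two-pointer partition."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    pivot = a[(lo + hi) // 2]
    i, j = lo, hi
    while i <= j:
        while a[i] < pivot:         # advance the left pointer
            i += 1
        while a[j] > pivot:         # retreat the right pointer
            j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i += 1
            j -= 1
    quick_sort(a, lo, j)            # recurse on the two subarrays
    quick_sort(a, i, hi)
    return a
```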
What are the four main complexity classes of problems?
- P: set of problems solvable in polynomial time;
- NP: set of problems solvable in polynomial time with a “lucky” algorithm (making guesses and solving the problem with the first guess). It is a nondeterministic model: an algorithm makes guesses that are guaranteed to lead to a “yes” answer, if possible. An equivalent definition is the set of problems with solutions that can be checked in polynomial time.
- EXP: set of problems solvable in exponential time;
- R: set of problems solvable in finite time;
What is the computational complexity of merge sort?
It is very efficient in terms of computational complexity, which is O(N log N). However, this comes at the cost of higher memory consumption, since merging the subarrays requires allocating new memory: it uses O(N) auxiliary space.
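A short top-down sketch; the merged list allocated at each level is where the O(N) auxiliary memory comes from:

```python
def merge_sort(a):
    """Merge sort: O(N log N) time, O(N) auxiliary memory for merging."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0                 # extra buffer for the merge
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]    # append the leftovers
```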
What is the complexity of a recursive function?
The time complexity of a recursive function depends on the number of times the function is called and the time complexity of a single call. The total time complexity is the product of these values.
Example
The call f(n) causes n function calls, and the time complexity of each call is O(1). Thus, the total time complexity is O(n).
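A concrete function matching the example (the name f is taken from the card):

```python
def f(n):
    """f(n) makes n recursive calls, each doing O(1) work,
    so the total time complexity is n * O(1) = O(n)."""
    if n == 0:
        return 0
    return 1 + f(n - 1)     # one O(1) step per call
```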
Explain Dijkstra’s algorithm
Assume you want to compute the minimum-cost path from a start node S to a goal node G.
1. Initialization: assign a cost of zero to the start node S and an infinite cost to all other nodes.
2. Start from S: for each of its children, replace the child's cost with the transition cost from S to it. Add the children to a priority queue ordered by node cost, and record that the previous node of each of them is S.
3. Mark the current node as visited and pick the node with the lowest cost from the top of the priority queue. Call it N.
4. For each child of N, compute its new potential cost as the cost of N plus the cost to transition from N to the child. If this sum is lower than the cost currently associated with the child, replace its cost with the new one and record that its previous node is N.
5. Repeat steps 3 and 4 until there are no other nodes to visit or node G is at the top of the priority queue (if it is on top, any other path would have a higher cost).
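The steps above can be sketched as follows, assuming the graph is a dict of adjacency lists (a common but here arbitrary representation):

```python
import heapq

def dijkstra(graph, start, goal):
    """Dijkstra on a dict graph {node: [(neighbor, cost), ...]}.
    Returns the minimum cost from start to goal, or None if unreachable."""
    dist = {start: 0}
    prev = {}                       # predecessor map to rebuild the path
    pq = [(0, start)]
    visited = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:            # goal on top: no cheaper path exists
            return d
        if node in visited:
            continue
        visited.add(node)
        for nbr, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node    # record the previous node
                heapq.heappush(pq, (nd, nbr))
    return None
```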
What is the time complexity of two nested loops going one from 1 to N and the other from 1 to M?
O(M*N)
How does selection sort work?
Selection sort works in a similar way to insertion sort. We go through the unsorted array, look for the smallest element, and move it into the next slot of the new ordered array. Then we look for the smallest element of the remaining unordered subarray (the unsorted array after removing the previous minimum), move it to the sorted array, and so on.
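A common in-place variant, where the "sorted array" is simply the growing prefix of the input:

```python
def selection_sort(a):
    """In-place selection sort: repeatedly swap the minimum of the
    unsorted suffix into the next slot of the sorted prefix."""
    for i in range(len(a)):
        min_idx = i
        for j in range(i + 1, len(a)):
            if a[j] < a[min_idx]:
                min_idx = j         # remember the smallest element seen
        a[i], a[min_idx] = a[min_idx], a[i]
    return a
```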
What’s the main drawback of Dijkstra’s algorithm?
It can drive the search very far away from the goal node, since it only considers the cost accumulated so far and uses no information about where the goal lies.
What is the computational complexity of quick sort?
Depending on the choice of the pivot, the runtime complexity can be
- Best case: O(N log N)
- Average case: O(N log N)
- Worst case: O(N^2), e.g., when the pivot is always the smallest or largest element of the subarray
What are the main complexity classes?
- Constant O(1)
- Logarithmic O(log N)
- Square root O(sqrt(N))
- Linear O(N)
- Linearithmic O(N log N)
- Quadratic O(N^2)
- Exponential O(2^N)
- Factorial O(N!)