Algorithms Flashcards

1
Q

A* computational complexity

A

No idea. (In the worst case A* degenerates to an exhaustive search, O(b^d) for branching factor b and solution depth d; a good, consistent heuristic greatly reduces the number of nodes expanded.)

2
Q

Explain A* algorithm

A

The working principle is the same as Dijkstra's, with the difference that each node is also assigned a heuristic cost. This can be, for example, the straight-line distance to the goal location. When we evaluate the cost to reach a given node k from node i, we compute it as the normal cost in Dijkstra's algorithm plus the heuristic cost h(k). In other words, the cost is f(k) = g(k) + h(k), where g(k) is the cost to go from the start to k, and h(k) is the heuristic cost of k.
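A minimal Python sketch of this idea (the graph layout and heuristic values used for testing are made-up examples; an admissible heuristic must never overestimate the remaining cost):

```python
import heapq

def a_star(graph, h, start, goal):
    """A* search: like Dijkstra, but the priority of a node k is
    f(k) = g(k) + h(k), where g is the cost from start and h the heuristic.
    graph: dict mapping node -> list of (neighbor, edge_cost)."""
    g = {start: 0}
    prev = {}
    pq = [(h(start), start)]  # priority = f = g + h
    while pq:
        f, node = heapq.heappop(pq)
        if node == goal:
            # walk back through prev to recover the path
            path = [goal]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return path[::-1], g[goal]
        for nbr, w in graph[node]:
            new_g = g[node] + w
            if new_g < g.get(nbr, float('inf')):
                g[nbr] = new_g
                prev[nbr] = node
                heapq.heappush(pq, (new_g + h(nbr), nbr))
    return None, float('inf')
```

Nodes are expanded in order of f = g + h, so the search is biased toward the goal instead of growing uniformly as in plain Dijkstra.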

3
Q

What is the amortized cost?

A

Given a sequence of n operations, the amortized cost is defined as Cost(n operations)/n. It provides a sort of average to understand the cost of operations. Think about push_back() in a dynamic array. It is O(1) if the array has not yet reached its capacity, or O(n) if it is full, since we need to allocate a new array and copy all the values over. It is possible to prove that the total cost of n push_back() calls on a dynamic array is O(n), so the amortized cost is O(n)/n, which is clearly O(1). The worst-case cost of a single call is still O(n), but on average it is O(1).
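A toy sketch that counts the copy work done by a doubling dynamic array (it only counts work, it does not store elements):

```python
class DynArray:
    """Toy dynamic array that doubles its capacity when full, counting
    element copies to illustrate the amortized cost of push_back()."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.copies = 0  # total elements copied during reallocations

    def push_back(self, x):
        if self.size == self.capacity:
            self.capacity *= 2        # allocate a bigger array...
            self.copies += self.size  # ...and copy every element over: O(n)
        self.size += 1                # the append itself: O(1)
```

After n pushes the total copy work is 1 + 2 + 4 + ... < 2n, so the average (amortized) cost per push_back() is O(1) even though individual calls cost O(n).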

4
Q

What is the computational complexity of insertion sort?

A

It has average and worst-case performance O(N^2), but the best case is O(N), when the array is already (almost) sorted.

5
Q

What is the computational complexity of bubble sort?

A

It is pretty slow, since it has a runtime complexity of O(N^2). The main advantage is that it sorts in place, so we don't need any extra memory.

6
Q

How does quick sort work?

A

Quick sort is a very efficient sorting algorithm. It works as follows:

  • Select an element of the array as the pivot.
  • Partition the array around the pivot with two pointers: scan forward from the left end until you find an element larger than the pivot, and backward from the right end until you find one smaller than the pivot, then swap the two.
  • Keep advancing the left pointer forward and the right pointer backward, swapping out-of-place pairs as you go.
  • Stop when the two pointers cross.
  • Run quick sort recursively on each of the two partitions defined by the pivot.
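The steps above can be sketched as follows (a minimal in-place version that takes the middle element as pivot):

```python
def quick_sort(arr, lo=0, hi=None):
    """In-place quick sort with Hoare-style two-pointer partitioning."""
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return arr
    pivot = arr[(lo + hi) // 2]     # pick a pivot
    i, j = lo, hi
    while i <= j:
        while arr[i] < pivot:       # scan forward from the left...
            i += 1
        while arr[j] > pivot:       # ...and backward from the right
            j -= 1
        if i <= j:                  # both elements are on the wrong side:
            arr[i], arr[j] = arr[j], arr[i]  # swap them
            i += 1
            j -= 1
    quick_sort(arr, lo, j)          # recurse on the two partitions
    quick_sort(arr, i, hi)
    return arr
```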
7
Q

What are the four computational complexities?

A
  • P: set of problems solvable in polynomial time;
  • NP: set of problems solvable in polynomial time with a “lucky” algorithm (making guesses and solving the problem with the first guess). It is a nondeterministic model: an algorithm makes guesses that are guaranteed to lead to a “yes” answer, if possible. An equivalent definition is the set of problems with solutions that can be checked in polynomial time.
  • EXP: set of problems solvable in exponential time;
  • R: set of problems solvable in finite time;
8
Q

What is the computational complexity of merge sort?

A

It is very efficient in terms of computational complexity, which is O(N log N). However, it comes at the cost of higher memory consumption, since to merge the arrays it is necessary to allocate new memory. It has O(N) memory occupation.

9
Q

What is the complexity of a recursive function?

A

The time complexity of a recursive function depends on the number of times the function is called and the time complexity of a single call. The total time complexity is the product of these values.

Example
The call f(n) causes n function calls, and the time complexity of each call is O(1). Thus, the total time complexity is O(n).
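Such a function could look like this (a made-up example matching the description):

```python
def f(n):
    """f(n) makes n recursive calls, each doing O(1) work,
    so the total time complexity is O(n)."""
    if n == 0:
        return 0
    return 1 + f(n - 1)  # one O(1) call per level of recursion
```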

10
Q

Explain Dijkstra’s algorithm

A

Assume you want to compute the minimum-cost path from a start node S to a goal node G.
  1. Initialization: give the start node S a cost of zero, and associate an infinite cost to all other nodes.
  2. Start from S: for each of its children, replace its cost with the transition cost from S, add it to a priority queue ordered by the node's cost, and record that its previous node is S.
  3. Mark the current node as visited and pop the node with the lowest cost from the priority queue. Let's call it N.
  4. For each child of N, compute its new potential cost as the cost of N plus the cost to transition from N to the child. If this sum is lower than the cost currently associated to the child, replace its cost with the new one and record that its previous node is N.
  5. Repeat steps 3 and 4 until there are no other nodes to visit or the node G is at the top of the priority queue (if it is on top, it means that any other path would have a higher cost).
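A compact Python sketch of these steps (it assumes the goal is reachable; the example graph in the usage is made up):

```python
import heapq

def dijkstra(graph, start, goal):
    """Dijkstra's algorithm.
    graph: dict mapping node -> list of (neighbor, edge_cost)."""
    cost = {start: 0}      # start costs 0, everything else is implicitly infinite
    prev = {}
    visited = set()
    pq = [(0, start)]      # priority queue ordered by cost
    while pq:
        c, node = heapq.heappop(pq)
        if node == goal:   # goal on top: no cheaper path can exist
            break
        if node in visited:
            continue
        visited.add(node)
        for nbr, w in graph[node]:
            new_cost = c + w
            if new_cost < cost.get(nbr, float('inf')):
                cost[nbr] = new_cost
                prev[nbr] = node
                heapq.heappush(pq, (new_cost, nbr))
    # walk back through prev to recover the path
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], cost[goal]
```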

11
Q

What is the time complexity of two nested loops going one from 1 to N and the other from 1 to M?

A

O(M*N)

12
Q

How does selection sort work?

A

Selection sort, like insertion sort, builds the sorted result one element at a time. We go through the unsorted array, look for the smallest element, and move it into the next slot of the new ordered array. Then we look for the smallest element of the remaining unsorted subarray (the unsorted array after removing the previous minimum), move it to the sorted array, and so on.
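A sketch of the usual in-place variant, which swaps the minimum into position instead of building a separate output array (same idea, O(1) extra space):

```python
def selection_sort(arr):
    """In-place selection sort: repeatedly move the minimum of the
    unsorted suffix to the front of that suffix."""
    n = len(arr)
    for i in range(n):
        min_idx = i
        for j in range(i + 1, n):   # find the smallest remaining element
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]  # move it into place
    return arr
```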

13
Q

What’s the main drawback of Dijkstra’s algorithm?

A

That it can drive the search very far away from the goal node: the cost g(k) only encodes the distance from the start, so the search expands uniformly in every direction and uses no information about where the goal is (which is exactly what A*'s heuristic adds).

14
Q

What is the computational complexity of quick sort?

A

Depending on the choice of the pivot, the runtime complexity can be

  • Best- and average-case: O(N log N)
  • Worst-case: O(N^2) (e.g., an already-sorted array with a poor pivot choice)
15
Q

What are the main complexity classes?

A
  • Constant O(1)
  • Logarithmic O(log N)
  • Square root O(sqrt(N))
  • Linear O(N)
  • Linearithmic O(N log N)
  • Quadratic O(N^2)
  • Exponential O(2^N)
  • Factorial O(N!)
16
Q

Depth-first search: what it is and how you would implement it

A

Depth-first search explores a graph by going as deep as possible along each branch before backtracking.

Implementation: a stack of nodes to visit (or the call stack, if recursive), plus a hash table/set of visited nodes to avoid revisiting them.
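One possible iterative implementation (the adjacency-list format is an assumption):

```python
def dfs(graph, start):
    """Iterative depth-first search: explicit stack + visited set.
    Returns the nodes in visit order."""
    visited = set()
    order = []
    stack = [start]
    while stack:
        node = stack.pop()           # LIFO: go deep before going wide
        if node in visited:
            continue
        visited.add(node)
        order.append(node)
        # push neighbors; reversed() keeps a left-to-right visit order
        for nbr in reversed(graph[node]):
            if nbr not in visited:
                stack.append(nbr)
    return order
```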

17
Q

What is the computational complexity of selection sort?

A

It always has O(N^2) time complexity. Done in place it needs only O(1) extra space; building a separate sorted array costs O(N) space.

18
Q

What’s Dijkstra’s complexity?

A

O(|E| + |V| log |V|) with a Fibonacci heap; O((|V| + |E|) log |V|) with the more common binary heap.

19
Q

How does binary search work?

A

Binary search is an algorithm to search for an element in a sorted container. The working principle is pretty simple:

  • Compare the item you are looking for with the middle element of the container;
  • If you found it, you are done;
  • Otherwise, keep half of the current range according to the following rule: if the element you are looking for is larger than the middle element, take the right half; if it is smaller, take the left half;
  • Repeat until you find the element or the range becomes empty, in which case the element is not present in the container.
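The steps above in a minimal iterative sketch:

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:       # found it: done
            return mid
        if target > arr[mid]:        # target is larger: keep the right half
            lo = mid + 1
        else:                        # target is smaller: keep the left half
            hi = mid - 1
    return -1                        # range is empty: not present
```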
20
Q

How does bubble sort work?

A

Bubble sort is a very naive sorting algorithm. The idea is to walk through the array and, whenever we find two consecutive items out of order, swap them. Repeated passes move the largest remaining item to the end; a full pass with no swaps means the array is sorted.
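A minimal sketch with the usual early-exit optimization:

```python
def bubble_sort(arr):
    """In-place bubble sort: swap adjacent out-of-order pairs until sorted."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):   # the largest i items are already at the end
            if arr[j] > arr[j + 1]:  # adjacent pair out of order: swap it
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:              # no swaps: already sorted, stop early
            break
    return arr
```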

21
Q

What is the usual complexity of algorithms based on dynamic programming?

A

DP solutions typically have polynomial complexity, which gives a much faster running time than exponential techniques such as brute force or plain backtracking.
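As an illustration (a standard textbook example, not from the card): memoizing the Fibonacci recursion turns an exponential O(2^n) computation into a linear O(n) one, because each subproblem is solved only once.

```python
def fib(n, memo=None):
    """Fibonacci with memoization: each fib(k) is computed once, so O(n)
    instead of the O(2^n) of the naive recursion."""
    if memo is None:
        memo = {}
    if n <= 1:
        return n
    if n not in memo:
        memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]
```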

22
Q

What is logarithmic complexity?

A

Sometimes algorithms have logarithmic complexity. It might sound weird, but the explanation is pretty simple: every time an algorithm halves the amount of data it has to work on at each iteration (e.g., binary search, where every time we keep only one of the two halves), it most likely has logarithmic complexity O(log n).

23
Q

Is A* complete? Is A* admissible?

A

A* is complete: if a path to the goal exists, it will find one.

It is admissible (i.e., guaranteed to return the minimum-cost path) if the heuristic is admissible, meaning h never overestimates the true remaining cost to the goal. A stronger, commonly used condition is consistency: for any edge (x, y), h(x) <= d(x, y) + h(y), where d(x, y) is the length of the edge connecting them. In other words, it is impossible to decrease (total distance so far + estimated remaining distance) by extending a path to include a neighboring node.

24
Q

If my algorithm has several sequential phases, what should I consider to estimate the runtime complexity?

A

If the algorithm consists of consecutive phases, the total time complexity is the largest time complexity of a single phase. The reason for this is that the slowest phase is usually the bottleneck of the code.

25
Q

How does insertion sort work?

A

Insertion sort is a simple sorting algorithm which runs through all the elements of an array and finds, for each of them, its best position among the elements already processed. For example, we start with the second element, check whether it is smaller than the first and, if so, place it before the first. Then we take the third element and compare it with the first two, placing it where it fits best. The fourth element is then checked against the first three, and so on.
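The description above in a minimal in-place sketch:

```python
def insertion_sort(arr):
    """In-place insertion sort: insert each element into the sorted prefix."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:  # shift larger elements one slot right
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key                # drop the element where it fits
    return arr
```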

26
Q

Which data structures do you need to implement Dijkstra’s algorithm?

A
  • Priority queue for the nodes to be visited, ordered by cost
  • Map of the best known cost of each node
  • Map of previous nodes (to walk back from goal to start)
  • Set of visited nodes
27
Q

What is the asymptotic complexity?

A

The time complexity of an algorithm estimates how much time the algorithm will use for some input, as a function of the input size and ignoring constant factors. It is denoted O(···), where the three dots represent some function. Usually, the variable n denotes the input size: if the input is an array of numbers, n is the size of the array; if the input is a string, n is its length.

28
Q

What is the computational complexity of binary search?

A

In a search space of size N, binary search has O(log N) runtime complexity.

29
Q

How can I implement binary search?

A

Binary search can be implemented either recursively or iteratively.
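A sketch of the recursive variant (the iterative one replaces the recursion with a loop over lo and hi):

```python
def binary_search_rec(arr, target, lo=0, hi=None):
    """Recursive binary search on a sorted list; returns the index or -1."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:                      # empty range: not found
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if target > arr[mid]:            # recurse on the right half
        return binary_search_rec(arr, target, mid + 1, hi)
    return binary_search_rec(arr, target, lo, mid - 1)
```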

30
Q

How does merge sort work?

A

Merge sort is a very efficient sorting algorithm. It works as follows:

  • Split the array into a left and a right half;
  • Sort the left and the right halves using merge sort (so the algorithm is recursive);
  • Merge the two sorted halves in sorted order.
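The steps above in a minimal sketch (note the O(N) extra memory allocated during the merge):

```python
def merge_sort(arr):
    """Recursive merge sort; returns a new sorted list."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])     # sort each half recursively
    right = merge_sort(arr[mid:])
    # merge the two sorted halves into a new list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]  # append whatever half is left over
```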
31
Q

Breadth-first search: what it is and how you would implement it

A

Implementation: a queue of nodes to visit plus a set of visited nodes. Nodes are marked visited when they are enqueued, so each node enters the queue at most once; in a graph with cycles this visited set is what prevents repeated checks and infinite loops (only in a tree can it be omitted).
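One possible implementation (the adjacency-list format is an assumption):

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first search: FIFO queue + visited set.
    Returns the nodes in visit order (level by level)."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()       # FIFO: finish a level before the next
        order.append(node)
        for nbr in graph[node]:
            if nbr not in visited:   # mark when enqueued, not when dequeued
                visited.add(nbr)
                queue.append(nbr)
    return order
```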