Flashcards

1
Q

*Asymptotic Notation

A

Mathematical notation used to describe the limiting behavior of a function as its argument grows large; includes Big O, Big Theta (Θ), and Big Omega (Ω).
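
For reference, a sketch of the standard formal definitions behind these notations, with c and n_0 as the usual witness constants:

```latex
f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 : \; f(n) \le c \cdot g(n) \text{ for all } n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : \; f(n) \ge c \cdot g(n) \text{ for all } n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))
```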

2
Q

*Big O Notation

A

Describes an asymptotic upper bound on a function’s growth; most commonly used to bound an algorithm’s worst-case running time.

3
Q

*Big Omega Notation

A

Describes an asymptotic lower bound on a function’s growth; often used for best-case analysis.

4
Q

*Big Theta Notation

A

Describes a tight bound on the running time, i.e., both upper and lower bounds.

5
Q

*Time Complexity

A

Measure of the amount of time an algorithm takes to complete as a function of the length of the input.

6
Q

*Space Complexity

A

Amount of memory space required by an algorithm as a function of input size.

7
Q

*Greedy Algorithm

A

Algorithm that makes locally optimal choices at each step with the hope of finding a global optimum.
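
A minimal sketch in Python: greedy coin change, assuming a canonical coin system (e.g., US denominations) where the greedy choice happens to yield the optimum; the function name and denominations are illustrative.

```python
def greedy_coin_change(amount, coins=(25, 10, 5, 1)):
    """At each step take the largest coin that fits (the locally optimal choice)."""
    result = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            result.append(coin)
    return result  # optimal for canonical coin systems, not for arbitrary ones

print(greedy_coin_change(63))  # [25, 25, 10, 1, 1, 1]
```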

8
Q

*Divide and Conquer

A

Design paradigm that divides a problem into subproblems, solves them recursively, and combines their solutions.
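
A minimal divide-and-conquer sketch: merge sort splits the list, sorts each half recursively, and merges the sorted halves.

```python
def merge_sort(items):
    """Divide, recurse on each half, then combine (merge) the sorted halves."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```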

9
Q

*Dynamic Programming

A

Technique for solving complex problems by breaking them into overlapping subproblems and storing each subproblem’s result so it is computed only once.
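
A minimal sketch: Fibonacci with memoization via functools.lru_cache, so each overlapping subproblem is computed only once.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Each subproblem fib(k) is computed once and its result is cached."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(50))  # 12586269025 (the naive recursion would make exponentially many calls)
```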

10
Q

*Backtracking

A

Algorithmic technique that builds a solution incrementally and abandons (backtracks from) a partial solution as soon as it cannot be extended to a valid one.
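
A minimal sketch: counting N-queens placements by extending a partial placement one row at a time and backtracking whenever a queen would be attacked; the helper names are illustrative.

```python
def n_queens(n):
    """Count all valid placements of n non-attacking queens on an n x n board."""
    def place(row, cols, diag1, diag2):
        if row == n:
            return 1                      # a complete, valid placement
        count = 0
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue                  # attacked: abandon this extension (backtrack)
            count += place(row + 1, cols | {col},
                           diag1 | {row - col}, diag2 | {row + col})
        return count
    return place(0, set(), set(), set())

print(n_queens(8))  # 92
```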

11
Q

Heap

A

A tree-based data structure that satisfies the heap property (in a min-heap, every parent is less than or equal to its children); commonly used to implement priority queues.

12
Q

*Priority Queue

A

Abstract data type where each element has a priority; elements with higher priorities are dequeued before lower ones.
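
A minimal sketch using Python’s heapq, a binary min-heap, as a priority queue; treating a smaller number as a higher priority is just this example’s convention.

```python
import heapq

pq = []  # heapq keeps the heap property on a plain list (min-heap)
heapq.heappush(pq, (2, "write report"))
heapq.heappush(pq, (1, "fix outage"))    # smaller number = higher priority here
heapq.heappush(pq, (3, "read email"))

while pq:
    priority, task = heapq.heappop(pq)   # always pops the smallest priority first
    print(priority, task)
# 1 fix outage
# 2 write report
# 3 read email
```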

13
Q

Disjoint Set

A

Data structure that keeps track of a partition of a set into disjoint subsets. Often used in Kruskal’s algorithm.

14
Q

Union-Find

A

A disjoint-set data structure with operations to find which set an element is in and to unite two sets.
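
A minimal sketch of a disjoint-set (union-find) forest with path compression and union by rank; the class and method names are illustrative.

```python
class UnionFind:
    """Disjoint-set forest with path compression and union by rank."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])  # path compression
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False                  # already in the same set
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx              # attach the shorter tree under the taller
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True

uf = UnionFind(5)
uf.union(0, 1)
uf.union(1, 2)
print(uf.find(0) == uf.find(2))  # True: 0, 1, 2 are in one set
print(uf.find(3) == uf.find(4))  # False: 3 and 4 are still separate
```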

15
Q

*Dijkstra’s Algorithm

A

Greedy algorithm that finds the shortest paths from a source vertex to all other vertices in a weighted graph with non-negative edge weights.
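
A minimal sketch with heapq; the adjacency-list format {u: [(v, weight), ...]} and the tiny example graph are assumptions of this sketch.

```python
import heapq

def dijkstra(graph, source):
    """graph: {u: [(v, weight), ...]} with non-negative weights."""
    dist = {source: 0}
    pq = [(0, source)]                    # priority queue of (distance, vertex)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale entry, a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

g = {"a": [("b", 4), ("c", 1)], "c": [("b", 2)], "b": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 3, 'c': 1}
```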

16
Q

*Bellman-Ford Algorithm

A

Algorithm that computes shortest paths from a source vertex to all vertices in a weighted digraph; unlike Dijkstra’s, it handles negative edge weights and can detect negative cycles.
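
A minimal sketch over an edge list, including the extra pass that detects negative cycles; the vertex numbering and example edges are illustrative.

```python
def bellman_ford(edges, n, source):
    """edges: list of (u, v, w) for vertices 0..n-1; handles negative weights."""
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0
    for _ in range(n - 1):                # relax every edge n-1 times
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in edges:                 # one more pass: any improvement means a negative cycle
        if dist[u] + w < dist[v]:
            raise ValueError("negative cycle reachable from the source")
    return dist

print(bellman_ford([(0, 1, 4), (0, 2, 5), (2, 1, -3)], 3, 0))  # [0, 2, 5]
```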

17
Q

*Floyd-Warshall Algorithm

A

Dynamic programming algorithm that finds shortest paths between all pairs of vertices in a weighted graph with positive or negative edge weights (but no negative cycles).
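
A minimal sketch over an adjacency matrix, where inf marks a missing edge and the diagonal is 0; the example matrix is illustrative.

```python
def floyd_warshall(dist):
    """dist: n x n matrix of edge weights (inf if no edge, 0 on the diagonal).
    Updates it in place to all-pairs shortest-path distances."""
    n = len(dist)
    for k in range(n):                    # allow vertex k as an intermediate stop
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

INF = float("inf")
m = [[0, 3, INF],
     [INF, 0, 1],
     [2, INF, 0]]
print(floyd_warshall(m))  # [[0, 3, 4], [3, 0, 1], [2, 5, 0]]
```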

18
Q

*Kruskal’s Algorithm

A

Greedy algorithm that finds a minimum spanning tree by sorting edges by weight and adding each edge that does not create a cycle.
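
A minimal sketch with a small inline union-find (path halving) to reject cycle-forming edges; the (weight, u, v) edge format and example graph are this sketch’s conventions.

```python
def kruskal(n, edges):
    """edges: list of (weight, u, v) for vertices 0..n-1.
    Sort by weight and add every edge that joins two different trees."""
    parent = list(range(n))

    def find(x):                          # minimal union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                      # edge joins two components: no cycle
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return mst, total

edges = [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]
print(kruskal(4, edges))  # ([(0, 1, 1), (1, 2, 2), (2, 3, 4)], 7)
```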

19
Q

*Prim’s Algorithm

A

Greedy algorithm that builds the minimum spanning tree by adding the cheapest edge from the tree to a vertex not yet in the tree.
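
A minimal sketch with heapq, growing a single tree from a start vertex; the adjacency-list format {u: [(weight, v), ...]} is this sketch’s convention.

```python
import heapq

def prim(graph, start):
    """graph: {u: [(weight, v), ...]} for an undirected graph."""
    visited = {start}
    frontier = list(graph[start])         # candidate edges leaving the tree
    heapq.heapify(frontier)
    mst, total = [], 0
    while frontier and len(visited) < len(graph):
        w, v = heapq.heappop(frontier)    # cheapest edge out of the tree
        if v in visited:
            continue
        visited.add(v)
        mst.append((w, v))
        total += w
        for edge in graph[v]:
            if edge[1] not in visited:
                heapq.heappush(frontier, edge)
    return mst, total

g = {0: [(1, 1), (3, 2)], 1: [(1, 0), (2, 2)], 2: [(3, 0), (2, 1)]}
print(prim(g, 0))  # ([(1, 1), (2, 2)], 3)
```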

20
Q

*Topological Sort

A

Linear ordering of vertices in a DAG such that for every directed edge (u, v), vertex u comes before v.
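
A minimal sketch of Kahn’s algorithm (repeatedly remove a vertex with no remaining incoming edges); other valid orderings exist, the one shown is just what this input yields.

```python
from collections import deque

def topological_sort(graph):
    """graph: {u: [v, ...]} for a DAG; returns one valid topological order."""
    indegree = {u: 0 for u in graph}
    for u in graph:
        for v in graph[u]:
            indegree[v] = indegree.get(v, 0) + 1
    queue = deque(u for u, d in indegree.items() if d == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in graph.get(u, []):
            indegree[v] -= 1
            if indegree[v] == 0:          # all of v's prerequisites are placed
                queue.append(v)
    if len(order) != len(indegree):
        raise ValueError("graph has a cycle; no topological order exists")
    return order

print(topological_sort({"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}))
# ['a', 'b', 'c', 'd']
```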

21
Q

*P vs NP

A

P is the class of problems solvable in polynomial time. NP is the class for which solutions can be verified in polynomial time.

22
Q

*NP-Complete

A

Problems that are both in NP and as hard as any problem in NP. If any NP-complete problem is in P, then P = NP.

23
Q

*NP-Hard

A

At least as hard as the hardest problems in NP; not necessarily in NP (no requirement of polynomial-time verifiability).

24
Q

*Reduction

A

Transforming instances of one problem into instances of another (typically in polynomial time) so that a solution to the second yields a solution to the first; used to prove hardness.

25
Q

Cook-Levin Theorem

A

Establishes that the Boolean satisfiability problem (SAT) is NP-complete.

26
Q

*Polynomial Time

A

Class of problems that can be solved in time O(n^k) for some constant k.

27
Q

*Exponential Time

A

Problems whose running time grows exponentially with input size, on the order of 2^n or worse.

28
Q

Logarithmic Time

A

Problems solvable in O(log n) time, typically by halving the remaining input at each step, as in binary search.
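
A minimal sketch: binary search on a sorted list halves the remaining range on every comparison, so it needs O(log n) steps.

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if absent."""
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2              # each comparison halves the search range
        if sorted_list[mid] == target:
            return mid
        if sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```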

29
Q

*Greedy vs Dynamic Programming

A

Greedy commits to a locally optimal choice at each step and never revisits it; DP solves all relevant subproblems and combines their solutions, guaranteeing optimality when the problem has optimal substructure.

30
Q

*Divide & Conquer vs Dynamic Programming

A

Both divide the problem, but DP stores subproblem solutions to avoid recomputation.

31
Q

*P ⊆ NP

A

Every problem solvable in polynomial time can have its solution verified in polynomial time.

32
Q

*NP-Complete ⊆ NP-Hard

A

All NP-Complete problems are NP-Hard, but not all NP-Hard problems are NP-Complete.

33
Q

*Dijkstra vs Bellman-Ford

A

Dijkstra is faster but can’t handle negative weights; Bellman-Ford can handle negative weights but is slower.

34
Q

Why is Dijkstra’s algorithm faster than Bellman-Ford?

A

Dijkstra’s uses a greedy approach with a priority queue, achieving O((V + E) log V), while Bellman-Ford relaxes every edge up to V - 1 times and runs in O(VE).

35
Q

When would you use Bellman-Ford over Dijkstra?

A

When the graph contains negative weight edges; Dijkstra’s algorithm cannot handle those correctly.

36
Q

How does dynamic programming differ from divide and conquer?

A

Both break a problem into subproblems, but dynamic programming stores subproblem results so overlapping subproblems are computed only once, while plain divide and conquer recomputes any overlapping work.

37
Q

What makes a problem NP-Complete?

A

It must be in NP and as hard as any problem in NP, meaning any NP problem can be reduced to it in polynomial time.

38
Q

Why are greedy algorithms not always optimal?

A

They make locally optimal choices without considering the global structure, which can lead to suboptimal solutions in complex problems.

39
Q

How are Kruskal’s and Prim’s algorithms different in building MSTs?

A

Kruskal’s grows a forest and adds the smallest edge between any two trees, while Prim’s grows a single tree by adding the smallest edge from the tree.

40
Q

How does topological sort relate to DAGs?

A

Topological sort is only defined for Directed Acyclic Graphs (DAGs), providing a linear ordering respecting edge directions.

41
Q

What is the significance of P ⊆ NP?

A

Every problem that can be solved in polynomial time can also have its solution verified in polynomial time.

42
Q

Why is NP-Hard not necessarily in NP?

A

NP-Hard problems are only required to be at least as hard as every problem in NP; they may not have polynomially verifiable solutions and may not even be decision problems.

43
Q

How does the Union-Find structure optimize Kruskal’s algorithm?

A

It efficiently tracks disjoint sets to avoid cycles when adding edges, using union by rank and path compression.