Coding Interview Key Patterns Flashcards
Linked Lists
List manipulation
- Iterate through the list
- Recurse through the list
Slow / fast pointers
- Use two pointers moving at different speeds (e.g., the fast pointer advances two nodes for each step of the slow pointer)
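A minimal sketch of the slow/fast-pointer idea, here used to find the middle of a singly linked list (the `ListNode` class and function name are illustrative, not from the source):

```python
class ListNode:
    def __init__(self, val=0, nxt=None):
        self.val = val
        self.next = nxt

def middle_node(head):
    """Fast moves two steps per slow step, so slow lands on the middle."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
    return slow
```

The same skeleton, with a different stopping condition, detects cycles (Floyd's algorithm).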
Two Pointers
Head and tail pointers move toward the middle.
Slow and fast pointers.
Sliding Window.
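As a sketch of the head/tail pattern, here is the classic pair-sum on a sorted array (function name is illustrative):

```python
def pair_sum_sorted(nums, target):
    """Head and tail pointers move toward the middle of a sorted array."""
    left, right = 0, len(nums) - 1
    while left < right:
        s = nums[left] + nums[right]
        if s == target:
            return left, right
        if s < target:
            left += 1   # need a larger sum
        else:
            right -= 1  # need a smaller sum
    return None
```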
Binary Search
Collection must be sorted.
Can find in O(log n) time.
Avoid integer overflow:
int mid = left + ((right - left) / 2);
Can be implemented recursively or iteratively
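An iterative sketch of the pattern above (in Python integers don't overflow, but the `left + (right - left) // 2` form is kept as a habit for fixed-width languages):

```python
def binary_search(nums, target):
    """Search a sorted list in O(log n); return the index or -1."""
    left, right = 0, len(nums) - 1
    while left <= right:
        mid = left + (right - left) // 2  # overflow-safe midpoint
        if nums[mid] == target:
            return mid
        if nums[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1
```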
Binary Search Tree
A special characteristic of a BST is that its in-order traversal yields a sorted array. Note that the definition of a BST may differ between problems. Moreover, a binary search tree can be recovered from its preorder traversal alone.
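A quick sketch of the sorted-in-order property (the `TreeNode` class is illustrative):

```python
class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def inorder(root):
    """In-order traversal of a BST yields its keys in sorted order."""
    if root is None:
        return []
    return inorder(root.left) + [root.val] + inorder(root.right)
```

Checking `inorder(root) == sorted(inorder(root))` is one simple way to validate a BST.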
Stack and Queue
The main characteristics of stack and queue are LIFO (Last In First Out) and FIFO (First In First Out) respectively.
Stacks and queues are frequently used in tree and graph traversal problems: a stack drives DFS and a queue drives BFS.
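A minimal sketch of both traversals on an adjacency-list graph, with an explicit stack for DFS and a queue for BFS (function names are illustrative):

```python
from collections import deque

def dfs_iterative(graph, start):
    """DFS with an explicit stack (LIFO)."""
    stack, seen, order = [start], {start}, []
    while stack:
        node = stack.pop()
        order.append(node)
        # reversed() so neighbors are visited in listed order
        for nxt in reversed(graph[node]):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return order

def bfs(graph, start):
    """BFS with a queue (FIFO)."""
    queue, seen, order = deque([start]), {start}, []
    while queue:
        node = queue.popleft()
        order.append(node)
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return order
```

Swapping the stack for a queue is the only structural change between the two traversals.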
Hash table and set
The hash table/set is one of the most frequently tested data structures in coding interviews. When stuck, try a hash table. Hash tables and hash sets are also very useful for time-space trade-offs: they enable average O(1) lookups.
Tree
Most tree problems are traversal problems. Traversal algorithms are extremely important: DFS, which includes 'pre-order', 'in-order', and 'post-order' variants, and BFS, also called level-order traversal.
Generally speaking, the time complexity of DFS is O(n), where n is the number of nodes, and the space complexity is O(h), where h is the height of the tree, due to the implicit stack used in recursion.
BFS problems generally run in O(n) time and O(w) space, where w is the maximum width of the tree. It is also very important to know concepts such as 'depth', 'height', 'level', 'complete tree', and 'perfect tree'.
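A sketch tying these concepts together: a recursive height (O(h) implicit stack) and a level-order BFS whose queue never exceeds the maximum width w (the `TreeNode` class and function names are illustrative):

```python
from collections import deque

class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def height(root):
    """Height = number of edges on the longest root-to-leaf path."""
    if root is None:
        return -1
    return 1 + max(height(root.left), height(root.right))

def level_order(root):
    """BFS by levels; the queue holds at most one level at a time, O(w) space."""
    if root is None:
        return []
    levels, queue = [], deque([root])
    while queue:
        level = []
        for _ in range(len(queue)):  # drain exactly one level
            node = queue.popleft()
            level.append(node.val)
            if node.left:
                queue.append(node.left)
            if node.right:
                queue.append(node.right)
        levels.append(level)
    return levels
```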
Graph
Graph problems are a generalization of tree problems.
DFS and BFS traversals are extremely important algorithms in graph problems.
One key difference between graph and tree problems is that graph traversal requires a visited set to remember nodes that have already been seen, since graphs may contain cycles.
Moreover, use DFS to solve connectivity problems and BFS to solve shortest-path problems (in unweighted graphs).
The complexity of graph traversals is generally O(V+E) in time and O(V) in space, where V is the number of vertices/nodes and E is the number of edges.
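A minimal sketch of BFS shortest path on an unweighted adjacency-list graph, showing the visited set in action (function name is illustrative):

```python
from collections import deque

def shortest_path_len(graph, start, goal):
    """Fewest edges from start to goal in an unweighted graph; O(V+E) time."""
    if start == goal:
        return 0
    queue, visited = deque([(start, 0)]), {start}
    while queue:
        node, dist = queue.popleft()
        for nxt in graph[node]:
            if nxt == goal:
                return dist + 1
            if nxt not in visited:  # without this, cycles loop forever
                visited.add(nxt)
                queue.append((nxt, dist + 1))
    return -1  # goal unreachable
```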
Divide and Conquer
The divide-and-conquer approach decomposes a big problem into smaller subproblems, solves them recursively, and then combines their results.
When stuck, think about the base case (the smallest problem) and see whether a solution can be built up recursively from there.
Merge sort and quick sort both leverage the divide-and-conquer approach.
- Divide. Split the problem into a set of subproblems.
- Conquer. Solve each subproblem recursively.
- Combine. Combine the results of each subproblem.
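The three steps above map directly onto merge sort, sketched here minimally:

```python
def merge_sort(nums):
    """Divide: split in half. Conquer: sort each half. Combine: merge."""
    if len(nums) <= 1:          # base case: the smallest problem
        return nums
    mid = len(nums) // 2
    left = merge_sort(nums[:mid])    # conquer left half
    right = merge_sort(nums[mid:])   # conquer right half
    # combine: merge two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```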
Greedy Algorithm
A greedy algorithm is a simple heuristic that makes the locally optimal choice at each step, which may, but does not necessarily, lead to a globally optimal solution. It builds up a solution piece by piece, always choosing the next piece that offers the most obvious and immediate benefit. Sometimes we solve a problem by instinct without even realizing it is a greedy-algorithm problem.
The difference between greedy algorithms and dynamic programming (DP) is subtle. A detailed comparison is highlighted in this article. In general, a greedy algorithm is simpler and myopic, making heuristic decisions based on locally optimal choices. DP is generally harder and makes the decision at each step by considering the current problem together with the solutions to previously solved subproblems.
In practice, a problem might be solvable both ways; as pointed out in this article, "beneath every greedy algorithm, there is almost always a more cumbersome dynamic programming solution".
Backtracking
Backtracking is an exhaustive search algorithm for solving combination, permutation, subset, and constraint-satisfaction problems. It solves a problem recursively by building a solution incrementally and discarding candidates that fail to satisfy the constraints of the problem. The time complexity of backtracking is generally large, e.g., at least C(n, k) for combination problems and at least P(n, k) for permutation problems. This video gives a great tutorial on how to implement backtracking for combination and permutation problems.
There are three key pillars in backtracking algorithms: (1) the base case, i.e., the exit condition or goal to achieve; (2) the choices available at each step to extend the partial solution; and (3) pruning, which discards candidates that fail to satisfy the constraints of the problem.
def backtrack(candidate):
    if find_solution(candidate):
        output(candidate)
        return
    # iterate over all possible candidates
    for next_candidate in list_of_candidates:
        if is_valid(next_candidate):
            # try this partial candidate solution
            place(next_candidate)
            # given the candidate, explore further
            backtrack(next_candidate)
            # backtrack
            remove(next_candidate)
Dynamic Programming
Dynamic programming (DP) is an algorithm for solving problems by breaking them down into simpler, smaller subproblems and exploiting the fact that the optimal solution to the overall problem depends on the optimal solutions to those subproblems. DP is similar to greedy and divide-and-conquer algorithms in breaking the problem into smaller subproblems. However, these subproblems are not solved independently: the results of smaller subproblems are remembered and reused for similar, overlapping subproblems.
DP has two forms: top-down (recursion + memoization) and bottom-up (iterative). The main goal of DP is to reduce the time complexity of a naive recursive solution from exponential to something lower, such as linear, whether for counting problems or for finding a global optimum. The key to solving a DP problem is to find the recurrence, i.e., an iterative or recursive formula.
Asking DP problems in coding interviews is somewhat controversial. An interviewee is highly likely either to solve the problem in several minutes or to get stuck for the whole coding session, and many DP problems are quite involved and complicated. However, DP is one of the most important algorithms in practice and has proven very useful for optimal control/decision problems.
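Both DP forms can be sketched on the Fibonacci recurrence, the smallest example with overlapping subproblems (function names are illustrative):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_top_down(n):
    """Top-down: plain recursion, with memoization caching each subproblem."""
    if n < 2:
        return n
    return fib_top_down(n - 1) + fib_top_down(n - 2)

def fib_bottom_up(n):
    """Bottom-up: iterate from the smallest subproblems, keeping O(1) state."""
    if n < 2:
        return n
    prev, cur = 0, 1
    for _ in range(n - 1):
        prev, cur = cur, prev + cur
    return cur
```

Without the cache, the top-down version is exponential; memoization brings both forms to O(n).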
Miscellaneous - String
Some of the most famous string problems include anagram, palindrome, substring, and rotation. If you are not familiar with the definition of anagram or palindrome, it might be a good idea to understand these concepts before the interview.
Miscellaneous - Array
Indices play an important role in array problems; in some problems, the values in the array are used as indices as well. Quickselect is a great algorithm for finding the k-th smallest element in an array in O(n) average time, outperforming a heap/priority queue and sorting, which take O(n log(k)) and O(n log(n)) time respectively.
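A short quickselect sketch using a random pivot (this list-partition version is easy to read but uses O(n) extra space; the in-place variant avoids that):

```python
import random

def quickselect(nums, k):
    """Return the k-th smallest element (1-indexed), average O(n) time."""
    pivot = random.choice(nums)
    less = [x for x in nums if x < pivot]
    equal = [x for x in nums if x == pivot]
    greater = [x for x in nums if x > pivot]
    if k <= len(less):
        return quickselect(less, k)              # answer is below the pivot
    if k <= len(less) + len(equal):
        return pivot                             # pivot is the answer
    return quickselect(greater, k - len(less) - len(equal))
```

Unlike quicksort, only one side is recursed into, which is what drops the average cost to linear.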
Miscellaneous - Math
Typical math problems include finding prime numbers, gcd (greatest common divisor), lcm (least common multiple), converting base, and check if a number is a power of 2/3/4, etc.
There are a couple of tips related to gcd and lcm.
Find gcd (Euclid's algorithm), time complexity O(log(min(a, b))):
def gcd(a, b):
    return a if b == 0 else gcd(b, a % b)
Find lcm:
def lcm(a, b):
    return a * b // gcd(a, b)