Algorithms Flashcards

1
Q

What is an algorithm?

A

A process or set of steps to accomplish a certain task.

2
Q

Why do I need to know algorithms?

A

It is the foundation for being a successful problem solver and developer.

3
Q

What are the 5 steps to take to best solve a problem?

A
  1. Understand the problem
  2. Explore concrete examples
  3. Break it down
  4. Solve/Simplify
  5. Look back and refactor
4
Q

When solving a problem, what are the 5 steps/questions that can help to better understand the problem?

A
  1. Can I restate the problem in my own words?
  2. What are the inputs that go into the problem?
  3. What are the outputs that should come from the solution of the problem?
  4. Can the outputs be determined from the inputs of the problem? In other words, do I have enough information to solve the problem?
  5. How should I label the important pieces of data that are part of the problem?
5
Q

When solving a problem, what are the 4 steps to take when exploring examples?

A
  1. Start with simple examples
  2. Progress to more complex examples
  3. Explore examples with empty inputs
  4. Explore examples with invalid inputs
6
Q

When solving a problem, how do you break down the problem? And why would you break it down?

A

By explicitly writing out the steps you need to take, in pseudocode or comments. It forces you to think about your approach and the code you are going to write before you write it, and helps you catch any lingering conceptual issues or misunderstandings before you dive in and have to worry about details (e.g. language syntax) as well.

7
Q

When solving a problem and getting stuck, what are the 4 steps to take to simplify the problem?

A
  1. Find the core difficulty in what you are trying to do
  2. Temporarily ignore that difficulty
  3. Write a simplified solution
  4. Then incorporate that difficulty back in
8
Q

When solving a problem, what are the 5 ‘look back and refactor’ questions to ask yourself?

A
  1. Can I check the result?
  2. Can I understand it at a glance?
  3. Can I improve the performance of the solution?
  4. Can I derive the result differently?
  5. Can I think of other ways to refactor?
9
Q

In JavaScript, how do you use a regular expression to check if a character is an alphanumeric character?

A

/[a-zA-Z0-9]/.test(character)

Returns true if it finds a match, otherwise it returns false. Note: regular expressions do not have the best performance; checking the ASCII character code (using string.charCodeAt(index)) can be about 50% faster.
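The faster character-code alternative mentioned above could be sketched like this (the function name is illustrative; it assumes a single-character string as input):

```javascript
// Check alphanumeric status via ASCII character codes instead of a regex.
function isAlphaNumeric(char) {
  const code = char.charCodeAt(0);
  return (code > 47 && code < 58) ||  // numeric (0-9)
         (code > 64 && code < 91) ||  // uppercase alpha (A-Z)
         (code > 96 && code < 123);   // lowercase alpha (a-z)
}
```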

10
Q

Name 7 of the most commonly used problem solving ideas or patterns.

A
  1. Frequency Counter
  2. Multiple Pointers
  3. Sliding Window
  4. Divide and Conquer
  5. Dynamic Programming
  6. Greedy Algorithms
  7. Backtracking
11
Q

What is the idea behind Frequency Counters, and how do they improve performance?

A

The Frequency Counters pattern uses an object or set to collect values/frequencies of values.

It often avoids the need for nested loops or O(n^2) operations with arrays/strings.
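A classic illustration of the pattern (a sketch; the anagram check itself is a hypothetical example, not part of the card):

```javascript
// Check whether two strings are anagrams using a frequency counter:
// one pass to count characters, one pass to consume them, so O(n)
// instead of the O(n^2) of comparing with nested loops.
function validAnagram(first, second) {
  if (first.length !== second.length) return false;
  const lookup = {};
  for (const char of first) {
    lookup[char] = (lookup[char] || 0) + 1; // count each character
  }
  for (const char of second) {
    if (!lookup[char]) return false;        // character missing or used up
    lookup[char] -= 1;
  }
  return true;
}
```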

12
Q

What is the idea behind the Multiple Pointers pattern, and how does it improve performance?

A

The Multiple Pointers pattern creates values (pointers) that correspond to an index or position, and move towards the beginning, end or middle based on a certain condition.

This pattern is very efficient for solving problems with minimal space complexity.

13
Q

What is the idea behind the Sliding Window pattern, and what is it useful for?

A

The Sliding Window pattern involves creating a window, which can be either an array or a number, from one position to another. Depending on a certain condition, the window either increases or closes (and a new window is created).

The Sliding Window pattern is useful for keeping track of a subset of data in an array/string etc.
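A common illustration is the maximum sum of n consecutive elements (a sketch; the function name and problem are illustrative):

```javascript
// Instead of re-summing every window (O(n * windowSize)), slide the
// window in O(n): subtract the element leaving and add the one entering.
function maxSubarraySum(arr, n) {
  if (arr.length < n) return null;
  let windowSum = 0;
  for (let i = 0; i < n; i++) windowSum += arr[i]; // sum of first window
  let maxSum = windowSum;
  for (let i = n; i < arr.length; i++) {
    windowSum += arr[i] - arr[i - n]; // slide the window one step right
    maxSum = Math.max(maxSum, windowSum);
  }
  return maxSum;
}
```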

14
Q

What is the idea behind the Divide and Conquer pattern, and what is its biggest advantage?

A

The Divide and Conquer pattern involves dividing a data set into smaller chunks and then repeating a process with a subset of data.

The Divide and Conquer pattern can tremendously decrease time complexity.

15
Q

What is a recursive function, and how does it affect the call stack?

A

A recursive function is a function that keeps calling itself until it reaches a base case. A recursive function keeps pushing new functions onto the call stack.
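A minimal sketch of the idea:

```javascript
// Each call to factorial is pushed onto the call stack until the
// base case is reached; the stack then unwinds as each call returns.
function factorial(n) {
  if (n <= 1) return 1;        // base case: stops the recursion
  return n * factorial(n - 1); // recursive call with different input
}
```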

16
Q

Explain the use of the call stack with regard to functions.

A
  • Any time a function is invoked it is placed (pushed) on the top of the call stack.
  • When the engine encounters the return keyword, or when the function ends, the function is removed (popped) from the top of the stack.
17
Q

What are three essential parts of a recursive function?

A
  1. Base case: the condition when the recursion ends. This is the most important concept to understand.
  2. Different input: each time the recursive function is called it must be called with different data.
  3. Return: the recursive function should return a value.
18
Q

What is Helper Method Recursion, and when is it commonly used?

A

A design pattern that uses two functions:

  • an outer function that is called to invoke the process, and
  • an inner recursive function (the helper function), which is defined and called by the outer function.

It is commonly used when needing to compile an array or a list of data.
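A typical sketch of the pattern (the odd-values example is illustrative):

```javascript
// The outer function owns the result array; the inner helper does
// the recursion and mutates that shared result.
function collectOddValues(arr) {
  const result = [];          // lives in the outer function's scope
  function helper(input) {    // the recursive helper function
    if (input.length === 0) return;           // base case
    if (input[0] % 2 !== 0) result.push(input[0]);
    helper(input.slice(1));   // recurse on the rest of the array
  }
  helper(arr);
  return result;
}
```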

19
Q

What is Pure Recursion?

A

A design pattern where the recursive function is entirely self-contained, i.e. without the use of a nested helper function.
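The same odd-values example written as pure recursion (a sketch; compare it with the helper-method version):

```javascript
// No helper and no shared state: each call builds and returns its own
// partial result, which is concatenated with the result of the
// recursive call.
function collectOddValues(arr) {
  const newArr = [];
  if (arr.length === 0) return newArr;  // base case
  if (arr[0] % 2 !== 0) newArr.push(arr[0]);
  return newArr.concat(collectOddValues(arr.slice(1)));
}
```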

20
Q

What is Linear Search?

A

The simplest way to search for a value; look at every element in an array and check if it’s the wanted value.
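A minimal sketch:

```javascript
// Check every element in order; O(n) in the worst case.
function linearSearch(arr, value) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === value) return i; // found: return the index
  }
  return -1;                        // not found
}
```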

21
Q

What search method does JavaScript use on arrays for indexOf, includes, find, findIndex? What is the time complexity?

A

For these built-in array search methods, JavaScript uses Linear Search. The time complexity is O(n).

22
Q

When can one use Binary Search? What problem solving pattern does it use? What is its time complexity, and what is the advantage vs Linear Search?

A

This search method only works on sorted arrays.

It uses the Divide and Conquer problem solving pattern; rather than eliminating one element at a time, it eliminates half of the remaining elements at a time.

It has a time complexity of O(log n), making it a much faster form of search than, for example, Linear Search with its time complexity of O(n).
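A minimal sketch of the technique:

```javascript
// Binary search on a sorted array: each step halves the remaining
// search range, giving O(log n).
function binarySearch(sortedArr, value) {
  let left = 0;
  let right = sortedArr.length - 1;
  while (left <= right) {
    const middle = Math.floor((left + right) / 2);
    if (sortedArr[middle] === value) return middle;
    if (sortedArr[middle] < value) left = middle + 1; // search right half
    else right = middle - 1;                          // search left half
  }
  return -1; // not found
}
```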

23
Q

What are the 3 more elementary (simpler) and less efficient sorting algorithms? Why are they also known as quadratic algorithms?

A

These sorting algorithms are:

  • Bubble Sort
  • Selection Sort
  • Insertion Sort

Their Big O time complexity is O(n²), i.e. quadratic in the input size, hence the name.

24
Q

How does Bubble Sort work? What is its time complexity?

A

Sorting algorithm where on each iteration the larger value of an adjacent pair is swapped to the right, and as a result the largest value is “pushed” to the end of the array.

The array is sorted if no swap was made on an iteration.

The time complexity is: O(n^2)
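A sketch of the algorithm, including the "no swap made" early exit described above:

```javascript
// Each outer pass "bubbles" the largest remaining value to the end.
// If a full pass makes no swaps, the array is already sorted.
function bubbleSort(arr) {
  for (let i = arr.length - 1; i > 0; i--) {
    let noSwaps = true;
    for (let j = 0; j < i; j++) {
      if (arr[j] > arr[j + 1]) {
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]]; // swap adjacent pair
        noSwaps = false;
      }
    }
    if (noSwaps) break; // no swap on this pass: already sorted
  }
  return arr;
}
```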

25
Q

How does Selection Sort work? What is its time complexity?

A

Sorting algorithm where on each iteration the index of the smallest value is “selected”, and at the end of the iteration swapped to the beginning of the array.

Unlike Bubble Sort, it performs at most one swap per pass, but it cannot stop early: every pass must scan the entire unsorted portion.

The time complexity is: O(n^2)
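A sketch of the algorithm:

```javascript
// Each pass "selects" the index of the smallest remaining value,
// then swaps it to the front of the unsorted portion.
function selectionSort(arr) {
  for (let i = 0; i < arr.length; i++) {
    let lowest = i;
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[j] < arr[lowest]) lowest = j; // track smallest index
    }
    if (lowest !== i) {
      [arr[i], arr[lowest]] = [arr[lowest], arr[i]]; // one swap per pass
    }
  }
  return arr;
}
```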

26
Q

How does Insertion Sort work? What is its time complexity?

A

Sorting algorithm that builds up the sort by gradually creating a larger sorted portion. It takes elements one by one, and inserts them in the right spot in the sorted left portion of the array.

The time complexity is: O(n^2)
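A sketch of the algorithm:

```javascript
// Grow a sorted left portion: take each element and shift larger
// sorted elements right until its correct position is found.
function insertionSort(arr) {
  for (let i = 1; i < arr.length; i++) {
    const current = arr[i];
    let j = i - 1;
    while (j >= 0 && arr[j] > current) {
      arr[j + 1] = arr[j]; // shift larger element to the right
      j--;
    }
    arr[j + 1] = current;  // insert into the sorted portion
  }
  return arr;
}
```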

27
Q

What are the 3 more intermediate and faster sorting algorithms?

A

These sorting algorithms are:

  • Merge Sort
  • Quick Sort
  • Radix Sort

Their Big O time complexity is O(n log n).

28
Q

What is the idea behind the Merge Sort algorithm? What is its Big O time complexity and why? What is the space complexity?

A

This sorting algorithm uses a combination of three things:

  • splitting up (into arrays of 0 or 1 elements)
  • sorting
  • merging (smaller arrays into new sorted array)

The Big O time complexity is O(n log n), space complexity O(n).

Time complexity is O(n log n) because of:

  • O(log n) decompositions when splitting
  • comparing O(n) items when merging
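The three steps above can be sketched as follows (this version allocates new arrays at each level, matching the O(n) space noted above):

```javascript
// Merge two already-sorted arrays into one sorted array (O(n)).
function merge(left, right) {
  const result = [];
  let i = 0;
  let j = 0;
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) result.push(left[i++]);
    else result.push(right[j++]);
  }
  // Append whatever remains of either half.
  return result.concat(left.slice(i)).concat(right.slice(j));
}

// Split into halves (O(log n) levels), sort each recursively, merge.
function mergeSort(arr) {
  if (arr.length <= 1) return arr; // base case: 0 or 1 elements
  const mid = Math.floor(arr.length / 2);
  return merge(mergeSort(arr.slice(0, mid)), mergeSort(arr.slice(mid)));
}
```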
29
Q

What is Dynamic Programming?

A

A method of solving a complex problem where the problem is split into simpler problems, and the solutions of the simpler problems are used to find the solution of the original complex problem.

30
Q

What is the best Big O time complexity for a data-agnostic (comparison-based) sorting algorithm?

A

O(n log n)

31
Q

How does Quicksort work? What is its Big O time complexity and space complexity?

A

This divide-and-conquer sorting algorithm works by selecting a ‘pivot’ element from the array and partitioning the other elements into two sub-arrays, according to whether they are less than or greater than the pivot. The sub-arrays are then sorted recursively.

The worst-case Big O time complexity is O(n^2) (O(n log n) on average), and the worst-case space complexity is O(n) due to the recursion depth.
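A simple sketch of the idea (this version builds new arrays for clarity; classic quicksort partitions in place):

```javascript
// Pick the first element as pivot, partition the rest into smaller
// and larger-or-equal values, and recursively sort each partition.
function quickSort(arr) {
  if (arr.length <= 1) return arr; // base case
  const [pivot, ...rest] = arr;
  const smaller = rest.filter(x => x < pivot);  // left partition
  const larger = rest.filter(x => x >= pivot);  // right partition
  return [...quickSort(smaller), pivot, ...quickSort(larger)];
}
```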

32
Q

What is Dynamic Programming? And what properties should a problem have in order to use it?

A

A method for solving a complex problem by breaking it down into a collection of simpler subproblems, solving each of the subproblems just once and storing their solutions.

It only works on problems that have:

  • overlapping subproblems; can be broken down into subproblems which are reused several times, and
  • an optimal substructure; an optimal solution can be constructed from optimal solutions of its subproblems.
33
Q

What is Memoization?

A

Storing the results of expensive function calls and returning the cached results when the same inputs occur again.
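The standard illustration is a memoized Fibonacci function (a sketch; the default-parameter cache is one of several possible implementations):

```javascript
// Caching results turns the naive O(2^n) recursion into O(n):
// each fib(n) is computed only once.
function fib(n, memo = {}) {
  if (n <= 2) return 1;                      // base case
  if (memo[n] !== undefined) return memo[n]; // return cached result
  memo[n] = fib(n - 1, memo) + fib(n - 2, memo);
  return memo[n];
}
```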

34
Q

In Dynamic Programming, what is Tabulation?

A

It is storing the results of subproblems in a “table” (usually an array), usually filled in using iteration. Better space complexity can be achieved with it than with memoized recursion, since it avoids the recursive call stack.
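The same Fibonacci computation written bottom-up with tabulation (a sketch; the function name is illustrative):

```javascript
// Build the table from the bottom up with a loop: no recursion,
// so no call-stack growth.
function fibTable(n) {
  if (n <= 2) return 1;
  const table = [0, 1, 1]; // table[i] holds fib(i)
  for (let i = 3; i <= n; i++) {
    table[i] = table[i - 1] + table[i - 2];
  }
  return table[n];
}
```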