CS algorithms Flashcards
Linear Search
we simply start at the beginning of our data set and iterate through each item until we either i) find what we were searching for or ii) reach the end and conclude the item being searched for is not in the set.
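A minimal sketch of this in Python (function and variable names are just illustrative assumptions):

def linear_search(items, target):
    # start at the beginning and check each item in turn
    for index, item in enumerate(items):
        if item == target:
            return index   # found what we were searching for
    return -1              # reached the end: target is not in the set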
Issues with Linear Search
It can be very time consuming: in the worst case we have to check every single element, so the runtime is O(n).
Binary Search
Takes an ordered set of data and starts searching in the middle. If the middle element is what we were looking for, great! That means we’re done. If the middle element is smaller than the target, we can eliminate the entire left-hand side of our data set and repeat this process on the right-hand side (and if it is larger, we eliminate the right-hand side and repeat on the left).
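A minimal sketch of an iterative binary search in Python, assuming the input list is already sorted in ascending order (names are illustrative):

def binary_search(sorted_items, target):
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid            # found it
        elif sorted_items[mid] < target:
            low = mid + 1         # eliminate the left-hand side
        else:
            high = mid - 1        # eliminate the right-hand side
    return -1                     # not in the data set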
Issues with Binary Search
You can’t use binary search unless the data is sorted!
Selection Sort
several playing cards you need to put in order. You start off by looking for the lowest card, and when you find it, put it at the front of your hand. Then, you look for the second lowest card and insert it directly behind the first card. You repeat this process, selecting the next lowest card and moving it behind the most recently placed card, until you have moved all cards into the right place.
drawbacks to selection sort
One of the drawbacks of this algorithm is its efficiency. Regardless of the starting order of your elements (random, mostly sorted, reverse sorted, etc.), this algorithm will always have a runtime of O(n²). This is because each pass requires us to check every element of the unsorted part of our list or array.
Is Selection Sort a comparison sorting algorithm?
Yes, since to select the desired element, we compare a current value to the rest of the “unsorted” segment of our list or array.
Why do we end our loop before processing the item in the last index of an array when performing Selection Sort?
By the nature of how this algorithm works, if all other elements have been sorted, the last element will have been moved to the correct index.
When using Selection Sort, what is the difference between the order or arrangement of elements in best case versus worst case?
None, since the same number of operations will be performed regardless of how elements are ordered.
Show how the array [4, 8, 3, 1, 9, 6] changes as it is being sorted using Selection Sort. To do this, write out the contents of the array after each pass of the algorithm. (hint…we should only need n-1 passes to sort the array, where n is the size)
[1, 8, 3, 4, 9, 6]
[1, 3, 8, 4, 9, 6]
[1, 3, 4, 8, 9, 6]
[1, 3, 4, 6, 9, 8]
[1, 3, 4, 6, 8, 9]
Insertion Sort
Imagine you lay out several playing cards in a line on a table. Consider the first card to be a sorted subarray of length = 1. Pick up each card (in this case, from left to right) and insert it into the correct position in the sorted subarray. Shift other cards to the right to make room for the card you are inserting as you go. Runtime: O(n²).
main steps of Insertion sort
- Separate the first element from the rest of the array. Consider it a sorted list of one element.
- For all other indices, beginning with [1]:
  a. Copy the current item into a temp variable
  b. Iterate to the left until you find the correct index in the “sorted” part of the array at which this element should be inserted
     - Shift items over to the right as you iterate
  c. When the right index is found, copy temp into this position
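A sketch of those steps as an in-place Python function (names are illustrative):

def insertion_sort(arr):
    for i in range(1, len(arr)):      # [0] is the sorted subarray of length 1
        temp = arr[i]                 # a. copy the current item into a temp variable
        j = i - 1
        # b. iterate left through the sorted part, shifting larger items to the right
        while j >= 0 and arr[j] > temp:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = temp             # c. copy temp into the correct position
    return arr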
Selection Sort steps
- Start with current index = 0
- For all indices EXCEPT the last:
  a. Loop through the elements on the right-hand side of the current index and find the smallest element
  b. Swap the element at the current index with the smallest element found in the above loop
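A sketch of those steps as an in-place Python function (names are illustrative):

def selection_sort(arr):
    for current in range(len(arr) - 1):   # every index EXCEPT the last
        smallest = current
        # a. find the smallest element on the right-hand side of the current index
        for j in range(current + 1, len(arr)):
            if arr[j] < arr[smallest]:
                smallest = j
        # b. swap the element at the current index with the smallest element found
        arr[current], arr[smallest] = arr[smallest], arr[current]
    return arr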
Is Insertion Sort a comparison sorting algorithm?
Yes, since to find the correct index at which we “insert” the current element we are sorting, we compare it to the rest of the “sorted” segment of our list or array.
When using Insertion Sort, what is the difference between the order or arrangement of elements in best case versus worst case?
When performing Insertion Sort, the algorithm runs more efficiently if the elements are already mostly sorted. When the original array is close to sorted order, fewer comparisons are required to sort it. But if the array is in reverse-sorted order, the number of comparisons that need to be done increases significantly. For example: [5, 4, 3, 2, 1]
When inserting the 1 into the correct part of the array, we have to compare it with every other number. While this is not such a big deal in an array with 5 elements, it gets very costly as that array becomes bigger.
Show how the array [4, 8, 3, 1, 9, 6] changes as it is being sorted using Insertion Sort.
[4, 8, 3, 1, 9, 6] (comparison done, no swap made)
[3, 4, 8, 1, 9, 6]
[1, 3, 4, 8, 9, 6]
[1, 3, 4, 8, 9, 6] (comparison done, no swap made)
[1, 3, 4, 6, 8, 9]
Merge Sort
- Recursively divide the original array or list in half until both pieces have a length of 1.
- Starting with these singular subsets, merge sorted pieces back together.
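A recursive sketch in Python that returns a new sorted list (names are illustrative):

def merge_sort(arr):
    if len(arr) <= 1:                 # a piece of length 1 is already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])      # recursively divide and sort each half
    right = merge_sort(arr[mid:])
    merged = []                       # merge the sorted pieces back together
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # one side may still have leftovers
    merged.extend(right[j:])
    return merged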
Quick Sort
In this algorithm, we move smaller elements to one side of the array and larger elements to the other side of the array by comparing them to a “pivot” element, which can be chosen in a few different ways. This divides our original array into two subarrays that are closer to being sorted. We then recursively call Quick Sort on each subarray until we are down to singular elements (which, by their singular nature, must be sorted). Runtime: best case Ω(n log(n)), worst case O(n²).
Quick Sort steps
While the data set has a length > 1:
- Select a pivot (typically the first, last, or middle element)
- For each item in the data set:
  - if item < pivot AND not on the LHS of the pivot, move it to the appropriate side of the data set
  - else if item > pivot AND not on the RHS of the pivot, move it to the appropriate side of the data set
- Recursively Quick Sort the LHS and RHS of the pivot
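A simple (not in-place) Python sketch of these steps, using the first element as the pivot; a production version would normally partition in place:

def quick_sort(arr):
    if len(arr) <= 1:                          # a single element is already sorted
        return arr
    pivot = arr[0]                             # select the pivot (first element here)
    lhs = [x for x in arr[1:] if x < pivot]    # smaller elements go to one side
    rhs = [x for x in arr[1:] if x >= pivot]   # larger (or equal) elements to the other
    return quick_sort(lhs) + [pivot] + quick_sort(rhs)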
Show how Quick Sort sorts the array [2, 7, 5, 4, 0, 1]
[2, 7, 5, 4, 0, 1] // pivot = 2
[0, 1, 2, 4, 7, 5] // after partitioning around 2
[0, 1, 2, 4, 7, 5] // LHS of 2: pivot = 0, nothing smaller to move
[0, 1, 2, 4, 7, 5] // RHS of 0 is [1], a single element, so the LHS is done
[0, 1, 2, 4, 7, 5] // RHS of 2: pivot = 4, nothing smaller to move, 4 is in place
[0, 1, 2, 4, 5, 7] // RHS of 4: pivot = 7, 5 moves to the LHS of 7
[0, 1, 2, 4, 5, 7] // LHS of 7 is [5], a single element, so the array is sorted
Time Complexity: Constant Time: 1
No matter how many elements we’re working with, the algorithm/operation/whatever will always take the same amount of time
Time Complexity: Logarithmic Time: log(n)
You have this if doubling the number of elements you are iterating over doesn’t double the amount of work. Always assume that searching operations are log(n)
Time Complexity: Linear Time: n
Iterating through all elements in a collection of data. If you see a for loop spanning from 0 to array.length you probably have ‘n’ or linear runtime
Time Complexity: Quasilinear Time: n*log(n)
You have this if doubling the number of elements you are iterating over slightly more than doubles the amount of work. Always assume that any sorting operation is n*log(n)
Time Complexity: Quadratic Time: n^2
every element in a collection has to be compared to every other element. ‘The handshake problem’
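For example, a nested loop that compares every pair of elements runs in O(n²) time (the function name here is just illustrative):

def count_duplicate_pairs(items):
    # every element is compared to every other element: roughly n * n / 2 comparisons
    count = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                count += 1
    return count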
Time Complexity: Exponential Time: 2^n
if you add a single element to a collection the processing power required doubles
Time Complexity: worst to best
O(n!) => O(2^n) => O(n^2) => O(n log n) => O(n) => O(log n) => O(1)
What is Fibonacci series, what is the sequence?
The Fibonacci Sequence is the series of numbers: 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ... The next number is found by adding up the two numbers before it.
What is the fibonacci formula using recursion?
def fib(n):
    if n < 2:
        return n
    else:
        return fib(n - 1) + fib(n - 2)
What is the fibonacci formula using recursion/cache?
def fib(n, cache=None):
    # memoized version: remember results so each fib(k) is only computed once
    if cache is None:
        cache = {}
    if n < 2:
        return n
    if n in cache:
        return cache[n]
    cache[n] = fib(n - 1, cache) + fib(n - 2, cache)
    return cache[n]
What are Data Structures used for
Ways of organizing information with optimal runtime complexity for adding or removing records
What is a Queue and does the data structure look like?
a queue is like a line (or an array): FIFO, so first in, first out
rec = record
rec » [rec » rec » rec » rec] » rec
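A minimal sketch using collections.deque from Python's standard library, enqueueing at the back and dequeueing FIFO from the front:

from collections import deque

queue = deque()
queue.append("rec 1")     # enqueue at the back
queue.append("rec 2")
queue.append("rec 3")
first = queue.popleft()   # dequeue from the front: "rec 1" comes out first (FIFO)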
Pros of a Linked list
Easier to insert into and delete from the middle of a linked list compared to an array
We can keep adding elements to a linked list as much as we want, unlike arrays, which have a static amount of allocated memory
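A bare-bones singly linked list node in Python to show why inserting into the middle is just pointer rewiring (class and names are illustrative):

class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

head = Node(1, Node(3))            # build the list 1 -> 3

# insert 2 into the middle: no shifting of elements, just rewire one pointer
head.next = Node(2, head.next)     # list is now 1 -> 2 -> 3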
Cons of a linked List
not as cache-friendly since caches are typically optimized for contiguous memory accesses
Pros of Queues
Queues are simple and intuitive to use and implement
They can be used to serialize data coming in from multiple sources
Cons of Queues
They are not general-purpose at all in and of themselves: a queue supports only one access pattern (FIFO), so it cannot be used for much else.
What are trees
trees can be thought of as linked lists, but without the constraint that each node only points to one other node. A tree node can point to multiple other nodes in the tree. Linked lists themselves count as trees.
what is the root of a tree
The topmost node in the tree
Tree: what is a child
a node directly connected to another node when moving away from the root node
Tree: Parent
a node directly connected to another node when moving towards the root node
Tree: siblings
Nodes that share the same parent are considered siblings
Tree: leaf
A node that does not have any children of its own
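A small Python sketch of these terms (node class and names are illustrative):

class TreeNode:
    def __init__(self, value):
        self.value = value
        self.children = []             # a tree node can point to multiple other nodes

root = TreeNode("root")              # the topmost node in the tree
child_a = TreeNode("a")              # children of root, and siblings of each other
child_b = TreeNode("b")
root.children = [child_a, child_b]   # root is their parent
leaf = TreeNode("leaf")              # a node with no children of its own
child_a.children.append(leaf)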
How are binary search trees organized? How is the data split up?
For any given node, all values in the left subtree are less than the value at the given node. Conversely, all values in the right subtree are greater than or equal to the value at the given node
        14
       /  \
      9    22
     / \
    8   11
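A sketch of insertion and lookup that follows this rule (smaller values go into the left subtree, greater-or-equal values into the right); class and method names are illustrative:

class BSTNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

    def insert(self, value):
        # smaller values go left, greater-or-equal values go right
        if value < self.value:
            if self.left:
                self.left.insert(value)
            else:
                self.left = BSTNode(value)
        else:
            if self.right:
                self.right.insert(value)
            else:
                self.right = BSTNode(value)

    def contains(self, value):
        if value == self.value:
            return True
        if value < self.value:
            return self.left.contains(value) if self.left else False
        return self.right.contains(value) if self.right else False

root = BSTNode(14)
for v in (9, 22, 8, 11):
    root.insert(v)
root.contains(11)   # True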
Pros of a binary search tree
searching for an element in a binary search tree is significantly more efficient than searching through an array or linked list
Cons of a binary search tree
as a trade-off, it is not as efficient to insert into a binary search tree
the performance of a binary search tree depends quite a lot on whether the tree is balanced or not.
What is a heap?
Priority: you keep the highest-priority data at the top of the heap so it is easier to reach.
Think about it like a queue organized with the max value in front (a priority queue).
Conceptually it is a binary tree that satisfies the heap property (in a max-heap, every parent is greater than or equal to its children), and in practice it is usually stored compactly in a plain array or list.
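Python's standard library ships a binary heap as the heapq module, which maintains the heap property on top of a plain list. Note that heapq is a min-heap, so the smallest value (rather than the max described above) sits at the front:

import heapq

heap = []
for value in (5, 1, 9, 3):
    heapq.heappush(heap, value)   # keep the heap property as we insert

smallest = heapq.heappop(heap)    # 1 comes off first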
pros of heaps
Finding the highest-priority element (the max in a max-heap, the min in a min-heap) takes O(1) time, and inserting a new element or removing the top is only O(log n)
Cons of heaps
as a tradeoff, searching for an arbitrary element is not efficient (O(n)): the heap property only orders each parent relative to its children, so there is no fast way to locate a given value
only the top element is directly accessible; reaching anything else requires removing items until you get to it