Block 1: Introduction and Basic Concepts Flashcards
Define an algorithm
A specific procedure for solving a specific task
5 things that impact the running time of a program
> programming language > hardware > pattern of the input data (e.g. already sorted vs. random) > size of the input data > algorithm used
what does f(n) = O(g(n)) roughly mean?
f(n) ≤ c · g(n), for some constant c and large enough n
describe the model we use for reasoning about program speed?
> 1 CPU core (single thread)
> sequential execution
> primitive operations take unit time
define running time
the number of steps it takes to finish for data of size n.
Worst Case
the longest time the algorithm takes for size n
Best Case
the shortest time the algorithm takes for size n
Average Case
the expected time for size n
O(f(n)) (big-oh)
upper-bound: time(n) < c * f(n)
Ω(f(n)) (omega)
lower bound: time(n) > c * f(n)
Θ(f(n)) (theta)
both an upper and a lower bound: c1 · f(n) ≤ time(n) ≤ c2 · f(n)
Big-oh of selection sort
O(n^2)
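A minimal Python sketch (not from the cards) of why selection sort is always quadratic: the inner scan over the unsorted suffix runs regardless of the input order, giving roughly n²/2 comparisons.

```python
def selection_sort(a):
    """Repeatedly select the minimum of the unsorted suffix and swap it into place."""
    a = list(a)          # work on a copy
    n = len(a)
    for i in range(n):
        m = i            # index of the smallest element seen in a[i:]
        for j in range(i + 1, n):
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]
    return a
```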
Worst case of insertion sort
O(n^2)
Best case of insertion sort
O(n)
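A sketch of insertion sort (Python, illustrative) that also counts shifts, which makes the best/worst cases on the cards visible: a sorted input does 0 shifts (O(n)), a reversed input does n(n−1)/2 shifts (O(n²)).

```python
def insertion_sort(a):
    """Insert each element into the sorted prefix; return (sorted list, #shifts)."""
    a = list(a)
    shifts = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]  # shift larger element one slot right
            j -= 1
            shifts += 1
        a[j + 1] = key
    return a, shifts
```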
Big oh of merge subroutine
O(n)
Big oh of mergesort
O(nlogn)
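The two cards above can be sketched as follows (Python, illustrative): the merge subroutine is linear, and mergesort applies it across ~log n levels of halving, giving O(n log n) overall.

```python
def merge(left, right):
    """Merge two sorted lists in O(n) time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # at most one of these
    out.extend(right[j:])  # is non-empty
    return out

def mergesort(a):
    """Halve, sort each half recursively, merge: O(n log n)."""
    if len(a) <= 1:        # base case
        return list(a)
    mid = len(a) // 2
    return merge(mergesort(a[:mid]), mergesort(a[mid:]))
```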
1 + 2 + 3 + 4 + … + n
n(n+1)/2 ≈ n^2 / 2
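The closed form can be sanity-checked directly in Python:

```python
# Check sum_{k=1}^{n} k = n(n+1)/2 for a few values of n.
for n in (1, 10, 100, 1000):
    assert sum(range(1, n + 1)) == n * (n + 1) // 2
```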
what are 2 sources of log n?
> repeatedly halving something (e.g. binary search, the recursion depth of mergesort)
> repeatedly doubling something (e.g. the number of nodes per level of a balanced tree)
does the base of a logarithm matter?
no — inside big-oh, changing the base only changes a constant factor
formal definition of big-oh
f(n) = O(g(n)) as n → ∞ if and only if:
there exist constants c > 0, n0 ≥ 0 such that
for every n ≥ n0 we have f(n) ≤ c · g(n)
3n^2 + 6n log n + n
Θ(n^2)
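A numeric sketch (Python, constants chosen here, not from the cards) that exhibits witness constants for the O(n²) half of this card: since log2 n ≤ n, we have 6n log2 n ≤ 6n² and n ≤ n², so c = 10 and n0 = 1 work.

```python
import math

def f(n):
    """f(n) = 3n^2 + 6 n log2(n) + n from the card above."""
    return 3 * n**2 + 6 * n * math.log2(n) + n

# Witness constants for f(n) = O(n^2): c = 10, n0 = 1.
assert all(f(n) <= 10 * n**2 for n in range(1, 10001))
```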
8logn + 4n
Θ(n)
sort into increasing speed of growth: > n log n > n / (log n) > 2^n > log n > n > n^2 > n^1000 > n! > 1
> 1 > log n > n / (log n) > n > n log n > n^2 > n^1000 > 2^n > n!
sorting algorithms:
? is faster than ? is faster than ?
merge sort is faster than insertion sort, which is faster than selection sort
what is a recursive algorithm?
an algorithm that calls itself on smaller data to solve a problem.
define base case
where the recursion bottoms out and the problem is simple enough to be solved directly.
3 things that happen when a recursive call is made
- the current state is remembered (put on call stack)
- new working space is created for the next function
- after the called function returns, the calling function resumes
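A minimal recursive example (Python, illustrative) showing the base case and the call on strictly smaller data; each pending call's state lives on the call stack until the inner call returns.

```python
def rsum(a):
    """Recursively sum a list."""
    if not a:                    # base case: simple enough to answer directly
        return 0
    return a[0] + rsum(a[1:])    # recursive call on smaller data
```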
What does the running time of quicksort strongly depend on?
getting balanced splits
unlucky quicksort big oh
O(n^2)
what is the likely big oh of quicksort with a random pivot element?
O(n log n)
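A sketch of quicksort with a random pivot (Python, illustrative; this out-of-place variant uses extra memory but makes the splits explicit): a random pivot gives balanced splits in expectation, hence the expected O(n log n); consistently unlucky pivots degrade it to O(n²).

```python
import random

def quicksort(a):
    """Partition around a random pivot, recurse on the two sides."""
    if len(a) <= 1:              # base case
        return list(a)
    pivot = random.choice(a)     # random pivot: balanced splits in expectation
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```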
add new item to the start of a list
Θ(n)
delete item at the start of a list
Θ(n)
add new item at the end
Θ(1) (amortized, for a dynamic array)
delete item at the end
Θ(1)
add/delete at position i
Θ(n - i)
get/set at position i
Θ(1)
disadvantage of linked lists
harder to navigate:
> no random access
> not local in memory
advantage of linked lists
easier to modify and extend:
> insert/delete in constant time
> can even delete items from the middle
What are the 4 stages to delete a node in a linked list?
- node.prev.next = node.next
- node.next.prev = node.prev
- node.prev = null
- node.next = null
What are the stages to insert node2 after node1?
- node2.next = node1.next
- node2.prev = node1
- node1.next.prev = node2
- node1.next = node2
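The pointer updates on the two cards above, sketched with a minimal doubly linked `Node` class (Python, names illustrative). The cards implicitly assume the neighbours exist (e.g. sentinel nodes); this sketch adds `None` checks instead.

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

def insert_after(node1, node2):
    """Insert node2 right after node1 — constant time."""
    node2.next = node1.next
    node2.prev = node1
    if node1.next is not None:
        node1.next.prev = node2
    node1.next = node2

def delete(node):
    """Unlink node from its neighbours — constant time, given the node."""
    if node.prev is not None:
        node.prev.next = node.next
    if node.next is not None:
        node.next.prev = node.prev
    node.prev = None
    node.next = None
```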
What linked list operations take constant time?
insertFirst, insertLast, deleteFirst, deleteLast, deleteNode (if you already have the node pointer)
What linked list operations take linear time?
locate an item by value, access the ith item, delete an item by value