2.4.1 Algorithms Flashcards
1
Q
Time complexity
A
- how much time an algorithm requires to solve a particular problem
- is expressed using Big O notation
- shows the amount of time taken relative to the number of data elements given as input
2
Q
O(1) constant time complexity
A
- the amount of time taken to complete an algorithm is independent of the number of elements inputted
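As a minimal illustrative sketch (the function name is an assumption): accessing one element of a list by index takes the same time however long the list is.

    def get_first(data):
        # a single list access takes the same time whatever the length of data: O(1)
        return data[0]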
3
Q
Big O notation
A
- is used to express the time or space complexity of an algorithm in terms of the size of its input
4
Q
O(n) linear time complexity
A
- the amount of time taken is directly proportional to the number of elements inputted
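A minimal illustrative sketch (names are assumptions): a single loop that touches every element once grows linearly with n.

    def total(data):
        # one loop that visits each of the n elements exactly once: O(n)
        result = 0
        for value in data:
            result += value
        return result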
5
Q
O(n^2) polynomial (quadratic) time complexity
A
- the amount of time taken to complete the algorithm is directly proportional to the square of the number of data inputs
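A minimal illustrative sketch (names are assumptions): two nested loops over the same data perform roughly n * n steps.

    def count_matching_pairs(data):
        # two nested loops over the same n elements: n * n comparisons -> O(n^2)
        count = 0
        for a in data:
            for b in data:
                if a == b:
                    count += 1
        return count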
6
Q
O(2^n) exponential time complexity
A
- the amount of time taken to complete an algorithm doubles with every additional input item
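As an illustrative sketch (names are assumptions), building every subset of n items is an exponential-time task, since each extra item doubles the work.

    def all_subsets(items):
        # each new item doubles the number of subsets, so 2^n subsets are built in total
        subsets = [[]]
        for item in items:
            subsets = subsets + [subset + [item] for subset in subsets]
        return subsets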
7
Q
O(log n) logarithmic time complexity
A
- the time taken to complete an algorithm increases much more slowly as the number of data inputs increases
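A minimal illustrative sketch (names are assumptions): when the remaining work halves at each step, only about log2(n) steps are needed.

    def halving_steps(n):
        # the work halves each iteration, so the loop runs roughly log2(n) times
        steps = 0
        while n > 1:
            n = n // 2
            steps += 1
        return steps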
8
Q
Space complexity
A
- the amount of storage (memory) an algorithm requires while it runs
- algorithms use extra storage whenever they make a copy of the data
- more storage is more expensive
9
Q
Bubble sort
A
- makes comparisons and swaps between pairs of adjacent elements (see sketch below)
- starts at the first element in the array and compares it to the second
- elements are swapped if they are in the wrong order
- the process is then repeated for every adjacent pair of elements in the array until the end of the array is reached (this is a single pass)
- multiple passes are made until the data is sorted
- for an array of n elements, the algorithm will perform up to n passes through the data
- best case O(n) for an already sorted list (the algorithm can stop after a pass with no swaps)
- average and worst case O(n^2)
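A minimal Python sketch of the pass-and-swap process above; the function name and the early-exit swapped flag are illustrative assumptions (the flag is what gives the O(n) best case on sorted data).

    def bubble_sort(data):
        n = len(data)
        for pass_number in range(n - 1):          # up to n - 1 passes in the worst case
            swapped = False
            for i in range(n - 1 - pass_number):  # compare each adjacent pair
                if data[i] > data[i + 1]:         # wrong order -> swap
                    data[i], data[i + 1] = data[i + 1], data[i]
                    swapped = True
            if not swapped:                       # a pass with no swaps: already sorted
                break
        return data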
10
Q
Insertion sort
A
- works by splitting the list into two: a sorted sublist and an unsorted sublist (see sketch below)
- during each pass, items from the unsorted sublist are inserted into the correct position in the sorted sublist
- the for loop starts at index 1, because the first element forms the initial sorted sublist
- best case = O(n) when the list is already sorted
- worst & average case = O(n^2)
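A minimal Python sketch of the above (function and variable names are illustrative assumptions).

    def insertion_sort(data):
        # index 0 is treated as the initial sorted sublist, so the loop starts at 1
        for i in range(1, len(data)):
            current = data[i]
            j = i - 1
            # shift larger items in the sorted sublist one place to the right
            while j >= 0 and data[j] > current:
                data[j + 1] = data[j]
                j -= 1
            data[j + 1] = current  # insert the item into its correct position
        return data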
11
Q
Merge sort
A
- uses a divide and conquer approach: it splits a list recursively into sublists until each sublist is of size 1
- as the recursion unwinds, each pair of sublists is merged with their items in order (see sketch below)
- best, worst and average case = O(n log n)
- most efficient for large lists
- overkill for nearly sorted or small lists
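A minimal Python sketch of the split-then-merge idea, assuming a version that returns a new merged list rather than sorting in place; names are illustrative.

    def merge_sort(data):
        if len(data) <= 1:               # a sublist of size 1 is already sorted
            return data
        mid = len(data) // 2
        left = merge_sort(data[:mid])    # recursively split the list in half
        right = merge_sort(data[mid:])
        merged = []                      # merge the two sorted halves in order
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])          # append whatever remains of either half
        merged.extend(right[j:])
        return merged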
12
Q
Quick sort
A
- uses a divide and conquer approach and sorts a list by selecting a pivot value (first item by convention) [1]
- splits the remainder of the list into two sublists: those less than or equal to the pivot and those greater than the pivot
- by comparing each item to the pivot [1]
- recursively applies steps 1 and 2 to quick sort the new sublists, until every sublist is a single pivot [1]
- the pivots can then be combined to form a sorted list (see sketch below)
- best & average case = O(n log n)
- worst case = O(n^2)
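A minimal Python sketch under the conventions above (first item as pivot, new sublists built rather than sorting in place); the function and variable names are illustrative assumptions.

    def quick_sort(data):
        if len(data) <= 1:                # base case: nothing left to split
            return data
        pivot = data[0]                   # first item chosen as the pivot, by convention
        less_or_equal = [x for x in data[1:] if x <= pivot]
        greater = [x for x in data[1:] if x > pivot]
        # quick sort each sublist, then combine them around the pivot
        return quick_sort(less_or_equal) + [pivot] + quick_sort(greater)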
13
Q
Linear search
A
- can work on both ordered and unordered data sets [1]
- checks each element in the list sequentially until the item is found or the end of the list is reached (see sketch below) [1]
- starts by getting the first element; if it is equal to the item to be found, report found [1]
- if not equal, move to the next element [1]
- best case O(1)
- worst & average case = O(n)
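A minimal Python sketch (names are illustrative; returning -1 for "not found" is an assumption).

    def linear_search(data, target):
        # check each element in turn until found or the end of the list is reached
        for index, item in enumerate(data):
            if item == target:
                return index              # found: best case O(1) if it is the first element
        return -1                         # not found after checking all n elements: O(n)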
14
Q
Binary search
A
- only works on sorted data [1]
- works by repeatedly dividing the list in two, comparing the middle element to the target, and discarding the half that cannot contain it (see sketch below)
- O(log n)
- finds the index of the item (or reports that it is not present)
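A minimal Python sketch of the halve-and-discard idea, assuming an iterative version that returns the index or -1; names are illustrative.

    def binary_search(sorted_data, target):
        low, high = 0, len(sorted_data) - 1
        while low <= high:
            mid = (low + high) // 2
            if sorted_data[mid] == target:
                return mid                # found: return the index
            elif sorted_data[mid] < target:
                low = mid + 1             # discard the lower half
            else:
                high = mid - 1            # discard the upper half
        return -1                         # target is not in the list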