231-algorithms Flashcards

1
Q

Goals of algorithmic design

A

- time complexity - run as quickly as possible
- space complexity - use the least amount of memory
- situational: with plenty of processing power, time matters less and you would focus on minimising memory use (space); if a lot of data must be processed quickly, time matters most

2
Q

Algorithm

A

Sequence of steps followed in order to solve a problem or perform a particular task

3
Q

Big O Notation

A

Expresses the complexity (efficiency) of an algorithm in terms of how well it scales as the dataset grows

takes into consideration best, average and worst case scenarios

an approximation that allows you to predict the memory requirements/execution time it takes for an algorithm to finish, given the data input

4
Q

Big O Notation Rules

A

Remove all terms except the one with the largest factor or exponent
Remove any constants
e.g. O(n^2 + n + 1000) simplifies to O(n^2)
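The simplification can be sanity-checked numerically: for large n, the n^2 term of f(n) = n^2 + n + 1000 dwarfs the rest (a small Python sketch; the function f is just an illustration):

```python
# Big O simplification: the lower-order terms of
# f(n) = n^2 + n + 1000 become negligible as n grows,
# so f(n) / n^2 approaches 1.
def f(n):
    return n**2 + n + 1000

for n in [10, 100, 10000]:
    print(n, f(n) / n**2)
```

At n = 10 the constant dominates (ratio ≈ 11), but by n = 10000 the ratio is ≈ 1.0001, which is why only the n^2 term survives the simplification.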

5
Q

Time Complexity

A

Number of processing cycles/running time required, depending on the amount of data input
Time taken to complete the algorithm/solve the problem
How well an algorithm scales in processing time (number of cycles) as the problem dataset grows

6
Q

Space Complexity

A

Running space required in memory, depending on the scale of the data input
Amount of storage an algorithm takes

7
Q

Reducing Space Complexity

A

Perform all changes in place on the original piece of data (copies take up extra space)

8
Q

Reducing Time Complexity

A

Reduce the number of nested loops and the number of items you have to perform the operations on

9
Q

Best case time complexities

A

Linear search - O(1)
Binary search (array and tree) - O(1)
Hashing - O(1)
Breadth/depth first - O(1)
Bubble sort - O(n)
Insertion sort - O(n)
Merge sort - O(n log n)
Quick sort - O(n log n)

10
Q

Average case time complexity

A

Linear search - O(n)
Binary search - O(log n)
Hashing - O(1)
Breadth/depth first - O(V+E)
Bubble sort - O(n^2)
Insertion sort - O(n^2)
Merge sort - O(n log n)
Quick sort - O(n log n)

11
Q

Worst case time complexity

A

Linear search - O(n)
Binary search array - O(log n)
Binary search tree - O(n)
Hashing - O(n)
Breadth/depth first - O(V^2)
Bubble sort - O(n^2)
Insertion sort - O(n^2)
Merge sort - O(n log n)
Quick sort - O(n^2)

12
Q

Worst case space complexity

A

Linear search - O(1)
Binary search array - O(1)
Binary search tree - O(n)
Hashing - O(n)

Bubble sort - O(1)
Insertion sort - O(1)
Merge sort - O(n)
Quick sort - O(n) worst case, O(log n) average

13
Q

O(1) Constant Big O Notation

A

Algorithm will always take the same time/space
Independent of the input size
no loops or iterations

14
Q

O(n) Linear Big O Notation

A

Time/space increases in direct proportion to input size
reduced efficiency as the dataset grows
e.g. a single for loop/while loop over the input

15
Q

O(n^2) Polynomial Big O Notation

A

how well the algorithm scales: time taken/memory is directly proportional to the input size raised to a power (e.g. n^2)

efficiency is significantly reduced as the dataset grows.
deeper nested iterations result in a higher power

16
Q

O(log n) Logarithmic Big O Notation

A

amount of time/memory increases at a smaller rate as the size of the input increases
Discards irrelevant data input, e.g. divide and conquer/halving the dataset
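The halving behaviour can be illustrated by counting how many times a dataset can be halved before a single item remains (a minimal Python sketch):

```python
# Count how many halvings reduce n items to a single item —
# roughly log2(n), which is how many comparisons binary search
# needs in the worst case.
def halving_steps(n):
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

print(halving_steps(1024))  # 10, since 2**10 = 1024
```

Doubling the dataset to 2048 items adds only one more step, which is why O(log n) algorithms stay fast on large inputs.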

17
Q

O(n log n) Linearithmic Big O Notation

A

divides a dataset but can be solved using concurrency or divided lists
has both a linear and a logarithmic growth rate factor, e.g.
merge sort splits arrays logarithmically, then merges them in linear time

18
Q

O(2^n) Exponential Big O Notation

A

how the algorithm scales is exponential:
the time roughly doubles with each extra data input

typically recursive algorithms that branch on every call
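A classic example of O(2^n) behaviour is the naive recursive Fibonacci function, where each call spawns two more (a sketch for illustration):

```python
# Naive recursive Fibonacci: every call branches into two further
# calls, so the number of calls roughly doubles for each extra n —
# exponential O(2^n) growth.
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55
```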

19
Q

O(n!) Factorial Big O Notation

A

multiplies by an ever-increasing amount for every extra item in the dataset, e.g. considering every permutation

Algorithm grows extremely rapidly with each addition to input size: the factorial of the input.
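Permutation counting shows the growth directly: each extra item multiplies the number of orderings by the new dataset size (a small sketch using Python's itertools):

```python
import itertools

# An O(n!) workload: enumerating every permutation (ordering)
# of the input. 4 items give 4! = 24 orderings; a 5th item
# would multiply that to 120.
items = ['a', 'b', 'c', 'd']
perms = list(itertools.permutations(items))
print(len(perms))  # 24
```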

20
Q

searching algorithms

A

used to find a specific element within a data structure; performed on a set of data
each algorithm is suited to a particular data structure

21
Q

linear search

A
  • starts at the beginning of the dataset
  • traverses every item one at a time in sequence
  • until it finds the item it's searching for;
    if all items have been inspected without a match, returns false
  • works on unsorted or sorted data
  • easier to program/implement (linked list/array)
  • not efficient for large lists; ideal for smaller datasets
  • adding a new item to the list doesn't add any overhead
22
Q

linear search pseudocode

A

function linearSearch(list, item)
    index = -1
    i = 0
    found = False
    while i < length(list) and found == False
        if list[i] == item then
            index = i
            found = True
        end if
        i = i + 1
    end while
    return index
end function

23
Q

binary search

A
  • starts at the midpoint
  • uses divide and conquer to search for an item
  • halves the search space with each iteration using a logical comparison; repeats until the midpoint equals the item. Uses 3 pointers: low, mid, high.
  • array/list must be sorted
  • implemented with an array or binary tree
  • harder to program:
    a more complex algorithm, so linear search outperforms it on small datasets
  • very efficient for large datasets, but needs sorted data
  • additional time to sort the data, and again when adding a new item
24
Q

binary search iterative pseudocode

A

function binarySearch(array, item)
    low = 0
    high = array.length - 1
    while low <= high
        mid = (low + high) DIV 2
        if array[mid] > item then
            high = mid - 1
            // discard the upper half of the array
        else if array[mid] < item then
            low = mid + 1
            // discard the lower half of the array
        else
            return mid
            // return index of mid
        end if
    end while
    return -1
    // returns -1 if value isn't found
end function

25
Q

binary search recursive pseudocode

A

function binarySearch(array, item, low, high)
    if high < low then
        return -1
    end if
    mid = (low + high) DIV 2
    if array[mid] > item then
        return binarySearch(array, item, low, mid - 1)
        // call binarySearch on the lower half of the array
    else if array[mid] < item then
        return binarySearch(array, item, mid + 1, high)
        // call binarySearch on the upper half of the array
    else
        return mid
    end if
end function

26
Q

sorting algorithm

A

designed to take in a number of elements and arrange them in a logical order within a data structure

27
Q

bubble sort

A
  • loops through every item of the array
  • each item is compared with the next; if they are out of order, swap them
  • the process is repeated for each adjacent pair
  • a flag variable records whether any swaps happened in a pass;
    when a pass makes no swaps, the algorithm ends
  • on each pass the largest remaining number bubbles to the end, so it need not be checked again
  • easy to implement and simple, but inefficient for large datasets
28
Q

bubble sort pseudocode

A

function bubbleSort(array)
    // pass in an unsorted array as parameter
    n = array.length
    // get the length of the array
    swap = True
    // set an initial flag to be true
    while swap == True
        // loop as long as swap remains true
        swap = False
        // set swap to false to assume no swaps this pass
        for i = 0 to n - 2
            // loop over each adjacent pair
            if array[i] > array[i+1] then
                // compare the index to its neighbour
                swap = True
                // if a swap happens set swap to true
                temp = array[i]
                // these lines swap the two values around by
                array[i] = array[i+1]
                // storing one in a temporary variable
                array[i+1] = temp
            end if
        next i
        n = n - 1
        // reduce the number of items, as the one on the end is definitely the biggest
    end while
    return array
    // end by returning the sorted array
end function

Alternative version:
A = Array of data
for i = 0 to A.length - 2:
    noSwap = True
    for j = 0 to A.length - i - 2:
        if A[j] > A[j+1]:
            swap A[j] and A[j+1]
            noSwap = False
    if noSwap:
        break
return A

29
Q

insertion sort

A
  • compares elements in a linear way:
    if a sorted-side value is greater than the current element, shift it right by 1; continue until the correct insertion position is found
  • starts at the second index, with the first value treated as sorted; the array has an unsorted and a sorted side
  • iterate through the unsorted side. For each element, compare it with the sorted-side elements, shift any larger sorted elements to the right, and insert it in the correct position
  • until every item has been inserted

- similar to bubble sort but performs better if the dataset is already mostly sorted: fewer comparisons

30
Q

insertion sort pseudocode

A

function insertionSort(array)
    // array passed in as parameter
    n = array.length
    // get length of array
    for i = 1 to n - 1
        // loop from item 1 (not 0!) to length - 1
        currentValue = array[i]
        // gets value of current array index
        position = i
        // sets a variable to keep track of the position to loop back from
        while position > 0 and array[position-1] > currentValue
            // loop backwards through the sorted side
            array[position] = array[position-1]
            // moves values to the right if bigger
            position = position - 1
            // move left one
        end while
        array[position] = currentValue
        // adds the item in at the correct insertion point
    next i
    return array
    // return the sorted array
end function

Alternative version:
A = Array of data
for i = 1 to A.length - 1:
    elem = A[i]
    j = i - 1
    while j >= 0 and A[j] > elem:
        A[j+1] = A[j]
        j = j - 1
    A[j+1] = elem
return A

31
Q

merge sort

A
  • uses divide and conquer and recursion
  • the array is halved recursively until each element is individual
  • adjacent sublists are compared and merged
  • inspects the first item of both sublists and inserts the smaller into a new list, until both lists are empty
  • more efficient; can work on multiple lists at the same time; suits large datasets
  • harder to code; high space complexity, as the sublists need to be held in memory at once
  • ideal for parallel processing environments
32
Q

merge sort pseudocode section (splitting)

A

function mergeSort(mergeList)
    // array is passed in as parameter
    if mergeList.length < 2 then
        // checks whether the length is one
        return mergeList
        // if it is, return the array: this is the terminator for the recursion
    end if
    mid = mergeList.length DIV 2
    // gets the midpoint using an integer division
    leftHalf = mergeList.slice(0, mid)
    // gets the left half of the array
    rightHalf = mergeList.slice(mid, mergeList.length)
    // gets the right half
    return merge(mergeSort(leftHalf), mergeSort(rightHalf))
    // call the merge function on the mergeSort of the left and right halves; this triggers the recursion
end function

33
Q

merge sort pseudocode (merging)

A

function merge(leftHalf, rightHalf)
    // this function merges two sorted arrays
    result = []
    // create an empty array for the result
    i = 0; j = 0; k = 0
    // declare counter variables
    while i < leftHalf.length AND j < rightHalf.length
        // while there are still items in both subsets, iterate through
        if leftHalf[i] < rightHalf[j] then
            // one by one working out which is smaller and
            result[k] = leftHalf[i]
            // adding them to the results array
            i = i + 1
        else
            result[k] = rightHalf[j]
            j = j + 1
        end if
        k = k + 1
    end while
    while i < leftHalf.length
        // if there are items left in the left half
        result[k] = leftHalf[i]
        // keep adding them until there are none left
        i = i + 1
        k = k + 1
    end while
    while j < rightHalf.length
        // if there are items left in the right half
        result[k] = rightHalf[j]
        // keep adding them until there are none left
        j = j + 1
        k = k + 1
    end while
    return result
    // return the sorted array
end function

34
Q

quick sort

A
  • divide and conquer algorithm
  • chooses a pivot point; elements are partitioned into sublists according to whether they are less than or greater than the pivot
  • each side is recursively sorted with a new pivot, until each sublist is a single item
  • can use concurrency to work on each partition at the same time

- performance depends on the data being sorted and the pivot value chosen
- suits parallel processing environments and real-time situations
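The partitioning described above can be sketched in Python (a simple out-of-place version for clarity; a production quicksort usually partitions in place and chooses pivots more carefully):

```python
def quick_sort(items):
    """Quick sort: choose a pivot, partition the list into values
    less than, equal to, and greater than it, then recursively sort
    each side until every sublist is a single item."""
    if len(items) <= 1:
        return items                    # base case: already sorted
    pivot = items[len(items) // 2]      # pivot choice is arbitrary here
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)

print(quick_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```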

35
Q

pathfinding algorithms

A

find optimal/satisfactory paths between vertices/nodes on graphs
calculates the shortest path

36
Q

Dijkstra's shortest path algorithm

A
  • finds the shortest path between one node and all others on a weighted graph
    usage: e.g. shortest path for packet routing, GPS navigation, telephone networks
  • implemented using a priority queue, with the smallest distances stored at the front; explores breadth first
  • doesn't work for negative-value edges
37
Q

dijkstra shortest path algorithm steps

A
  • make a list of visited and unvisited vertices
  • from the start vertex, add the distances of neighbouring nodes to the priority queue
  • visit the unvisited vertex with the smallest known distance from the start
  • if the working distance is less than the known distance, update the table; mark the vertex as visited
  • ignore routes that are not the shortest
  • repeat until all vertices are visited
    - backtrack from the final vertex to read off the path
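The steps above can be sketched in Python using heapq as the priority queue (the graph shape and names are illustrative):

```python
import heapq

def dijkstra(graph, start):
    """Shortest distance from start to every reachable vertex.
    graph: {vertex: [(neighbour, weight), ...]}, non-negative weights."""
    dist = {start: 0}
    visited = set()
    queue = [(0, start)]             # priority queue of (distance, vertex)
    while queue:
        d, v = heapq.heappop(queue)  # unvisited vertex with smallest distance
        if v in visited:
            continue                 # ignore routes that are not the shortest
        visited.add(v)
        for neighbour, weight in graph[v]:
            new_dist = d + weight
            if new_dist < dist.get(neighbour, float('inf')):
                dist[neighbour] = new_dist   # shorter route found: update table
                heapq.heappush(queue, (new_dist, neighbour))
    return dist

g = {'A': [('B', 1), ('C', 4)], 'B': [('C', 2)], 'C': []}
print(dijkstra(g, 'A'))  # {'A': 0, 'B': 1, 'C': 3}
```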
38
Q

A* algorithm (pathfinding algorithm)

A

Algorithm that finds the shortest path between two nodes.
Uses a heuristic function (value) to optimise performance: an approximate cost from node x to the final node, based on extra information.
- once the target node is visited, the algorithm stops, so not every vertex is considered
- keeps a sorted priority queue of alternate path segments along the way

- when deciding which node to visit next, compares the heuristic cost + actual cost of each unvisited node against the working total

  • effectiveness depends on the accuracy of the heuristic used
  • quicker: reduces the time taken to find a path
  • uses: travel routing, video game NPCs, packet routing, financial modelling; pathfinding and graph traversal
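As an illustration, A* on a small grid with Manhattan distance as the heuristic (a sketch; the grid layout and cell encoding are assumptions, not part of the card):

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 2D grid of 0 (open) / 1 (wall) cells, using Manhattan
    distance as the heuristic. Returns the length of the shortest
    path in steps, or -1 if the goal is unreachable."""
    def heuristic(cell):
        # estimated remaining cost: Manhattan distance to the goal
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    best = {start: 0}                       # best known cost to each cell
    queue = [(heuristic(start), 0, start)]  # (cost + heuristic, cost, cell)
    while queue:
        _, cost, cell = heapq.heappop(queue)
        if cell == goal:
            return cost                     # stop once the target is reached
        if cost > best.get(cell, float('inf')):
            continue                        # stale queue entry, skip
        r, c = cell
        for nr, nc in [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]:
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_cost = cost + 1
                if new_cost < best.get((nr, nc), float('inf')):
                    best[(nr, nc)] = new_cost
                    heapq.heappush(
                        queue,
                        (new_cost + heuristic((nr, nc)), new_cost, (nr, nc)))
    return -1

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(a_star(grid, (0, 0), (2, 0)))  # 6 (detour around the wall)
```

The heuristic steers the search toward the goal, so fewer cells are expanded than plain Dijkstra would visit on the same grid.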