Week 2 Flashcards
Why do we use asymptotic notation to analyse the run time of algorithms in theoretical analysis?
Usually an algorithm that is asymptotically more efficient is the best choice for all but very small inputs
- we can also use asymptotic notation to characterise other features of algorithms such as space (memory) used
What are data structures?
data structures are specific methods for storing and accessing information
Describe the features and functionality of arrays
- arrays are ordered (indexed) collections of data; each item is accessed via its index
- must specify the size of the array upon creation (which we may not always know)
- if we run out of space we have to create a new larger array and copy the items over
- difficult to insert into the middle of the array: every later item must be shifted along by one to make room for the new item (see the sketch below)
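A minimal Python sketch (names are my own, not from the notes) showing both costs at once: growing a full array by copying, and shifting items to insert in the middle:

```python
# Minimal sketch (illustrative): inserting into a fixed-capacity array.
# If the array is full we must allocate a larger one and copy everything
# over; inserting in the middle shifts every later element one slot to
# the right, so the cost is O(n).

def insert_at(arr, size, index, value):
    """arr is a fixed-capacity list, size is how many slots are in use."""
    if size == len(arr):                      # out of space: grow and copy, O(n)
        bigger = [None] * (2 * len(arr))
        for i in range(size):
            bigger[i] = arr[i]
        arr = bigger
    for i in range(size, index, -1):          # shift later items along, O(n)
        arr[i] = arr[i - 1]
    arr[index] = value                        # place the new value
    return arr, size + 1

arr, size = [None] * 2, 0
arr, size = insert_at(arr, size, 0, 'a')
arr, size = insert_at(arr, size, 1, 'c')
arr, size = insert_at(arr, size, 1, 'b')      # middle insert forces a grow and a shift
print(arr[:size])                             # ['a', 'b', 'c']
```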
describe features of linked lists
- data can be inserted anywhere easily by inserting a new node and reassigning pointers
- can perform accessing (referencing), insertion, deletion and searching on the linked list
- can be implemented as singly or doubly linked lists
describe the difference between singly and doubly linked lists
- In singly linked lists each node has a next pointer and the last element just points to null
- in a doubly linked list each node has both a prev and a next pointer: the first node’s prev pointer points to null, and the last node’s next pointer points to null (see the node sketch below)
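A minimal sketch of the two node shapes (class and field names are illustrative; the doubly linked node is reused in the insertAfter sketch further below):

```python
# Sketch of the node shapes used by singly and doubly linked lists.

class SinglyNode:
    def __init__(self, element):
        self.element = element
        self.next = None          # the last node's next stays None (null)

class DoublyNode:
    def __init__(self, element):
        self.element = element
        self.prev = None          # the first node's prev stays None (null)
        self.next = None          # the last node's next stays None (null)

# Building a two-node doubly linked list a <-> b:
a, b = DoublyNode('a'), DoublyNode('b')
a.next, b.prev = b, a
```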
Why would we use linked lists over arrays?
Insertion and deletion are O(1) once the position is known (just relink pointers), and the list can grow and shrink dynamically, so there is no need to fix a size in advance or copy items into a larger array
name all the update methods that can be performed on a linked-list abstract data type
- replaceElement(p,e) - p - position, e - element, find item at position p and replace it with value e
- swapElements(p,q) - p,q - two positions, swap contents in elements at positions p and q
- insertFirst(e) - insert into the first position value e
- insertLast(e) - insert into last position value e
- insertBefore(p,e) - p - position, e - element, insert value e before position p
- insertAfter(p,e) -p - position, e - element, insert value e after position p
- remove(p) - remove element at position p
What are the steps for an element insertion using insertAfter(p,e)?
- create a new node v and set v.element = e
- link v to its predecessor: v.prev = p
- link v to its successor: v.next = p.next
- link p’s old successor back to v: (p.next).prev = v
- link p to its new successor v: p.next = v
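A direct Python translation of those steps (a sketch reusing the DoublyNode class from the earlier sketch; the None check stands in for the header/trailer sentinels a full implementation would typically use):

```python
# Sketch: insertAfter(p, e) on a doubly linked list, following the steps above.

def insert_after(p, e):
    v = DoublyNode(e)        # create a new node v with v.element = e
    v.prev = p               # link v to its predecessor p
    v.next = p.next          # link v to its successor (p's old successor)
    if p.next is not None:   # link p's old successor back to v
        p.next.prev = v      # (guard only needed because there is no trailer sentinel)
    p.next = v               # link p to its new successor v
    return v                 # a constant number of pointer updates: O(1)

x = insert_after(a, 'x')     # using a, b from the node sketch: list is now a <-> x <-> b
```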
What is the cost of the insertion and removal methods for a linked list?
- if the position of the node to update is known cost is O(1)
- if the position of the node is unknown cost = O(p) where p is the position in the list as we need to traverse to it to find it
What is the advantage of using an array over a linked list?
They are easier and quicker to search through because elements can be accessed directly by index, and a sorted array supports binary search
Describe the features of an ordered dictionary
- a set of elements, each with an associated key; the elements are maintained in order of their keys
- supports the standard dictionary operations (e.g. findElement, insertItem, removeElement)
What is a look up table / sorted table?
- an ordered dictionary implemented by storing the items in an array or vector sorted by key, so that it’s quicker to search through (e.g. with binary search)
describe the pseudocode for a binary search algorithm
- input is an array of elements sorted by key, together with the indexes low and high bounding the current search range
- the midpoint of the range is found; if the key at the midpoint matches the search key, return that element
- otherwise, if the search key is bigger than the midpoint key, return the recursive call of binary search on the elements between the midpoint and the highest element (the upper half)
- otherwise, return the recursive call of binary search on the elements between the lowest element and the midpoint (the lower half)
- keep recursively calling until a base case is hit (the key is found, or the range becomes empty and the key is absent) and the recursion unwinds (see the sketch below)
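A recursive Python sketch of that pseudocode (assuming the table stores (key, element) pairs sorted by key; the names are mine):

```python
# Recursive binary search over an array of (key, element) pairs sorted by key.
# Returns the element whose key matches, or None if the key is not present.

def binary_search(table, key, low, high):
    if low > high:                          # empty range: key not in the table
        return None
    mid = (low + high) // 2
    mid_key, mid_element = table[mid]
    if key == mid_key:                      # found at the midpoint
        return mid_element
    elif key > mid_key:                     # search the upper half
        return binary_search(table, key, mid + 1, high)
    else:                                   # search the lower half
        return binary_search(table, key, low, mid - 1)

table = [(2, 'b'), (5, 'e'), (8, 'h'), (13, 'm')]
print(binary_search(table, 8, 0, len(table) - 1))    # 'h'
print(binary_search(table, 3, 0, len(table) - 1))    # None
```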
What is the recursive relation for the recursive binary search algorithm and what is the time complexity of this algorithm?
T(n) = {b, if n < 2 ; T(n/2) + b, if n >= 2}
- where b is a constant that denotes the time for updating low and high and other overhead
- the input size is halved on every recursive call and a constant amount of work b is done per call, so binary search performs at most log2(n) + 1 recursive calls
- this gives log2(n) operations, so the time complexity = O(logn)
Why is T(n) = {b, if n < 2 ; T(n/2) + b, if n >= 2} the recursive relation for the recursive binary search algorithm?
- If n < 2, the range either contains one element (whose position we know) or it is empty and cannot contain the element we’re searching for, so only constant work b is done
- If n >= 2, we do constant work b and recurse on half of the elements; starting with n elements and halving each call leaves n/2^i elements after i calls, and n/2^i <= 1 when i = log2(n), so the time complexity is O(log2(n)) (see the unrolling below)
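Written out, the standard unrolling of that recurrence is:

$$
T(n) = T(n/2) + b = T(n/4) + 2b = \dots = T\!\left(\tfrac{n}{2^{i}}\right) + i\,b
$$

The base case n < 2 is reached after i ≈ log2(n) halvings, so T(n) = b + b·log2(n) = O(logn).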
Compare the time complexities of the linked list and a sorted array (look up table) for methods findElement, insertItem, removeElement, closestKeyBefore:
linked list:
findElement - O(n) (must traverse the list to find the key)
insertItem - O(1)
removeElement - O(1) once the position of the item is known (O(n) if it must first be found by key)
closestKeyBefore - O(n)
sorted array:
findElement - O(logn)
insertItem - O(n)
removeElement - O(n)
closestKeyBefore - O(logn)
Compare linked lists to sorted arrays in words:
- Linked lists have a shorter run time in inserting and deleting items
- Arrays have a shorter run time in searching for items
Why are binary search trees a more efficient data structure than linked lists and arrays?
- They can search for, insert and delete items in O(logn) time when the tree is balanced, and an in-order traversal visits the items in sorted order; this combines the fast search of a sorted array with the fast insertion/deletion of a linked list
Describe the features of a rooted tree ADT
- A set of nodes that store parent-child relationships
- T is the root node, meaning it has no parent; every node apart from T has exactly one parent
- A leaf is a node with no child nodes
What are sibling nodes defined as having?
Sibling nodes have the same parent node
What does it mean if a tree ADT is said to be ordered?
a tree is ordered if there is a linear ordering defined for the children of each internal node
What are binary search trees?
- These are rooted, ordered trees where each node has no more than 2 children
- each child is labelled either a left child or a right child, and keys are stored so that every key in a node’s left subtree is less than or equal to the node’s key and every key in its right subtree is greater than or equal to it (see the search sketch below)
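A minimal recursive search sketch (a Python illustration with assumed key/left/right fields, not the course’s implementation):

```python
# Minimal BST search sketch (node fields are illustrative).

class BSTNode:
    def __init__(self, key, element):
        self.key, self.element = key, element
        self.left = None       # left subtree holds keys <= this key
        self.right = None      # right subtree holds keys >= this key

def bst_find(node, key):
    if node is None:           # fell off the tree: key not present
        return None
    if key == node.key:
        return node.element
    elif key < node.key:       # smaller keys live in the left subtree
        return bst_find(node.left, key)
    else:                      # larger keys live in the right subtree
        return bst_find(node.right, key)
```

On a balanced tree each comparison discards roughly half of the remaining nodes, which is where the O(logn) search time mentioned in the previous card comes from.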
What does it mean for a binary tree to be ‘Proper’?
It means that each internal node in the tree has exactly 2 children (equivalently, every node has either 0 or 2 children).