Data Structures Flashcards

1
Q

What is a stack?

A

A stack is a set with a LIFO policy.

The element deleted from the set is the one most recently inserted.
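In Python, the LIFO policy can be sketched with a plain list (a minimal illustrative example):

```python
# Minimal stack sketch: a Python list, using the end as the top.
stack = []
stack.append(1)   # push
stack.append(2)   # push
stack.append(3)   # push
top = stack.pop() # pop: removes the most recently inserted element
# top is 3; stack is now [1, 2]
```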

2
Q

What is a queue?

A

A queue is a set with a FIFO policy.

The element deleted from the set is the one that has been in the set for the longest time.
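The FIFO policy can be sketched with Python's collections.deque, which removes from the front in O(1):

```python
from collections import deque

# Minimal queue sketch: enqueue at the right, dequeue from the left.
queue = deque()
queue.append("first")    # enqueue
queue.append("second")   # enqueue
front = queue.popleft()  # dequeue: the longest-resident element leaves first
# front is "first"; queue now holds ["second"]
```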

3
Q

What is a linked list?

A

A linked list is a data structure in which the objects are arranged in a linear order.
Unlike an array, in which the linear order is determined by the array indices, the order in a linked list is determined by a pointer in each object.
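A minimal singly linked list sketch, where each node's pointer determines the order:

```python
class Node:
    """A singly linked list node: the `next` pointer defines the order."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

# Build 1 -> 2 -> 3 and traverse it by following the pointers.
head = Node(1, Node(2, Node(3)))
values = []
node = head
while node is not None:
    values.append(node.value)
    node = node.next
# values == [1, 2, 3]
```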

4
Q

Complexity: Copy list

A

Average case: O(n)

Amortized worst case: O(n)

5
Q

Complexity: Append to list

A

Average case: O(1)

Amortized worst case: O(1)

6
Q

Complexity: Pop last from list

A

Average case: O(1)

Amortized worst case: O(1)

7
Q

Complexity: Pop intermediate from list

A

Average case: O(n)

Amortized worst case: O(n)

8
Q

Complexity: Insert into list

A

Average case: O(n)

Amortized worst case: O(n)

9
Q

Complexity: Get item from list

A

Average case: O(1)

Amortized worst case: O(1)

10
Q

Complexity: Set item in list

A

Average case: O(1)

Amortized worst case: O(1)

11
Q

Complexity: Delete item from list

A

Average case: O(n)

Amortized worst case: O(n)

12
Q

Complexity: List iteration

A

Average case: O(n)

Amortized worst case: O(n)

13
Q

Complexity: Get slice from list

A

Average case: O(k)
Amortized worst case: O(k)

k = Size of slice

14
Q

Complexity: Delete slice from list

A

Average case: O(n)

Amortized worst case: O(n)

15
Q

Complexity: Set slice in list

A

Average case: O(k+n)
Amortized worst case: O(k+n)

k = size of slice

16
Q

Complexity: Extend list

A

Average case: O(k)
Amortized worst case: O(k)

k = size of 2nd list

17
Q

Complexity: Sort list

A

Average case: O(n log n)

Amortized worst case: O(n log n)

18
Q

Complexity: Multiply list

A

Average case: O(n * k)

Amortized worst case: O(n * k)

19
Q

Complexity: x in list

A

Average case: O(n)

20
Q

Complexity: min() / max() of list

A

Average case: O(n)

21
Q

Complexity: Get length of list

A

Average case: O(1)

Amortized worst case: O(1)

22
Q

Complexity: Copy deque

A

Average case: O(n)

Amortized worst case: O(n)

23
Q

Complexity: Append to deque

A

Average case: O(1)

Amortized worst case: O(1)

24
Q

Complexity: Append left to deque

A

Average case: O(1)

Amortized worst case: O(1)

25
Q

Complexity: Pop from deque

A

Average case: O(1)

Amortized worst case: O(1)

26
Q

Complexity: Pop left from deque

A

Average case: O(1)

Amortized worst case: O(1)

27
Q

Complexity: Extend deque

A

Average case: O(k)
Amortized worst case: O(k)

k = number of items to extend

28
Q

Complexity: Extend left deque

A

Average case: O(k)
Amortized worst case: O(k)

k = number of items to extend

29
Q

Complexity: Rotate deque

A

Average case: O(k)
Amortized worst case: O(k)

k = Number of steps to rotate

30
Q

Complexity: Remove from deque

A

Average case: O(n)

Amortized worst case: O(n)

31
Q

Complexity: x in dict

A

Average case: O(1)

Amortized worst case: O(n)

32
Q

Complexity: Copy dict

A

Average case: O(n)

Amortized worst case: O(n)

33
Q

Complexity: Get item from dict

A

Average case: O(1)

Amortized worst case: O(n)

34
Q

Complexity: Set item in dict

A

Average case: O(1)

Amortized worst case: O(n)

35
Q

Complexity: Delete item from dict

A

Average case: O(1)

Amortized worst case: O(n)

36
Q

Complexity: Iterate dict

A

Average case: O(n)

Amortized worst case: O(n)

37
Q

How can k-ary trees be represented memory-efficiently?

A

By using a left-child right-sibling (LCRS) binary tree.

Each node contains at most two references: one to its leftmost direct child and one to its own right sibling.
To get the i-th child of a node, it is necessary to access the leftmost child and then iterate through its siblings.

This representation is very memory-efficient; however, accessing nodes may be slower.
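A sketch of the LCRS representation (the class and accessor names are illustrative):

```python
class LCRSNode:
    """Left-child right-sibling node: at most two references per node."""
    def __init__(self, value):
        self.value = value
        self.left_child = None     # leftmost direct child
        self.right_sibling = None  # next sibling to the right

def nth_child(node, i):
    """Return the i-th (0-based) child by walking the sibling chain."""
    child = node.left_child
    while child is not None and i > 0:
        child = child.right_sibling
        i -= 1
    return child

# A node with three children, encoded with only two references per node:
root = LCRSNode("root")
a, b, c = LCRSNode("a"), LCRSNode("b"), LCRSNode("c")
root.left_child = a
a.right_sibling = b
b.right_sibling = c
```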

38
Q

How does the division method for hash functions work?

A

A key k is mapped into one of m slots by taking the remainder of k divided by m:

h(k) = k mod m
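As a one-line sketch:

```python
def h_division(k, m):
    """Division-method hash: the remainder of k divided by m."""
    return k % m

# e.g. with m = 12 slots:
# h_division(100, 12) == 4, because 100 = 8 * 12 + 4
```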

39
Q

How does the multiplication method for hash functions work?

A

First, we multiply the key k by a constant A in the range 0 < A < 1 and extract the fractional part of kA.
Then, we multiply this value by m and take the floor of the result.

h(k) = floor (m (kA mod 1))
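A sketch using the constant A = (sqrt(5) - 1) / 2 that is often suggested for this method:

```python
import math

A = (math.sqrt(5) - 1) / 2  # ~0.618, a commonly suggested choice for A

def h_multiplication(k, m, A=A):
    """Multiplication-method hash: floor(m * (k*A mod 1))."""
    return math.floor(m * ((k * A) % 1))
```

An advantage of this method is that the value of m is not critical; it is typically chosen as a power of 2.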

40
Q

Why is universal hashing preferred over fixed hashing methods like division hashing?

A

If a malicious adversary chooses the keys to be hashed by some fixed hash function, then the adversary can choose n keys that all hash to the same slot, yielding an average retrieval time of O(n).

With universal hashing, the hash function is selected at random at the beginning of execution from a carefully designed class of functions, so no fixed set of keys can consistently force worst-case behaviour.

41
Q

What is a universal hashing collection?

A

Let H be a finite collection of hash functions that map a given universe U of keys into the range {0, 1 , …, m-1}.

Such a collection is said to be universal if for each pair of distinct keys k,l ∈ U, the number of hash functions h ∈ H for which h(k) = h(l) is at most |H| / m.

With a hash function randomly chosen from H, the chance of a collision between distinct keys k and l is no more than the chance 1/m of a collision if h(k) and h(l) were randomly and independently chosen from the set {0, 1, …, m-1}.
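One common construction of such a universal collection (assuming a prime p larger than every key) is h_{a,b}(k) = ((a*k + b) mod p) mod m, with a and b drawn at random; a sketch:

```python
import random

def make_universal_hash(m, p=2**31 - 1):
    """Draw h_{a,b}(k) = ((a*k + b) mod p) mod m at random.

    p must be a prime larger than any key (2**31 - 1 is prime).
    """
    a = random.randrange(1, p)  # a in {1, ..., p-1}
    b = random.randrange(0, p)  # b in {0, ..., p-1}
    return lambda k: ((a * k + b) % p) % m

h = make_universal_hash(100)  # every hash value lands in {0, ..., 99}
```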

42
Q

Explain the difference between open addressing and chaining in hashing

A

In open addressing, each key is inserted at its hash position; if that slot is already occupied, successive slots are probed until an empty one is found.
Lookup works the same way: slots are probed until the key is found, an empty slot is reached, or all slots have been probed. Since each slot holds at most one element, the load factor is <= 1.

In chaining, keys hashed to the same slot are chained via linked lists. Since each slot can hold many elements, the load factor can grow beyond 1.
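A minimal chaining sketch (the class and method names are illustrative), showing two keys sharing one slot:

```python
class ChainedHashTable:
    """Hashing with chaining: each slot holds a list of key/value pairs."""
    def __init__(self, m):
        self.m = m
        self.slots = [[] for _ in range(m)]

    def insert(self, key, value):
        chain = self.slots[key % self.m]
        for pair in chain:
            if pair[0] == key:   # key already present: update in place
                pair[1] = value
                return
        chain.append([key, value])

    def get(self, key):
        for k, v in self.slots[key % self.m]:
            if k == key:
                return v
        raise KeyError(key)

table = ChainedHashTable(4)
table.insert(1, "a")
table.insert(5, "b")  # 5 mod 4 == 1: collides with key 1, chained in slot 1
```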

43
Q

What are linear probing, quadratic probing & double hashing in open addressing?

A

Linear probing:

  • Probes slot by slot in sequence
  • Easiest to implement, but susceptible to clustering
  • h(k,i) = (h'(k) + i) mod m

Quadratic probing:

  • Uses two positive auxiliary constants, one applied to a quadratic term
  • Less susceptible to clustering than linear probing, but clustering is still possible
  • h(k,i) = (h'(k) + c1 * i + c2 * i²) mod m

Double hashing:

  • Probes using two auxiliary hash functions
  • The permutations produced have many of the characteristics of randomly chosen permutations
  • h(k,i) = (h1(k) + i * h2(k)) mod m
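The three probe sequences can be sketched directly from the formulas (h', h1, h2 stand for auxiliary hash functions; the constants c1, c2 are illustrative defaults):

```python
def linear_probe(h, k, i, m):
    return (h(k) + i) % m

def quadratic_probe(h, k, i, m, c1=1, c2=3):
    return (h(k) + c1 * i + c2 * i * i) % m

def double_hash_probe(h1, h2, k, i, m):
    return (h1(k) + i * h2(k)) % m

# Linear probe sequence for key 3 in a table of m = 8 slots, h'(k) = k mod m:
seq = [linear_probe(lambda k: k % 8, 3, i, 8) for i in range(3)]
# seq == [3, 4, 5]: slot by slot in sequence
```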
44
Q

What is perfect hashing?

A

We call a hashing technique perfect hashing if O(1) memory accesses are required to perform a search in the worst case.

Two levels of hashing are used, with universal hashing at each level.

The first level is essentially the same as for hashing with chaining: We hash the n keys into m slots using a hash function h carefully selected from a family of universal hash functions.

Instead of making a linked list of the keys hashing to slot j, however, we use a small secondary hash table S_j with an associated hash function h_j. By choosing the hash functions h_j carefully, we can guarantee that there are no collisions at the secondary level.

45
Q

What are the prerequisites to create a static perfect hash table?

A

All keys are static: once the keys are stored in the table, the set of keys never changes.

46
Q

What is the worst-case running time of basic operations on a binary tree with n elements?

A

O(n)

47
Q

What is the basic structure of a binary tree?

A

A binary tree is a tree structure in which each element (node) has either 0, 1 or 2 children, called the left and right child. Each node has a reference to its parent node, except for the root node. Nodes with no children are called leaf nodes.

48
Q

What is the binary-search-tree property?

A

Let x be a node in a binary search tree. If y is a node in the left subtree of x, then y.key <= x.key. If y is a node in the right subtree of x, then y.key >= x.key.

In other words: All values in the left subtree of node x are less than or equal to x.key. All values in the right subtree of node x are greater than or equal to x.key
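The property can be checked recursively by narrowing the allowed key range on the way down (a sketch; the Node class here is a minimal stand-in):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def is_bst(node, lo=float("-inf"), hi=float("inf")):
    """Check the binary-search-tree property by range-narrowing."""
    if node is None:
        return True
    if not (lo <= node.key <= hi):
        return False
    return (is_bst(node.left, lo, node.key) and
            is_bst(node.right, node.key, hi))

# Valid: 3 <= 5 <= 8.  Invalid: 7 sits in 5's left subtree.
valid = Node(5, Node(3), Node(8))
invalid = Node(5, Node(7), Node(8))
```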

49
Q

Implement the inorder_tree_walk procedure.

What is its complexity?

A

def inorder_tree_walk(node: Node):
    if node.left:
        inorder_tree_walk(node.left)

    print(node.key)

    if node.right:
        inorder_tree_walk(node.right)

Time complexity: O(n) (each node is visited exactly once)

50
Q

Time complexity of search procedure on a binary tree

A

O(h), where h is the height of the binary tree

51
Q

Time complexity of min/max procedure on a binary tree

A

O(h), where h is the height of the binary tree

53
Q

Time complexity of predecessor/successor procedure on a binary tree

A

O(h), where h is the height of the binary tree,

since we either follow a simple path up the tree or follow a simple path down the tree

54
Q

Time complexity of binary tree insert node

A

O(h), where h is the height of the binary tree

55
Q

Time complexity of binary tree delete node

A

O(h), where h is the height of the binary tree

56
Q

Describe the deletion of a node from a binary tree

A

Deletion distinguishes three cases:

  • The deleted node is a leaf: just remove it.
  • The deleted node has a single child: replace it with its child.
  • The deleted node has two children: the trickiest case; the tree must be restructured by replacing the deleted node with its successor.
    • Option 1: the successor is the deleted node's right child: replace the deleted node with the successor and attach the deleted node's left child as the successor's left child (the successor is the minimum of the right subtree, so it has no left child of its own).
    • Option 2: the successor is not the deleted node's direct right child: the hardest part. First, replace the successor with its own right child. Second, make the successor the parent of the deleted node's right child. Then put the successor in the deleted node's place as its parent's child and make it the parent of the deleted node's left child.
57
Q

What is a red-black tree?

A

A red-black tree is a binary search tree with one extra bit of storage per node: its color, which can be either red or black. By constraining the node colors on any simple path from the root to a leaf, red-black trees ensure that no such path is more than twice as long as any other, so that the tree is approximately balanced.

58
Q

What are the properties of a red-black tree?

A
  1. Every node is either red or black
  2. The root is black
  3. Every leaf (NIL) is black
  4. If a node is red, then both its children are black
  5. For each node, all simple paths from the node to descendant leaves contain the same number of black nodes
59
Q

What is the advantage of red-black trees compared to normal binary search trees?

A

On binary search trees, the basic operations such as search, min, max, successor, predecessor, insert & delete run in O(h). This is fine on balanced trees; however, if the tree is unbalanced the runtime degrades (O(n) on a completely unbalanced tree).
Red-black trees keep the tree approximately balanced, so that these operations are guaranteed to run in O(lg n) time.

60
Q

What is the runtime of all basic operations on a red-black tree?

A

O(lg n)

61
Q

What is the runtime of a left or right rotation in a binary search tree?

A

O(1)

It runs in constant time as only a constant number of pointers are updated

62
Q

What is the difference between a binary tree and a binary search tree?

A

A binary search tree holds an additional condition: All left descendants <= n < all right descendants.

63
Q

What is a balanced tree?

A

In a balanced tree all branches have roughly the same height.

64
Q

What is a complete binary tree?

A

A complete binary tree is a binary tree in which every level of the tree is fully filled, except for perhaps the last level. To the extent that the last level is filled, it is filled left to right.

65
Q

What is a full binary tree?

A

A full binary tree is a binary tree in which every node has either 0 or 2 children.

66
Q

What is a perfect binary tree?

A

A perfect binary tree is a binary tree that is both full and complete. All leaf nodes are at the same level, and this level has the maximum number of nodes.