Data Structures 4 Flashcards

1
Q

Algorithm Analysis

A

Determining how long it takes a computer to do something, expressed as a function of the input size n.

2
Q

Algorithm

A

Has input, produces output, is definite (each step is unambiguous), is finite (it terminates), and operates on the data it is given.

3
Q

Big-Oh

A

T(n) = O(f(n)) if there are positive constants c and n0 such that T(n) <= c * f(n) for all n >= n0

4
Q

Big Omega

A

T(n) = Ω(f(n)) if ∃ positive constants c and n0 such that T(n) >= c * f(n) for all n >= n0

5
Q

Little-Oh

A

T(n) = o(f(n)) if T(n) = O(f(n)) and T(n) != Ω(f(n))

6
Q

Queues

A

First in, first out (FIFO). Enqueue and dequeue are O(1).

7
Q

L N R

A

In-order traversal

8
Q

N L R

A

Preorder traversal (Polish)

9
Q

L R N

A

Postorder traversal (Reverse Polish)
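
The three traversal cards (L N R, N L R, L R N) can be sketched in Python. The Node class and function names here are illustrative, not from the cards:

```python
# Minimal sketch of the three binary-tree traversal orders
# (L = left subtree, N = node, R = right subtree).
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def inorder(n):    # L N R
    return inorder(n.left) + [n.val] + inorder(n.right) if n else []

def preorder(n):   # N L R (Polish)
    return [n.val] + preorder(n.left) + preorder(n.right) if n else []

def postorder(n):  # L R N (Reverse Polish)
    return postorder(n.left) + postorder(n.right) + [n.val] if n else []

# The tree:   2
#            / \
#           1   3
root = Node(2, Node(1), Node(3))
print(inorder(root))    # [1, 2, 3]
print(preorder(root))   # [2, 1, 3]
print(postorder(root))  # [1, 3, 2]
```

On a BST, in-order visits the keys in sorted order, which is what tree sort relies on.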

10
Q

Insertion Sort

A

Stable, O(n^2), Ω(n): Starting at the beginning, inserts each element into its sorted position among the elements before it, swapping one position at a time.
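
A minimal Python sketch of this card's idea (function name is illustrative):

```python
def insertion_sort(a):
    """Stable; O(n^2) worst case, Ω(n) on already-sorted input."""
    for i in range(1, len(a)):
        j = i
        # Swap the new element leftward until it sits in sorted position.
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return a

print(insertion_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```

Equal elements are never swapped past each other, which is why the sort is stable.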

11
Q

Selection Sort

A

Unstable, O(n^2), Ω(n^2): Repeatedly selects the smallest remaining element and swaps it into place, so it must iterate through every remaining element on each pass.
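
A minimal Python sketch of this card's idea (function name is illustrative):

```python
def selection_sort(a):
    """O(n^2) and Ω(n^2): every pass must scan all remaining elements."""
    for i in range(len(a) - 1):
        m = i
        for j in range(i + 1, len(a)):  # find the smallest remaining element
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]         # long-range swap => not stable
    return a

print(selection_sort([3, 1, 2]))  # [1, 2, 3]
```

The long-range swap can move an element past an equal element, which is what makes this sort unstable.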

12
Q

Bubble Sort

A

Stable, O(n^2), Ω(n): Compares neighboring elements and swaps them if out of order. Stops after a pass with no swaps, when there's nothing left to sort.

13
Q

Heap Sort

A

Unstable, O(n log n), Ω(n log n): Make a heap, take everything out.
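
A minimal Python sketch of this card's idea, using the standard-library heapq module (names are illustrative):

```python
import heapq

def heap_sort(a):
    """Unstable, O(n log n): make a heap, take everything out."""
    heapq.heapify(a)                 # O(n) bottom-up min-heap construction
    # Each pop is O(log n); n pops give the elements in sorted order.
    return [heapq.heappop(a) for _ in range(len(a))]

print(heap_sort([3, 1, 4, 1, 5]))  # [1, 1, 3, 4, 5]
```

Note that this mutates the input list; heapq builds a min-heap, so popping yields ascending order directly.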

14
Q

Tree Sort

A

Stable, O(n log n) with a balanced tree, Ω(n log n): Put everything in the tree, traverse in-order.

15
Q

Merge Sort

A

Stable, O(n log n), Ω(n log n): Use recursion to split the array in half repeatedly, then merge the sorted halves. An array of size 1 is already sorted.
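
A minimal Python sketch of this card's idea (names are illustrative):

```python
def merge_sort(a):
    """Stable, O(n log n): split in half, sort each half, merge."""
    if len(a) <= 1:          # base case: a size-1 array is already sorted
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        # '<=' takes from the left half on ties, preserving stability.
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]   # append whichever half remains

print(merge_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```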

16
Q

Quick Sort

A

Unstable, O(n log n) for a good pivot, O(n^2) for a bad pivot, Ω(n log n): Uses O(n) partitioning. Pick the median of the first, middle, and last elements as the pivot. Random selection is also good, but expensive. The algorithm can be slow because of many function calls.
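
A Python sketch of this card's idea, with median-of-three pivot selection. For clarity it partitions into extra lists rather than in place, and the names are illustrative:

```python
def quick_sort(a):
    """Average O(n log n), worst case O(n^2) with a bad pivot."""
    if len(a) <= 1:
        return a
    # Median-of-three: pick the median of first, middle, and last elements.
    pivot = sorted([a[0], a[len(a) // 2], a[-1]])[1]
    # O(n) three-way partition around the pivot.
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)

print(quick_sort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```

An in-place partition avoids the extra lists; this version just makes the recursive structure easy to see.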

17
Q

Insertion & Quick Sort

A

Using both algorithms together is more efficient, since quick sort's O(n log n) advantage only applies to large (sub)arrays; switch to insertion sort for small ones.

18
Q

Indirect Sorting

A

Involves the use of smart pointers: objects which contain pointers. The pointers are sorted instead of the large objects they point to.

19
Q

Bucket Sort

A

O(n + m), where m is the # of buckets.
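
A minimal Python sketch, assuming the simplest case of one bucket per integer value (i.e., a counting sort); names are illustrative:

```python
def bucket_sort(a, m):
    """O(n + m) for n ints in range [0, m): count into buckets, read back."""
    buckets = [0] * m
    for x in a:                # O(n): drop each element into its bucket
        buckets[x] += 1
    # O(m + n): walk the buckets in order, emitting their contents.
    return [x for x in range(m) for _ in range(buckets[x])]

print(bucket_sort([3, 1, 4, 1, 5], 10))  # [1, 1, 3, 4, 5]
```

This beats the comparison-sort lower bound because it never compares elements; it exploits the keys being small integers.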

20
Q

Graphs

A

”"”Uber”” data structure. Shows connections between objects. Can be displayed as either a matrix or linked list representation.”

21
Q

DAG

A

Directed acyclic graph

22
Q

Strongly connected

A

A directed graph in which there exists a path between every pair of vertices.

23
Q

Weakly connected

A

A directed graph that is connected when its edges are treated as undirected: there is an undirected path between every pair of vertices.

24
Q

Depth First Search

A

Runs in time proportional to the size of the graph, O(V + E); can determine whether a graph has a cycle.

25
Q

Topological Sorting

A

Receives a DAG as input, outputs an ordering of the vertices. Repeatedly selects a node with no incoming edges, outputs it, and removes its outgoing edges.
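
The card describes Kahn's algorithm; a minimal Python sketch (names are illustrative):

```python
from collections import deque

def topo_sort(n, edges):
    """Kahn's algorithm: repeatedly take a vertex with no incoming edges."""
    adj = [[] for _ in range(n)]
    indeg = [0] * n
    for u, v in edges:
        adj[u].append(v)
        indeg[v] += 1
    q = deque(v for v in range(n) if indeg[v] == 0)  # no incoming edges
    order = []
    while q:
        u = q.popleft()
        order.append(u)
        for v in adj[u]:          # "remove" u's outgoing edges
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    # If some vertex was never emitted, the graph had a cycle (not a DAG).
    return order if len(order) == n else None

print(topo_sort(4, [(0, 1), (0, 2), (1, 3), (2, 3)]))  # [0, 1, 2, 3]
```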

26
Q

Single Source Shortest Path

A

Input: graph and starting vertex.
Output: shortest path to all points.
Unweighted: BFS.
Weighted: Dijkstra's Method.

27
Q

Dijkstra’s Method

A

Calculates the shortest path to all vertices in a single-source shortest path problem using a priority queue (a heap). Checks the "frontier" based on cost. The distance to any node is known once it has been "visited".
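
A minimal Python sketch of the card's description, using heapq as the priority queue (names are illustrative):

```python
import heapq

def dijkstra(adj, src):
    """adj[u] = list of (v, cost) pairs; returns shortest distances from src."""
    dist = {src: 0}
    frontier = [(0, src)]                 # priority queue ordered by cost
    visited = set()
    while frontier:
        d, u = heapq.heappop(frontier)    # cheapest node on the frontier
        if u in visited:
            continue
        visited.add(u)                    # distance to u is now final
        for v, w in adj.get(u, []):
            if v not in visited and d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(frontier, (d + w, v))
    return dist

adj = {0: [(1, 4), (2, 1)], 2: [(1, 2)]}
print(dijkstra(adj, 0))   # {0: 0, 1: 3, 2: 1}
```

Note the greedy step: once a node is popped and marked visited, its distance never changes, which is exactly why negative edges break the algorithm (next cards).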

28
Q

Relaxation

A

Getting from A->C more cheaply by using B as an intermediary.

29
Q

Negative Edge Costs

A

Dijkstra’s cannot solve. Requires Bellman-Ford.

30
Q

Bellman-Ford

A

Allowed to reconsider the cost of reaching vertices. Can detect negative-cost cycles. Handles negative edge weights by performing relaxation on all edges V-1 times, where V is the number of vertices.
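
A minimal Python sketch of the card's description (names are illustrative):

```python
def bellman_ford(n, edges, src):
    """Relax every edge n-1 times; edges are (u, v, weight) triples."""
    INF = float("inf")
    dist = [INF] * n
    dist[src] = 0
    for _ in range(n - 1):                # V-1 rounds of relaxation
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w     # reconsider the cost of reaching v
    # One extra pass: any further improvement means a negative-cost cycle.
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            return None
    return dist

print(bellman_ford(3, [(0, 1, 4), (1, 2, -2), (0, 2, 5)], 0))  # [0, 4, 2]
```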

31
Q

All Pairs Shortest Path

A

Can be solved using Floyd-Warshall.

32
Q

Floyd-Warshall

A

Implicitly determines shortest paths between all pairs of vertices by considering each vertex in turn as an intermediate point.
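
A minimal Python sketch of the classic triple loop (names are illustrative):

```python
def floyd_warshall(dist):
    """dist is an n x n distance matrix (inf = no edge); updated in place."""
    n = len(dist)
    for k in range(n):            # allow vertex k as an intermediate point
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

INF = float("inf")
d = floyd_warshall([[0,   3,   INF],
                    [INF, 0,   1  ],
                    [7,   INF, 0  ]])
print(d[0][2])   # 4  (path 0 -> 1 -> 2; there is no direct 0 -> 2 edge)
```

O(V^3) overall, and unlike running Dijkstra from every source, it tolerates negative edge weights (though not negative cycles).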

33
Q

Minimum Spanning Tree

A

Acyclic, contains all vertices, and has minimum total edge cost. Can be approached with either Prim's or Kruskal's method.

34
Q

Prim’s

A

Same overall algorithm as Dijkstra's, except that it only considers the cost of the single connecting edge (not the whole path). Continually builds onto one tree with the cheapest-cost edges.

35
Q

Kruskal’s

A

Takes edges in sorted order by cost; creates many small trees which join into one large tree.
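
A minimal Python sketch of this card's idea, using a bare-bones union-find to detect when two endpoints are already in the same tree (names are illustrative):

```python
def kruskal(n, edges):
    """Take edges cheapest-first; join trees unless endpoints already connected."""
    parent = list(range(n))
    def find(x):                       # walk up to the set representative
        while parent[x] != x:
            x = parent[x]
        return x
    mst = []
    for w, u, v in sorted(edges):      # edges as (cost, u, v), sorted by cost
        ru, rv = find(u), find(v)
        if ru != rv:                   # different trees: safe to join them
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

print(kruskal(3, [(5, 0, 1), (1, 0, 2), (2, 1, 2)]))  # [(1, 0, 2), (2, 1, 2)]
```

The find/union here is deliberately unoptimized; the next cards cover the two standard improvements.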

36
Q

Disjoint Sets

A

Never allowed to break apart sets. Also known as the Union/Find algorithm. Each node has a parent pointer which points to a representative of its set.

37
Q

2 Ways to Improve Disjoint Sets

A

Union By Rank - make the smaller tree point to the larger tree.
Path Compression - update parent pointers directly to the root during Find.
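
A minimal Python sketch with both improvements (class and method names are illustrative):

```python
class DisjointSets:
    """Union/Find with union by rank and path compression."""
    def __init__(self, n):
        self.parent = list(range(n))   # each node starts as its own set
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            # Path compression: point x directly at the root.
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        # Union by rank: the smaller tree points to the larger tree.
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1

ds = DisjointSets(4)
ds.union(0, 1)
ds.union(2, 3)
print(ds.find(0) == ds.find(1))  # True
print(ds.find(1) == ds.find(2))  # False
```

Together the two improvements make a sequence of operations run in nearly constant amortized time per operation.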

38
Q

Abstract Data Types

A

Consists of 2 parts:
1. Data it contains
2. Operations that can be performed on it

39
Q

Parameter Passing

A

Small, no modification - value
Large, no modification - CONST reference
Modified - pointer

40
Q

Iterators

A

An object that knows how to "walk" over a collection of things. Encapsulates everything it needs to know about what it's iterating over. Should all have similar interfaces. Can read data, move, and know when to stop.

41
Q

AVL Trees

A

Adelson-Velskii & Landis: For every node, the heights of its two subtrees differ by at most 1. On insertion, at most one rotation (single or double) is needed to restore balance. On removal, multiple rotations may be necessary.

42
Q

Single Rotation

A

Rotation preserves BST order. The inner child becomes a child of the node which was replaced.

43
Q

Double Rotation

A

Two single rotations at different locations, either right-left or left-right. The first rotation is deeper than the second.

44
Q

Splay Tree

A

Any valid BST. Amortized O(log n) access: any sequence of M operations takes O(M log n) for large M. Any node that is inserted, removed, or accessed gets splayed to the root.

45
Q

Spatial locality

A

Data close by in memory tends to be accessed together.

46
Q

Temporal locality

A

The same data is accessed repeatedly over a short period of time.

47
Q

Red Black Trees

A
1. Every node is Red or Black
2. The root is Black
3. If a node is red, its children must be black
4. Every path from a node to a NULL pointer must contain the same number of black nodes

48
Q

B-Trees

A

Popular in disk storage. Keys are in nodes. Data is in the leaves.

49
Q

Heap

A

A type of priority queue. Stores data which is order-able. O(1) access to highest priority item.

50
Q

Trie

A

Has only part of a key for comparison at each node.

51
Q

Hash Table

A

Constant access time (on average).

52
Q

Hash function

A

takes an object and tells you where to put it.

53
Q

Collision

A

Entering into a space already in use.

54
Q

Load Factor

A

Number of items (n) / table size.

55
Q

Separate Chaining

A

Uses a linked list to handle collisions at a specific point.

56
Q

Open addressing

A

Uses probes to find an open location to store data.

57
Q

Linear Probing

A

Checks each spot in order to find an available location; causes primary clustering.

58
Q

Quadratic Probing

A

On the i-th probe, checks the spot i^2 away; causes secondary clustering. Not guaranteed to find an open table spot unless the table is at least half empty.

59
Q

Double Hashing

A

The process of using two hash functions to determine where to store the data.

60
Q

Rehashing

A

Expanding the table: double the table size, then move to the closest prime number. Rehash each element for the new table size.

61
Q

Lazy Deletion

A

Marking a spot as deleted in a hash table rather than actually deleting it.

62
Q

Prime number Tables

A

Reduce the chance of collision.

63
Q

Bloom Filters

A

Probabilistic hash table. "No" means no; "yes" means maybe. Uses multiple (different) hash functions. Can't resize the table, and can't remove elements.

64
Q

Important Sorting Assumptions

A

1. Sorting an array of integers
2. Length of the array is n
3. Sorting least to greatest
4. Can access an array element in constant time
5. Compare ints in the array only with '<'
6. Focus on the # of comparisons

65
Q

Average Lower bound for adjacent swaps

A

n(n-1)/4, which is Ω(n^2)

66
Q

Inversions

A

Min: 0
Max: n(n-1)/2
Swapping adjacent elements removes 1 inversion

67
Q

4 Rules of Recursion

A

Base Cases: You must always have some base cases, which can be solved without recursion.
Making Progress: For the cases that are to be solved recursively, the recursive call must make progress toward a base case.
Design Rule: Assume that all recursive calls work.
Compound Interest Rule: Never duplicate work by solving the same instance of a problem in separate recursive calls.