Data Structures & Algorithms Flashcards
Describe 3 differences between a linked list and an array
Linked list
- Non-contiguous storage: nodes don’t have to sit next to each other in memory.
- Can be more efficient for memory allocation. Easier to resize.
- Can be less efficient for caching (not contiguous)
- No “find by index” feature, so accessing individual elements can be slower
- Easier to add elements (insert nodes) to various positions within list.
Array
- Must occupy a single contiguous block of memory.
- Harder to resize. Also, increasing length may require moving the array in memory to a larger space.
- Better caching - all addresses are contiguous
- Fast access - search by index
- Harder to insert new elements at the front or middle, since every element after the insertion point has to be shifted over (see the sketch below).
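A minimal sketch of that insertion difference (the node shape and helper names are illustrative assumptions, not from the cards):

// Hypothetical minimal singly linked list node
let makeNode = ( value, next = null ) => ( { value, next } )

// Inserting after a known node is O(1): just rewire one pointer
let insertAfter = ( node, value ) => {
  node.next = makeNode( value, node.next )
}

// Inserting into an array at an index is O(n): splice shifts every later element over
let arr = [ 1, 2, 4 ]
arr.splice( 2, 0, 3 ) // arr is now [ 1, 2, 3, 4 ]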
What is the difference between a Binary Tree and a Binary Search Tree?
A Binary Tree only requires that each node has at most two children; the values can appear in any order. A Binary Search Tree adds an ordering rule: all left descendants are less than or equal to the node, and all right descendants are greater than the node.
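A minimal insert sketch showing how that ordering rule is maintained (the { value, left, right } node shape and the function name are illustrative assumptions):

let bstInsert = ( node, value ) => {
  // an empty spot becomes a new leaf
  if ( node === null ) return { value, left: null, right: null }
  // smaller-or-equal values go left, larger values go right
  if ( value <= node.value ) node.left = bstInsert( node.left, value )
  else node.right = bstInsert( node.right, value )
  return node
}

// Build a small BST from a list of values
let root = null
for ( let v of [ 8, 3, 10, 1, 6 ] ) { root = bstInsert( root, v ) }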
Describe the 3 types of Binary Search Tree traversal
In-order (from the smallest number to the biggest): Visit left branch, then current node, then right branch. Visits all nodes in ascending order on a BST.
Pre-order: Visits current node first, then left branch, then right branch. Root node is always the first node visited.
Post-order: Visits left branch, then right branch, then current node. Root node is always last visited.
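A minimal sketch of the three traversals, assuming an illustrative { value, left, right } node shape; on a BST, in-order visits the values in ascending order:

let inOrder = ( node, visit ) => {
  if ( node === null ) return
  inOrder( node.left, visit )   // left branch first
  visit( node.value )           // then the current node
  inOrder( node.right, visit )  // then the right branch
}

let preOrder = ( node, visit ) => {
  if ( node === null ) return
  visit( node.value )           // current node first
  preOrder( node.left, visit )
  preOrder( node.right, visit )
}

let postOrder = ( node, visit ) => {
  if ( node === null ) return
  postOrder( node.left, visit )
  postOrder( node.right, visit )
  visit( node.value )           // current node last
}

// A tiny example tree
let tree = {
  value: 2,
  left:  { value: 1, left: null, right: null },
  right: { value: 3, left: null, right: null }
}
inOrder( tree, ( v ) => console.log( v ) )   // 1 2 3
preOrder( tree, ( v ) => console.log( v ) )  // 2 1 3
postOrder( tree, ( v ) => console.log( v ) ) // 1 3 2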
What are 4 Algorithms for searching through arrays and trees
Linear search - look through an array one-by-one until you find what you are looking for.
Binary search - requires sorted data; narrow it down by always guessing in the middle of the range. So you would start at 50 (half-way between 0 and 100), then if you were too high go to 25 (half-way between 0 and 50), then if you were too low go to 37 (half-way between 25 and 50), and so on (see the sketch after this list).
Depth-first search - traverse as far as you can down the tree before back-tracking.
Breadth-first search - works across the levels of a tree (the top level is handled first, then the second level, and so on)
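A minimal binary search sketch over a sorted array (the function name is illustrative):

let binarySearch = ( sortedArr, target ) => {
  let low = 0
  let high = sortedArr.length - 1
  while ( low <= high ) {
    let mid = Math.floor( ( low + high ) / 2 )
    if ( sortedArr[ mid ] === target ) return mid  // found it
    if ( sortedArr[ mid ] < target ) low = mid + 1 // guess too low: search the right half
    else high = mid - 1                            // guess too high: search the left half
  }
  return -1 // not found
}

binarySearch( [ 1, 3, 6, 8, 11 ], 8 ) // 3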
How do merge sort and insertion sort work for arrays?
Merge sort:
- Divide the unsorted list into n sublists, each containing 1 element (a list of 1 element is considered sorted).
- Repeatedly merge sublists to produce new sorted sublists until there is only 1 sublist remaining. This will be the sorted list.
Insertion sort:
- Walk the array from left to right and insert each element encountered into its correct position among the elements already examined.
- The ordered sequence into which each element is inserted is stored at the beginning of the array, in the set of indices already examined (see the sketches below).
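Minimal sketches of both sorts (the function names are illustrative):

// Merge step: combine two already-sorted arrays into one sorted array
let merge = ( left, right ) => {
  let result = []
  let i = 0, j = 0
  while ( i < left.length && j < right.length ) {
    if ( left[ i ] <= right[ j ] ) result.push( left[ i++ ] )
    else result.push( right[ j++ ] )
  }
  return result.concat( left.slice( i ) ).concat( right.slice( j ) )
}

// Merge sort: split in half, sort each half, merge the sorted halves
let mergeSort = ( arr ) => {
  if ( arr.length <= 1 ) return arr
  let mid = Math.floor( arr.length / 2 )
  return merge( mergeSort( arr.slice( 0, mid ) ), mergeSort( arr.slice( mid ) ) )
}

// Insertion sort: shift larger elements right until the current one fits
let insertionSort = ( arr ) => {
  for ( let i = 1; i < arr.length; i++ ) {
    let current = arr[ i ]
    let j = i - 1
    while ( j >= 0 && arr[ j ] > current ) {
      arr[ j + 1 ] = arr[ j ]
      j--
    }
    arr[ j + 1 ] = current
  }
  return arr
}

mergeSort( [ 5, 2, 4, 6, 1, 3 ] )     // [ 1, 2, 3, 4, 5, 6 ]
insertionSort( [ 5, 2, 4, 6, 1, 3 ] ) // [ 1, 2, 3, 4, 5, 6 ]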
What is faster than a bubble sort?
Insertion, Shell, Heap, Merge, Quick and Quick3
What are array sorting methods by speed?
Roughly, from slowest to fastest on typical inputs: selection, then bubble, then insertion, then shell, then merge, then heap, then quick, then quick3
The SELECTED, BUBBLE can’t INSERT into the SHELL while MERGING the HEAP QUICKLY
Imagine you have an array of numbers. Write an algorithm to remove all numbers less than five from the array.
let removeSome = ( arr ) => {
  let newArr = []
  for ( let i = 0; i < arr.length; i++ ) {
    // keep only values that are not less than five
    if ( arr[ i ] >= 5 ) {
      newArr.push( arr[ i ] )
    }
  }
  return newArr
}
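An equivalent, more concise version using Array.prototype.filter (same behavior, shown only as an alternative):

let removeSomeFiltered = ( arr ) => arr.filter( ( n ) => n >= 5 )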
Imagine you have two arrays which have already been sorted. Write an algorithm to merge the two arrays into a single array, which should also be sorted. For example, if your input arrays were [1, 3, 6, 8, 11] and [2, 3, 5, 8, 9, 10], your output array should be [1, 2, 3, 3, 5, 6, 8, 8, 9, 10, 11]
let combineSort = ( arr1, arr2 ) => {
  let newArr = arr1.concat( arr2 )
  // numeric comparator, since the default sort compares values as strings
  newArr.sort( ( a, b ) => a - b )
  return newArr
}
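Because both inputs are already sorted, a two-pointer merge (the same merge step merge sort uses) produces the result without re-sorting; this alternative sketch is not part of the original card:

let mergeSorted = ( arr1, arr2 ) => {
  let result = []
  let i = 0, j = 0
  // repeatedly take the smaller of the two front elements
  while ( i < arr1.length && j < arr2.length ) {
    if ( arr1[ i ] <= arr2[ j ] ) result.push( arr1[ i++ ] )
    else result.push( arr2[ j++ ] )
  }
  // append whatever is left of the array that wasn't exhausted
  return result.concat( arr1.slice( i ) ).concat( arr2.slice( j ) )
}

mergeSorted( [ 1, 3, 6, 8, 11 ], [ 2, 3, 5, 8, 9, 10 ] )
// [ 1, 2, 3, 3, 5, 6, 8, 8, 9, 10, 11 ]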
Given an array of numbers, write an algorithm to find out the products of every number, except the one at that index. For example, if the input was [1, 3, 9, 4], the output should be [108, 36, 12, 27] (i.e. [3×9×4, 1×9×4, 1×3×4, 1×3×9])
let multiplyOthers = ( arr ) => {
  let resultArr = []
  for ( let index = 0; index < arr.length; index++ ) {
    // copy the array without the element at this index
    let newArr = arr.slice( 0, index ).concat( arr.slice( index + 1 ) )
    // multiply the remaining elements together
    let result = newArr.reduce( ( product, n ) => product * n, 1 )
    resultArr.push( result )
  }
  return resultArr
}
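The slice-and-multiply approach above recomputes a product for every index (O(n²)). A common O(n) alternative, sketched here as an illustration rather than as part of the original card, accumulates prefix and suffix products:

let multiplyOthersLinear = ( arr ) => {
  let result = new Array( arr.length ).fill( 1 )
  let prefix = 1
  for ( let i = 0; i < arr.length; i++ ) {
    result[ i ] = prefix   // product of everything to the left of i
    prefix *= arr[ i ]
  }
  let suffix = 1
  for ( let i = arr.length - 1; i >= 0; i-- ) {
    result[ i ] *= suffix  // times the product of everything to the right of i
    suffix *= arr[ i ]
  }
  return result
}

multiplyOthersLinear( [ 1, 3, 9, 4 ] ) // [ 108, 36, 12, 27 ]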
Write an algorithm to find the middle element of a linked list without using the .length property
let findMiddle = ( head ) => {
  // the slow pointer moves one node at a time
  let slow = head
  // the fast pointer moves two nodes at a time
  let fast = head
  while ( fast !== null && fast.next !== null ) {
    fast = fast.next.next
    slow = slow.next
  }
  // by the time the fast pointer reaches the end of the list, the slow one is at the middle
  return slow
}
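A quick usage check, assuming nodes shaped like { value, next } (illustrative, not from the card):

let list = {
  value: 1, next: { value: 2, next: { value: 3, next: { value: 4, next: { value: 5, next: null } } } }
}
findMiddle( list ).value // 3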
How might you determine the efficiency of an algorithm, and why is it important?
Efficiency is described with Big O notation, which expresses how an algorithm’s running time and memory use grow as the input grows. It matters because it lets you compare approaches, keep code efficient, and plan for growth / scaling up.
What types of complexities are described in Big O Notation?
- Time complexity (How long code will take to execute as data increases)
- Space complexity (How much space is needed and how efficiently it is used)
What is O( 1 )?
Constant Time Complexity
Examples:
- setting a variable once
- getting a random item from an array
- determining if an integer is even or odd
Basically a flat line on a graph: Easy to plan for.
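A few illustrative constant-time operations (the helper names are assumptions):

let firstItem = ( arr ) => arr[ 0 ]   // same cost no matter how long arr is
let isEven = ( n ) => n % 2 === 0     // one arithmetic check, regardless of n
let randomItem = ( arr ) => arr[ Math.floor( Math.random() * arr.length ) ]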
What is O( log n )?
Logarithmic Time Complexity
Examples:
- binary search
- counting how many elements of a sorted array are less than a value (via binary search)
Almost horizontal on a graph: means that scaling to large data sets won’t dramatically increase the running time.