Fundamentals Flashcards

1
Q

Memory 1

A
  • Consists of a bounded, ordered series of slots
  • Programs store values in back-to-back slots in binary format. Each slot holds 1 byte (8 bits) of capacity. The most significant byte is placed on the left
  • Integers are stored as fixed-width integers (e.g., 32 or 64 bits), so that fixed amount of memory is reserved regardless of the value
  • Back-to-back storage means that the memory reserved for a value must be contiguous; a value can’t be split across non-adjacent slots
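A minimal sketch of the fixed-width idea, using Python's standard `struct` module for illustration: a 32-bit integer always occupies exactly 4 contiguous bytes, with the most significant byte first, no matter how small the value is.

```python
import struct

# Pack the value 7 as a big-endian signed 32-bit integer (">i").
packed = struct.pack(">i", 7)

print(packed)       # b'\x00\x00\x00\x07' — most significant byte on the left
print(len(packed))  # 4 — that many bytes are reserved regardless of the value
```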
2
Q

Memory 2

A
  • Collections are also stored in back-to-back free memory slots
  • Strings are collections of characters. Each character is mapped to a number, so strings are stored as lists of numbers
  • A pointer stores the memory address of another memory slot
  • Given a memory address, computers can access the corresponding slot very quickly. It’s an inexpensive operation
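A small Python sketch of both points: characters map to numbers via their code points, and CPython's `id()` (used here purely as an illustration; it is CPython-specific) returns an object's memory address, showing that a "pointer" is just a number referring to a memory slot.

```python
# Each character maps to a number (its code point),
# so "abc" can be viewed as a list of numbers.
codes = [ord(c) for c in "abc"]
print(codes)  # [97, 98, 99]

# In CPython, id() returns the object's memory address —
# a pointer is simply this number stored somewhere else.
x = [1, 2, 3]
print(hex(id(x)))  # address varies per run
```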
3
Q

Complexity Analysis

A
  • Time complexity: the measure of how fast an algorithm runs as the size of the input increases
  • Space complexity: the measure of how much memory an algorithm uses as the size of the input increases
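A minimal sketch of analyzing one (hypothetical) function for both measures at once, in Python:

```python
def sum_list(nums):
    """Time:  O(N) — one pass over the input.
    Space: O(1) — a single accumulator, regardless of input size."""
    total = 0
    for n in nums:
        total += n
    return total
```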
4
Q

Big O Notation

A
  • Within the software industry, Big O refers to the worst-case complexity scenario, so Omega and Theta notations are usually not considered
  • In some cases, worst-case complexity might differ from average-case complexity

Asymptotic analysis:
- Study of a function’s behavior as the input size tends to infinity
- Elementary operations are dropped if they don’t depend on the input size, because they remain constant in time. For example, O(8) is equal to O(1)
- Only the dominant (fastest-growing) term is kept; smaller terms are dropped. For example, O(N^2 + N + 1) is equal to O(N^2)
- When two or more inputs have different sizes, each input must appear in the notation. For example, O(N! + log(M) + 2N + 3) is equal to O(N! + log(M))
- Operating on a constant fraction (a half, a third, …) of a data structure is the same as operating on all of it. For example, O(0.33N) is equal to O(N)
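The dominant-term rule can be seen numerically with a hypothetical cost model in Python: as N grows, N^2 + N + 1 becomes indistinguishable from N^2, which is why the smaller terms are dropped.

```python
def count_ops(n):
    # Hypothetical cost model: n^2 + n + 1 elementary operations.
    return n * n + n + 1

# The ratio to n^2 tends to 1 as n grows —
# O(N^2 + N + 1) simplifies to O(N^2).
for n in (10, 1_000, 100_000):
    print(n, count_ops(n) / n**2)
```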

5
Q

Time Complexity

A
  • O(1): constant. As the size of the input increases, the running time remains constant
  • O(log(N)): logarithmic
  • O(N): linear. As the size of the input increases, the running time increases linearly
  • O(N * log(N)): linearithmic (log-linear)
  • O(N^2), O(N^3): quadratic, cubic
  • O(2^N): exponential
  • O(N!): factorial
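A sketch of what a few of these classes look like in practice, as hypothetical Python functions (not the only algorithms with these complexities):

```python
def constant(nums):       # O(1): a single slot access
    return nums[0]

def linear(nums):         # O(N): visits every element once
    return sum(nums)

def quadratic(nums):      # O(N^2): a nested pass over all pairs
    return [(a, b) for a in nums for b in nums]

def logarithmic(n):       # O(log(N)): halves the problem each step
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps
```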
6
Q

Space Complexity

A
  • O(1): constant. As the size of the input increases, the algorithm’s memory usage remains constant
  • O(log(N)): logarithmic
  • O(N): linear. As the size of the input increases, the algorithm’s memory usage increases linearly
  • O(N * log(N)): linearithmic (log-linear)
  • O(N^2), O(N^3): quadratic, cubic
  • O(2^N): exponential
  • O(N!): factorial
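A minimal Python sketch contrasting O(1) and O(N) auxiliary space (hypothetical example functions; the input itself is not counted as extra space):

```python
def max_in_place(nums):
    # O(1) extra space: one variable, regardless of input size.
    best = nums[0]
    for n in nums[1:]:
        if n > best:
            best = n
    return best

def doubled(nums):
    # O(N) extra space: the output list grows with the input.
    return [n * 2 for n in nums]
```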
7
Q

Logarithm

A
  • log_b(x) = y, if and only if b^y = x
  • In computer science, ‘log’ is assumed to be base 2, so ‘log2(N) = y’ if and only if ‘2^y = N’
  • In the equation ‘2^? = N’, each time ‘N’ is doubled, ‘?’ increases by only 1. So ‘log(N)’ grows very slowly as ‘N’ increases
  • Logarithmic complexity is much better than linear complexity
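The doubling behavior can be checked directly with Python's standard `math.log2`: each doubling of N adds exactly 1 to log2(N).

```python
import math

# Doubling N repeatedly only increments log2(N) by 1 each time.
for n in (8, 16, 32, 1_048_576):
    print(n, math.log2(n))  # 8→3.0, 16→4.0, 32→5.0, 1_048_576→20.0
```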