1.4.1 - Data Types Flashcards
Units of Data
Bit: A single binary digit (0 or 1)
Byte: 8 bits
Kilobyte: 1,000 bytes
Megabyte: 1,000 kilobytes
Gigabyte: 1,000 megabytes
Terabyte: 1,000 gigabytes
Data Types
Integer: Whole number
Real/Float: Decimal number
Character: A single letter, number, or symbol
String: Multiple characters
Boolean: True or False
Standard Binary
128, 64, 32, 16, 8, 4, 2, 1
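A minimal Python sketch (not part of the original notes) of converting denary to binary using these place values:

```python
def to_binary(n):
    """Convert a denary number (0-255) to an 8-bit binary string."""
    bits = ""
    for value in [128, 64, 32, 16, 8, 4, 2, 1]:
        if n >= value:       # this place value fits into what's left
            bits += "1"
            n -= value
        else:
            bits += "0"
    return bits

print(to_binary(45))  # 00101101 (32 + 8 + 4 + 1)
```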
Negative Binary: Sign and Magnitude
- Most significant bit represents the sign
- 1 is negative, 0 is positive
- Makes calculations difficult, as standard binary addition no longer gives correct results
- Halves the range of numbers that can be stored, as the most significant bit doesn’t show value
Two’s Complement:
- Write the number in positive
- Flip each bit
- Add one
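The three steps above can be sketched in Python (a hypothetical helper, not part of the notes):

```python
def twos_complement(n, bits=8):
    """Negate a positive number n using two's complement."""
    positive = format(n, f"0{bits}b")  # write the number in positive
    flipped = "".join("1" if b == "0" else "0" for b in positive)  # flip each bit
    return format(int(flipped, 2) + 1, f"0{bits}b")  # add one

print(twos_complement(5))  # 11111011, which represents -5 in 8 bits
```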
Binary Addition
0 + 0 = 0
0 + 1 = 1
1 + 1 = 10
1 + 1 + 1 = 11
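You can check these addition rules against Python's own integer arithmetic:

```python
# 0110 (6) + 0011 (3): the rules above give 1001 (9)
a = 0b0110
b = 0b0011
print(format(a + b, "04b"))  # 1001
```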
Binary Subtraction
- Convert the second number into negative binary
- Add the numbers
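A short sketch of this method (assumed helper name, not from the notes), using two's complement and discarding the final carry:

```python
def binary_subtract(a, b, bits=8):
    """Subtract b from a by adding the two's complement of b."""
    neg_b = ((~b) + 1) & (2**bits - 1)   # convert the second number to negative binary
    total = (a + neg_b) & (2**bits - 1)  # add, throwing away any carry out of the top bit
    return format(total, f"0{bits}b")

print(binary_subtract(0b1010, 0b0011))  # 00000111, i.e. 10 - 3 = 7
```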
Hexadecimal
- Used because it’s easier for humans to read and remember than binary, but much quicker to convert to and from binary than denary is
- Digits 0-9 then A-F, representing the values 0-15 (one hex digit per four bits)
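Because each hex digit maps to exactly four bits, the conversion is a straight substitution. A quick Python check:

```python
value = 0b10111110  # 1011 -> B, 1110 -> E
print(format(value, "08b"), "->", format(value, "02X"))  # 10111110 -> BE
print(hex(value))  # 0xbe
```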
Floating Point Binary
8, 4, 2, 1, 0.5, 0.25, 0.125, 0.0625
Mantissa: The significant digits of the number
Exponent: How far the binary point has been shifted
The way this system works means that certain numbers are really hard to represent (such as 0.3, which has no exact finite binary form), hence computers often needing separate floating point units to work with these long binary numbers
The same Two’s Complement system can be used to make these fractional binary numbers negative
Normalised positive floating point binary numbers have a mantissa with exactly one zero before the first one (i.e. starting 0.1, such as 0.1101)
To add or subtract floating point binary numbers, the exponents must first be made the same
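The point about 0.3 being hard to represent shows up directly in Python, whose floats are binary floating point numbers:

```python
# 0.1, 0.2 and 0.3 all have no exact finite binary representation,
# so the stored values carry tiny rounding errors.
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False
```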
Bitwise Manipulation and Masks
Bitwise manipulation: Applying logical operators to binary
Mask: A binary pattern combined with the data using a bitwise operator, to set, clear, or test particular bits
Not: Flips each bit
And: Returns 1 if both are 1
Or: Returns 1 if at least one is a 1
Xor: Returns 1 if only one is a 1
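These four operators map directly onto Python's bitwise operators, shown here with a 4-bit value and mask:

```python
data = 0b1100
mask = 0b1010

print(format(data & mask, "04b"))     # 1000 - AND: 1 only where both are 1
print(format(data | mask, "04b"))     # 1110 - OR: 1 where at least one is 1
print(format(data ^ mask, "04b"))     # 0110 - XOR: 1 where exactly one is 1
print(format(~data & 0b1111, "04b"))  # 0011 - NOT: flips each bit (kept to 4 bits)
```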
Character Sets
- How a computer represents characters as binary digits
- Includes numbers and special characters
ASCII: 128 characters (codes 0-127) in 7 bit binary
- Small, so can only really be used for standard English characters
- But a smaller bit length means that the information takes up less storage space
- Uppercase letters start at 65, and lower case starts at 97
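Python's `ord` and `chr` expose these character codes directly:

```python
print(ord("A"))      # 65 - uppercase letters start at 65
print(ord("a"))      # 97 - lowercase letters start at 97
print(chr(65 + 32))  # a - adding 32 converts upper case to lower case
```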
Unicode: Uses up to 4 bytes per character
- So can represent thousands of different characters from all of the world’s languages that couldn’t fit into ASCII