Chapter 20 Digital data Flashcards
Why must data be converted to binary format?
So the data can be stored and understood by computers.
Convert 10011 to denary.
19
Convert 46 to binary.
101110
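The two conversions above can be checked with a short Python sketch; the helper names `binary_to_denary` and `denary_to_binary` are illustrative, not from the chapter:

```python
def binary_to_denary(bits):
    # int() with base 2 parses a string of binary digits
    return int(bits, 2)

def denary_to_binary(n):
    # format() with "b" renders the number in base 2, without a "0b" prefix
    return format(n, "b")

print(binary_to_denary("10011"))  # 19
print(denary_to_binary(46))       # 101110
```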
What is a bit?
A bit is a binary digit, the smallest unit of storage in a computer. A bit is either a 1 or a 0.
What is a nibble?
A nibble is 4 bits.
What is a byte?
A byte is 8 bits.
What is a kilobyte?
A kilobyte is 1024 bytes.
What is a megabyte?
A megabyte is 1024 kilobytes.
What is a gigabyte?
A gigabyte is 1024 megabytes.
What is a terabyte?
A terabyte is 1024 gigabytes.
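The units above each scale by a factor of 1024, which a few constants make explicit (this is a sketch using the binary convention the cards use, where 1 kilobyte = 1024 bytes):

```python
# Each unit is 1024 times the previous one.
KILOBYTE = 1024             # bytes
MEGABYTE = 1024 * KILOBYTE  # 1024**2 bytes
GIGABYTE = 1024 * MEGABYTE  # 1024**3 bytes
TERABYTE = 1024 * GIGABYTE  # 1024**4 bytes

print(MEGABYTE)  # 1048576 bytes in a megabyte
```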
What does ASCII stand for?
It stands for the American Standard Code for Information Interchange.
What was the original ASCII code table?
The original ASCII code table was 7-bit, meaning it used seven bits per character, giving 128 (2⁷) characters in the table. These characters include the letters A to Z and other common characters.
What was the second ASCII code table and what were its problems?
It was an 8-bit code table containing 256 (2⁸) characters, making use of the 8th bit in a byte. The characters in the 8-bit table include all those in the 7-bit table, plus regional characters and symbols.
The 8-bit ASCII table was still limited: with only 256 possible characters, languages with many more characters could not be represented fully.
Also, codes 128 to 255 were used differently in different regions, leading to incompatibility between character sets.
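Python's built-in `ord()` and `chr()` functions show the character-to-number mapping that ASCII defines (a minimal sketch; 7-bit ASCII covers codes 0 to 127):

```python
# ord() gives a character's code number; chr() reverses the mapping.
print(ord("A"))         # 65
print(chr(90))          # Z
print(ord("A") < 128)   # True: "A" fits in the 7-bit ASCII range
```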
What was designed to solve the problems of ASCII code, and how does it work?
Unicode is a character-encoding standard, using up to 32 bits per character, designed to overcome the limitations of ASCII code.
Unicode provides a unique number for every character; the current Unicode set contains over 100,000 characters.
All ASCII characters are part of Unicode and they have the same numbers as in the ASCII character set.
These unique numbers are called ‘code points’.
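In Python, `ord()` returns a character's Unicode code point, which illustrates both points above: ASCII characters keep their ASCII numbers, while other characters get code points beyond 127 (the euro sign is just an example character):

```python
print(ord("A"))        # 65: same number as in ASCII
print(ord("€"))        # 8364: a code point outside the ASCII range
print(hex(ord("€")))   # 0x20ac, usually written as U+20AC
```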
What are 3 encoding methods?
UTF-32: a fixed-length encoding that uses 32 bits for every character, regardless of which character it is. This is inefficient compared with ASCII, which represents a character in 1 byte.
UTF-16: a variable-length encoding that uses a minimum of one 2-byte (16-bit) unit per character.
UTF-8: a variable-length encoding that uses 1 byte for the most common characters; other characters are encoded with 2 or more bytes. ASCII character codes are unchanged, so any ASCII text is also valid UTF-8, which makes this encoding backward compatible with ASCII.
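The differing costs of the three encodings can be seen by encoding a few sample characters and counting the bytes (a sketch; the little-endian codec names avoid byte-order marks in the output):

```python
# Compare how many bytes each encoding needs per character.
for ch in ("A", "é", "€"):
    print(ch,
          len(ch.encode("utf-8")),      # 1-4 bytes, 1 for ASCII characters
          len(ch.encode("utf-16-le")),  # 2 or 4 bytes
          len(ch.encode("utf-32-le")))  # always 4 bytes
```

For example, "A" needs 1 byte in UTF-8 but 4 in UTF-32, which is why UTF-8 dominates for mostly-ASCII text.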