3.5.5.2 ASCII and Unicode: Flashcards
What does ASCII stand for?
American Standard Code for Information Interchange.
What are Unicode and ASCII examples of?
Information coding systems (ICS).
How many bits does ASCII make use of and what does this mean?
ASCII makes use of 7 bits to represent 128 (= 2^7) different characters, including A to Z, a to z, 0 to 9 and various symbols.
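A minimal sketch in Python (an illustrative addition, not part of the flashcard set; the characters chosen are arbitrary examples) showing that every ASCII code fits in 7 bits:

    for ch in ["A", "Z", "0", "9"]:
        code = ord(ch)                        # e.g. ord("A") == 65
        print(ch, code, format(code, "07b"))  # each code fits in 7 binary digits
        assert code < 128                     # ASCII defines 128 codes (0 to 127)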
Why was Unicode introduced?
To allow computers to represent a wide variety of alphabets. This became necessary as the internet was used more widely throughout the world and an ICS was needed that could represent not just the Latin alphabet but also Arabic, Cyrillic, Greek and Hebrew.
How many bits does the standard Unicode use?
A variable number, depending on the encoding: UTF-8, the most widely used Unicode encoding, uses between 8 and 32 bits (1 to 4 bytes) per character. This allows Unicode to represent a much wider range of characters than ASCII.
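To illustrate the variable-width encoding, a short Python sketch (assuming the UTF-8 encoding; the characters are arbitrary examples) showing how many bytes each character occupies:

    for ch in ["A", "é", "€", "😀"]:
        encoded = ch.encode("utf-8")           # UTF-8 is a variable-width encoding
        print(ch, len(encoded), "byte(s):", encoded.hex())
    # "A" takes 1 byte, "é" 2 bytes, "€" 3 bytes and "😀" 4 bytes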
Why can Unicode represent more characters than ASCII?
Unicode uses more bits per character than ASCII, allowing for a larger range of characters to be represented.
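A brief Python sketch (illustrative only; the non-Latin characters are arbitrary examples) showing code points that lie outside the 7-bit ASCII range but are valid in Unicode:

    for ch in ["A", "Ω", "я", "中"]:
        code = ord(ch)                         # the character's Unicode code point
        print(ch, code, "ASCII" if code < 128 else "Unicode only")
    # "Ω" (937), "я" (1103) and "中" (20013) all exceed the ASCII limit of 127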