Finals | Statistical Analysis and Information Entropy Flashcards
The actual amount of information contained in an image, which can be computed
INFORMATION ENTROPY
Describes how much randomness (or uncertainty) there is in a signal or an image; in other words, how much information is provided by the signal or image.
INFORMATION ENTROPY
ENTROPY is also known as
Shannon’s Entropy (1948) or Information Entropy
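A minimal sketch of how this entropy can be computed, assuming an 8-bit grayscale image stored as a NumPy array (the function name `shannon_entropy` and the random test image are illustrative, not from the source):

```python
import numpy as np

def shannon_entropy(image):
    """H = -sum(p * log2(p)) over the image's gray-level histogram."""
    counts = np.bincount(image.ravel(), minlength=256)
    p = counts[counts > 0] / counts.sum()  # drop empty bins: p*log2(p) -> 0
    return -np.sum(p * np.log2(p))

image = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
print(f"{shannon_entropy(image):.3f} bits/pixel")  # near 8.0 for uniform noise
```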
Use of the same number of bits to represent all pixels.
Fixed-length coding
Fixed-length coding
In this way, when a program reads an image file it knows that the first 12 bits represent the (1) of the first pixel, the next 12 bits represent the second pixel, and so on, with no need for any (2) to represent the (3) of each pixel’s (4).
- value
- special symbol
- end
- data
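A sketch of fixed-length coding under the 12-bit assumption above (the helper names are hypothetical): every pixel occupies exactly 12 bits, so the decoder can split the stream by position alone, with no special symbol to mark the end of each value.

```python
BITS = 12  # the same width for every pixel value (0..4095)

def encode_fixed(pixels):
    return "".join(format(v, f"0{BITS}b") for v in pixels)

def decode_fixed(stream):
    # Each successive 12-bit slice is the next pixel's value.
    return [int(stream[i:i + BITS], 2) for i in range(0, len(stream), BITS)]

pixels = [0, 2047, 4095]
stream = encode_fixed(pixels)      # 36 bits: 12 per pixel, no delimiters
assert decode_fixed(stream) == pixels
```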
Uses a variable number of bits to represent pixel values.
Variable-length coding
Variable-length coding
Provides short code words for (1) characters and long code words for (2) characters.
- frequent
- infrequent
Variable-length coding
Variability is primarily dependent on the (1) of the data.
- frequency of occurrence
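A sketch of the variable-length idea with a hand-picked prefix-free code table (the message and the code words are illustrative only): frequent characters get short code words, infrequent characters get long ones.

```python
message = "aaaaaabbbccd"  # 'a' occurs most often, 'd' least often

# Prefix-free code: no code word is a prefix of another,
# so the bit stream can be decoded unambiguously.
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

encoded = "".join(code[ch] for ch in message)
print(len(encoded) / len(message))  # 1.75 bits/symbol vs 2 for fixed-length
```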
The use of more bits than are needed to convey a given amount of information
Coding redundancy
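Coding redundancy can be illustrated by comparing the fixed-length cost against the entropy lower bound (the probabilities below are hypothetical):

```python
import math

p = [0.5, 0.25, 0.125, 0.125]  # hypothetical gray-level probabilities

entropy = -sum(pi * math.log2(pi) for pi in p)  # 1.75 bits/symbol needed
fixed = math.ceil(math.log2(len(p)))            # 2 bits/symbol actually used
print(fixed - entropy)                          # 0.25 redundant bits/symbol
```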
Redundant code has a number of consequences, including:
- bloated source code
- reduced reliability
- reduced maintainability
(Mnemonic: MaRe Bloated)
Two major methods of variable-length encoding to reduce coding redundancy
A. Huffman coding (Huffman, 1952)
B. Arithmetic coding (Abramson, 1963)
ENCODED vs DECODED
ENCODED: Compressed image
DECODED: Compressed image that has been RECONSTRUCTED
An entropy encoding algorithm used for lossless data compression.
HUFFMAN CODING
Guarantees the uniqueness of the decoding process so that a set of codes can only represent one set of image values.
HUFFMAN CODING
Replaces each pixel value of an image with a special code on a one-to-one basis.
HUFFMAN CODING
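A minimal Huffman coding sketch (a standard textbook construction using Python's heapq, not the exact procedure from the source): it builds a prefix-free code table from symbol frequencies, so each pixel value maps to exactly one code word and decoding is unique.

```python
import heapq
from collections import Counter

def huffman_code(data):
    """Build a prefix-free code: frequent symbols get shorter code words."""
    # Heap of (frequency, tiebreaker, tree); a tree is a symbol or a pair.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(Counter(data).items())]
    heapq.heapify(heap)
    n = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two least frequent subtrees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, n, (left, right)))
        n += 1
    code = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):         # internal node: branch on 0 / 1
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                               # leaf: an actual symbol
            code[tree] = prefix or "0"
    walk(heap[0][2], "")
    return code

pixels = [0, 0, 0, 0, 1, 1, 2, 3]            # hypothetical pixel values
code = huffman_code(pixels)                  # e.g. {0: '0', 1: '10', ...}
encoded = "".join(code[v] for v in pixels)   # one-to-one, uniquely decodable
```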