Finals | Statistical Analysis and Information Entropy Flashcards
The actual amount of information contained in an image, which can be computed
INFORMATION ENTROPY
Describes how much randomness (or uncertainty) there is in a signal or an image; in other words, how much information is provided by the signal or image.
INFORMATION ENTROPY
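The definition above can be sketched in code. This is a minimal, pure-stdlib illustration (the function name `entropy` is mine, not from the cards) of Shannon's formula H = -Σ p_i · log2(p_i) applied to a list of pixel values:

```python
# Sketch: Shannon entropy (bits per pixel) from the gray-level histogram.
# H = -sum(p_i * log2(p_i)) over the probability p_i of each gray level.
from collections import Counter
from math import log2

def entropy(pixels):
    n = len(pixels)
    counts = Counter(pixels)          # gray-level histogram
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Two equiprobable gray levels carry exactly 1 bit/pixel:
h = entropy([0, 255, 255, 0])
print(h)  # 1.0
```

A perfectly uniform histogram maximizes the entropy; a constant image has entropy 0 (no uncertainty, no information).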
ENTROPY is aka
Shannon’s Entropy (1948) or Information Entropy
Use of the same number of bits to represent all pixels.
Fixed-length coding
Fixed-length coding
In this way, when a program reads an image file, it knows that the first 12 bits represent the (1) of the first pixel, the next 12 bits represent the second pixel, and so on, with no need for any (2) to represent the (3) of each pixel's (4).
- value
- special symbol
- end
- data
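The fixed-length idea above can be sketched as follows; the code-word width is simply ceil(log2(number of levels)), the same for every pixel (the helper name `fixed_code` is illustrative, not from the cards):

```python
# Sketch: fixed-length coding assigns the SAME number of bits to every
# pixel value, namely ceil(log2(number_of_levels)).
from math import ceil, log2

def fixed_code(levels):
    """Map each gray level to a same-width binary code word."""
    width = ceil(log2(len(levels)))
    return {v: format(i, f"0{width}b") for i, v in enumerate(levels)}

codes = fixed_code(range(16))        # 16 levels -> 4 bits per pixel
print(codes[0], codes[15])           # 0000 1111
print(ceil(log2(4096)))              # 4096 levels -> the 12 bits in the card
```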
Uses a variable number of bits to represent pixel values.
Variable-length coding
Variable-length coding
Provides short code-words for (1) characters and long code words for (2) characters.
- frequent
- infrequent
Variable-length coding
Variability is primarily dependent on (1) of data.
- frequency of occurrence
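The payoff of variable-length coding can be shown with a small worked example; the probabilities and the prefix code below are illustrative choices of mine, not a specific standard:

```python
# Sketch: with variable-length codes, the average bits/symbol is
# sum(p_i * len_i). Frequent symbols get the short code words.
probs = {"a": 0.6, "b": 0.3, "c": 0.1}
codes = {"a": "0", "b": "10", "c": "11"}   # short code for frequent 'a'

avg = sum(p * len(codes[s]) for s, p in probs.items())
print(avg)   # 1.4 bits/symbol, vs 2 bits/symbol with fixed-length coding
```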
The use of more bits than are needed to convey a given amount of information
Coding redundancy
Redundant code has a number of consequences including
bloated source code
reduced reliability
reduced maintainability
MaRe Bloated
Two major methods of variable-length encoding to reduce coding redundancy
A. Huffman coding (Huffman, 1952)
B. Arithmetic coding (Abramson, 1963)
ENCODED vs DECODED
ENCODED: Compressed image
DECODED: Compressed image that has been RECONSTRUCTED
An entropy encoding algorithm used for lossless data compression.
HUFFMAN CODING
Guarantees the uniqueness of the decoding process so that a set of codes can only represent one set of image values.
HUFFMAN CODING
This replaces each pixel value of an image with a special code on a one-to-one basis.
HUFFMAN CODING
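A minimal Huffman-coding sketch, using the standard-library heap to merge the two least-frequent entries at each step (the function name and heap layout are my own choices). Because codes are read off the merge tree, the result is a prefix code, which is what guarantees unique decoding:

```python
# Sketch: Huffman coding with heapq. Repeatedly merge the two lowest-
# frequency entries; each merge prepends a '0' or '1', yielding a prefix
# code, so a coded stream can only decode to one set of image values.
import heapq
from collections import Counter

def huffman_codes(data):
    freq = Counter(data)
    # Heap entries: (frequency, tie-breaker, {symbol: code_so_far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tick, merged))
        tick += 1
    return heap[0][2]

codes = huffman_codes("aaaabbc")
print(codes)   # the most frequent symbol 'a' gets the shortest code word
```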
This replaces the entire sequence of pixel values of an image with one code.
ARITHMETIC CODING
The main idea behind this coding is to assign each symbol an interval.
ARITHMETIC CODING
It consists of only a few arithmetic operations, so its complexity is low. In terms of complexity, it is asymptotically better than the other coding.
ARITHMETIC CODING
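The interval idea above can be sketched as follows; this shows only the interval-narrowing step (no bit output), and the function name and example probabilities are illustrative:

```python
# Sketch of the arithmetic-coding idea: each symbol narrows the current
# interval in proportion to its probability, so the whole message ends
# up as ONE sub-interval of [0, 1) -- one code for the entire message.
def encode_interval(message, probs):
    # Cumulative start of each symbol's slice of [0, 1)
    cum, start = {}, 0.0
    for s, p in probs.items():
        cum[s] = start
        start += p
    low, width = 0.0, 1.0
    for s in message:
        low += width * cum[s]      # move into the symbol's slice
        width *= probs[s]          # shrink by the symbol's probability
    return low, low + width       # any number in [low, high) encodes it

low, high = encode_interval("ab", {"a": 0.6, "b": 0.4})
print(low, high)
```

This is why there is no one-to-one correspondence between source symbols and code words: the single final interval stands for the whole message.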
ARITHMETIC vs HUFFMAN (statistical method?)
ARITHMETIC: Not a statistical method
HUFFMAN: Is a statistical method
ARITHMETIC vs HUFFMAN (Result?)
ARITHMETIC: Yields an optimum result
HUFFMAN: Does not yield an optimum result
ARITHMETIC vs HUFFMAN (1 to 1 correspondence)
ARITHMETIC: No one-to-one correspondence between source symbol and code word
HUFFMAN: There is one-to-one correspondence between source symbol and code word
ARITHMETIC vs HUFFMAN (example?)
ARITHMETIC: If a, b, c are messages, then only one unique code is assigned to the entire message
HUFFMAN: If a, b, c are messages, then separate code words are assigned to each symbol
Fourier transform pair
The two functions f(x, y) and F(u, v)
Enable the transformation of a two-dimensional image from the spatial domain to the frequency domain, and vice versa.
The Fourier and the inverse transform
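The transform pair can be demonstrated with NumPy (assumed available): `fft2` takes the spatial-domain image f(x, y) to the frequency domain F(u, v), and `ifft2` brings it back, a lossless round trip up to floating-point error:

```python
# Sketch: the 2-D Fourier transform pair. fft2 maps f(x, y) to F(u, v);
# ifft2 maps F(u, v) back to f(x, y).
import numpy as np

f = np.arange(16, dtype=float).reshape(4, 4)   # toy "image" f(x, y)
F = np.fft.fft2(f)                             # forward transform F(u, v)
f_back = np.fft.ifft2(F).real                  # inverse transform

print(np.allclose(f, f_back))  # True
```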