Entropy Flashcards

1
Q

Prove that the entropy of a discrete random variable X with alphabet cardinality M must satisfy the constraints 0 ≤ H(X) ≤ log M

A
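A sketch of one standard argument, assuming X takes values x_1, …, x_M with probabilities p_1, …, p_M:

Lower bound: every term −p_i log p_i is ≥ 0 because 0 ≤ p_i ≤ 1, so H(X) = −Σ_i p_i log p_i ≥ 0, with equality iff some p_i equals 1 (a deterministic X).

Upper bound: H(X) − log M = Σ_i p_i log(1/(M p_i)) ≤ log(Σ_i p_i · 1/(M p_i)) = log 1 = 0 by Jensen's inequality (concavity of the logarithm), with equality iff p_i = 1/M for all i, i.e. the uniform distribution.
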
2
Q

Definitions of code types

A
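As a reminder, the standard hierarchy (stated in the usual textbook terms):

- Non-singular: distinct source symbols get distinct codewords.
- Uniquely decodable: every finite string of codewords can be parsed back into source symbols in only one way (the extension of the code is non-singular).
- Instantaneous (prefix-free): no codeword is a prefix of another, so each codeword can be decoded as soon as its last symbol arrives.

Each class is contained in the previous one: instantaneous ⊆ uniquely decodable ⊆ non-singular.

A minimal check of the prefix condition (illustrative sketch; the function name is mine):

def is_prefix_free(codewords):
    # Instantaneous (prefix) code: no codeword is a prefix of another.
    for a in codewords:
        for b in codewords:
            if a != b and b.startswith(a):
                return False
    return True

assert is_prefix_free(["0", "10", "110", "111"])     # prefix-free
assert not is_prefix_free(["0", "01"])               # "0" is a prefix of "01"
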
3
Q

Theorem (Kraft-McMillan)

A
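The usual statement: the codeword lengths l_1, …, l_M of any uniquely decodable D-ary code satisfy Σ_i D^(−l_i) ≤ 1 (Kraft inequality); conversely, for any lengths satisfying this inequality there exists an instantaneous (prefix) code with exactly those lengths. So nothing is lost, in terms of achievable lengths, by restricting attention to prefix codes.

A quick numeric check in the binary case (the lengths are my own example):

lengths = [1, 2, 3, 3]                       # e.g. codewords 0, 10, 110, 111
print(sum(2 ** -l for l in lengths))         # 1.0 <= 1, so a prefix code with these lengths exists
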
4
Q

Shannon’s First Theorem

A
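The usual statement for a discrete memoryless source with entropy H(X): the minimum expected codeword length per source symbol of a uniquely decodable binary code satisfies H(X) ≤ L* < H(X) + 1, and by encoding blocks of n symbols at a time this tightens to H(X) ≤ L*_n / n < H(X) + 1/n. Letting n → ∞, the entropy H(X) is exactly the smallest achievable rate for lossless source coding.
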
5
Q

Shannon coding comes within 1 bit of the entropy

A
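Shannon's construction assigns length l_i = ⌈log2(1/p_i)⌉ to the symbol of probability p_i. These lengths satisfy the Kraft inequality (Σ_i 2^(−l_i) ≤ Σ_i p_i = 1), so a prefix code with them exists, and its expected length obeys L = Σ_i p_i l_i < Σ_i p_i (log2(1/p_i) + 1) = H(X) + 1. Hence Shannon coding stays within 1 bit of the entropy per symbol.

A small numeric illustration (the probabilities are my own example):

import math

p = [0.5, 0.25, 0.125, 0.125]
lengths = [math.ceil(math.log2(1 / pi)) for pi in p]
H = -sum(pi * math.log2(pi) for pi in p)
L = sum(pi * li for pi, li in zip(p, lengths))
print(lengths, H, L)    # [1, 2, 3, 3], 1.75, 1.75 (dyadic probabilities, so L = H here)
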
6
Q

Prove that Huffman codes are optimal instantaneous binary codes

A
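Sketch of the standard induction on the alphabet size M. (i) Some optimal prefix code assigns the two least probable symbols the two longest codewords, equal in length and differing only in the last bit (otherwise swapping or trimming codewords would lower the expected length). (ii) Merging those two symbols into one of probability p_{M-1} + p_M gives a reduced source whose code C' relates to the original code C by L(C) = L(C') + p_{M-1} + p_M. (iii) By the induction hypothesis the Huffman code is optimal for the reduced source, and since the added term does not depend on the code, the expanded Huffman code is optimal for the original source.

A compact sketch of the construction itself (illustrative only; names are mine):

import heapq

def huffman_lengths(probs):
    # Repeatedly merge the two least probable nodes; each symbol's codeword
    # length equals the number of merges it takes part in.
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

print(huffman_lengths([0.4, 0.3, 0.2, 0.1]))    # [1, 2, 3, 3]
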
7
Q

If the reduced code C’ is optimal, then the code C built from it is optimal (step in the Huffman optimality proof)

A
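In the usual notation of the Huffman optimality proof (my reading of the prompt): C’ is the code for the reduced source obtained by merging the two least probable symbols, and C is obtained from C’ by appending 0 and 1 to the merged codeword. Then L(C) = L(C’) + p_{M-1} + p_M, and the extra term depends only on the source probabilities, not on the code. So any code that minimizes L(C’) also minimizes L(C); optimality of C’ implies optimality of C.
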
8
Q

Run-length coding

A
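Run-length coding replaces each run of identical symbols by a (symbol, run length) pair; it helps when long runs are likely, e.g. sparse binary data or fax scan lines, and the run lengths can themselves be entropy coded. A minimal sketch (the output format is my own choice):

from itertools import groupby

def rle_encode(s):
    # One (symbol, count) pair per maximal run of identical symbols.
    return [(ch, len(list(g))) for ch, g in groupby(s)]

def rle_decode(pairs):
    return "".join(ch * n for ch, n in pairs)

pairs = rle_encode("0000011100001")
print(pairs)                                  # [('0', 5), ('1', 3), ('0', 4), ('1', 1)]
assert rle_decode(pairs) == "0000011100001"
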
9
Q

Asymptotic Equipartition Property

A
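The usual statement for an i.i.d. source X_1, X_2, … with entropy H(X): by the weak law of large numbers,

−(1/n) log2 p(X_1, …, X_n) = −(1/n) Σ_i log2 p(X_i) → H(X) in probability as n → ∞.

Informally, with high probability a long source sequence has probability close to 2^(−nH(X)), which is what justifies describing it with about nH(X) bits.
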
10
Q

Typical set

A
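The usual definition: for ε > 0, the typical set A_ε^(n) contains the sequences (x_1, …, x_n) whose probability satisfies 2^(−n(H(X)+ε)) ≤ p(x_1, …, x_n) ≤ 2^(−n(H(X)−ε)). For n large enough: Pr{A_ε^(n)} > 1 − ε, and (1 − ε) 2^(n(H(X)−ε)) ≤ |A_ε^(n)| ≤ 2^(n(H(X)+ε)), so roughly n(H(X)+ε) bits suffice to index the sequences that carry almost all the probability.
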
11
Q

Arithmetic coding

A
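The idea: the whole message is mapped to a sub-interval of [0, 1), obtained by repeatedly splitting the current interval in proportion to the symbol probabilities and keeping the part belonging to the next symbol; any number in the final interval (about −log2(interval width) bits) identifies the message. A toy floating-point sketch (real coders use integer arithmetic with renormalisation; the probabilities are my own example):

def narrow(low, high, probs, symbol):
    # Keep the sub-interval of [low, high) assigned to the given symbol.
    span, cum = high - low, 0.0
    for s, p in probs.items():
        if s == symbol:
            return low + cum * span, low + (cum + p) * span
        cum += p
    raise KeyError(symbol)

probs = {"a": 0.5, "b": 0.3, "c": 0.2}
low, high = 0.0, 1.0
for sym in "abac":
    low, high = narrow(low, high, probs, sym)
print(low, high)    # any number in [low, high) encodes "abac" (given its length)
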
12
Q

Scalar Quantization

A
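In outline: a scalar quantizer maps each input sample x independently to one of L reconstruction levels y_1, …, y_L; decision thresholds t_0 < t_1 < … < t_L partition the input range, and Q(x) = y_i whenever t_{i-1} ≤ x < t_i. Performance is usually measured by the mean squared error D = E[(X − Q(X))²] = Σ_i ∫_{t_{i-1}}^{t_i} (x − y_i)² f_X(x) dx, traded off against the rate of log2 L bits per sample.
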
13
Q

Uniform Quantizer

A
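All cells have the same width Δ and each reconstruction level sits at the midpoint of its cell; when the error is (approximately) uniform over a cell, its mean squared value is Δ²/12. A small numerical check (mid-rise convention; the function name is mine):

import numpy as np

def uniform_quantize(x, delta):
    # Map x to the midpoint of its cell of width delta (mid-rise convention).
    return delta * (np.floor(x / delta) + 0.5)

x = np.random.uniform(-1, 1, 100_000)
err = x - uniform_quantize(x, 0.1)
print(err.var(), 0.1 ** 2 / 12)     # both close to 8.3e-4
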
14
Q

Optimal Uniform Quantizer

A
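For a source with unbounded support (e.g. Gaussian) and a fixed number of levels L, the step size Δ trades granular noise (which grows with Δ) against overload noise from samples falling beyond the outermost cells (which grows as Δ shrinks); the optimal uniform quantizer picks the Δ that minimizes the total MSE, usually found numerically. A Monte Carlo sketch of that search (parameters are my own example):

import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)            # unit-variance Gaussian source
L = 8                                       # number of quantizer levels

def mse(delta):
    # Mid-rise uniform quantizer with L levels, clamping overloaded samples.
    idx = np.clip(np.floor(x / delta), -L // 2, L // 2 - 1)
    return np.mean((x - delta * (idx + 0.5)) ** 2)

deltas = np.linspace(0.2, 1.5, 131)
print(deltas[np.argmin([mse(d) for d in deltas])])   # about 0.59; classical tables give 0.586 for L = 8
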
15
Q

Lloyd-Max Quantizer

A
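The two necessary conditions for a minimum-MSE scalar quantizer: each reconstruction level is the centroid (conditional mean) of its cell, y_i = E[X | t_{i-1} ≤ X < t_i], and each interior threshold is the midpoint of the adjacent levels, t_i = (y_i + y_{i+1}) / 2. The Lloyd-Max algorithm alternates these two updates until the MSE stops decreasing; it is essentially one-dimensional k-means. An empirical sketch on samples (function name and initialisation are mine):

import numpy as np

def lloyd_max(x, L, iters=50):
    # Alternate the two optimality conditions on empirical data.
    levels = np.quantile(x, (np.arange(L) + 0.5) / L)       # spread out initial levels
    for _ in range(iters):
        thresholds = (levels[:-1] + levels[1:]) / 2         # midpoints of adjacent levels
        cells = np.digitize(x, thresholds)                  # assign each sample to a cell
        levels = np.array([x[cells == i].mean() for i in range(L)])   # cell centroids
    return thresholds, levels

rng = np.random.default_rng(1)
t, y = lloyd_max(rng.standard_normal(100_000), 4)
print(y)    # close to the classical 4-level Gaussian levels, about ±0.453 and ±1.51
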