Lecture 2: Shannon Codes Flashcards

1
Q

Easy way to choose code lengths to meet Kraft inequality?

A

If every probability is a negative integer power of two, i.e. pk = 2^(-lk), give symbol k a codeword of length lk = log2(1/pk). For example, if symbol 1 occurs with probability 2^(-2) = 1/4, give it a codeword of length 2.

The Kraft sum Σ 2^(-lk) then equals the sum of the probabilities, so the code is complete (Kraft sum = 1) exactly when the probabilities add up to 1.
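A minimal check in Python (the lengths are an illustrative assumption: four symbols, each with probability 1/4):

    lengths = [2, 2, 2, 2]                       # pk = 1/4 -> lk = 2
    kraft_sum = sum(2.0 ** -l for l in lengths)  # Kraft sum: sum of 2^(-lk)
    print(kraft_sum)                             # 1.0 -> a complete prefix code exists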

2
Q

How do you deal with probabilities that are not negative integer powers of two (2^(-n))? What does this mean for compression?

A

Ceiling round: length lk = Ceiling(log2(1/pk)).

Compression is easiest (most efficient) when the probabilities are 1/2, 1/4, etc., so that log2(1/pk) is an integer.

If we don't have nice probabilities, we give up some compression: ceiling rounding the per-symbol lengths can cost up to one extra bit per symbol.
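A quick sketch in Python (the distribution is an illustrative assumption):

    import math

    p = [0.4, 0.3, 0.2, 0.1]                              # not powers of 1/2
    lengths = [math.ceil(math.log2(1 / pk)) for pk in p]  # ceiling round
    print(lengths)                                        # [2, 2, 3, 4]
    print(sum(2.0 ** -l for l in lengths))                # 0.6875 <= 1, Kraft holds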

3
Q

What is a Shannon code?

A

A prefix code in which each codeword length is lk = Ceiling(log2(1/pk)): the ceiling of log2 of one over the symbol's probability (its Shannon information content).
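One standard construction (a sketch, not the only way to assign codewords): sort the symbols by decreasing probability and take the first lk bits of the binary expansion of the cumulative probability.

    import math

    def shannon_code(p):
        # Sort probabilities in decreasing order; this construction
        # relies on the ordering for the prefix property.
        p = sorted(p, reverse=True)
        codewords, cum = [], 0.0
        for pk in p:
            l = math.ceil(math.log2(1 / pk))
            # Codeword = first l bits of the binary expansion of the
            # cumulative probability of all more-likely symbols.
            frac, bits = cum, ""
            for _ in range(l):
                frac *= 2
                bits += str(int(frac))
                frac -= int(frac)
            codewords.append(bits)
            cum += pk
        return codewords

    print(shannon_code([0.5, 0.25, 0.125, 0.125]))  # ['0', '10', '110', '111']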

4
Q

What are the compression bounds for a Shannon code?

A

H(X) <= L(C,X) <= H(X) + 1 (the expected length is within 1 bit of the Shannon entropy)

5
Q

What is the expected length of a code?

A

L(C,X) = Sum(pk * lk), the probability-weighted average codeword length. For a Shannon code, lk = Ceiling(log2(1/pk)), so L(C,X) = Sum(pk * Ceiling(log2(1/pk))).
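A numerical sketch in Python (the distribution is an illustrative assumption):

    import math

    p = [0.4, 0.3, 0.2, 0.1]
    L = sum(pk * math.ceil(math.log2(1 / pk)) for pk in p)  # expected length
    H = -sum(pk * math.log2(pk) for pk in p)                # Shannon entropy
    print(H, L)  # H ~= 1.846, L = 2.4 -> H(X) <= L <= H(X) + 1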

6
Q

How do you prove the compression bounds of a Shannon code?

A

By definition of the ceiling, log2(1/pk) <= lk = Ceiling(log2(1/pk)) < log2(1/pk) + 1.

Multiply through by pk and sum over k:

Sum(pk * log2(1/pk)) <= Sum(pk * lk) < Sum(pk * log2(1/pk)) + Sum(pk)

i.e. H(X) <= L(C,X) < H(X) + 1.

7
Q

Is the Shannon code optimal?

A

No. If the symbols have non-integer Shannon information contents (log2(1/pk)), the ceiling rounding wastes up to one bit per symbol and compression optimality is lost.
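A small counterexample in Python (an illustrative two-symbol source):

    import math

    p = [0.9, 0.1]
    L = sum(pk * math.ceil(math.log2(1 / pk)) for pk in p)
    print(L)  # 1.3 bits/symbol, yet the trivial code {0, 1} achieves
              # 1.0 bit/symbol, so the Shannon code is not optimal here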

8
Q

What is the source coding theorem?

A

For an i.i.d. source X with entropy H(X): N outcomes can be compressed into just over N*H(X) bits with negligible risk of information loss as N grows; conversely, compressing into fewer than N*H(X) bits makes information loss virtually certain. For symbol codes this gives the bounds above: no uniquely decodable code has expected length below H(X), and a prefix code (e.g. the Shannon code) achieves expected length below H(X) + 1.

9
Q

How can we reduce the upper bound compression limit on Shannon codes?

A

By encoding blocks of input symbols, we amortise the extra bit (beyond the Shannon entropy) across many symbols, pushing the rate towards the limit of H(X).

For blocks of n input symbols: H(X) <= L(C, X^n) <= H(X) + 1/n, where L(C, X^n) is the expected number of bits per symbol.
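A sketch in Python (the skewed two-symbol source is an illustrative assumption):

    import math
    from itertools import product

    p = {"a": 0.9, "b": 0.1}
    H = -sum(pk * math.log2(pk) for pk in p.values())

    for n in (1, 2, 4):
        # Shannon-code every block of n symbols under the product
        # distribution, then report expected bits per input symbol.
        L = sum(
            math.prod(p[s] for s in block)
            * math.ceil(math.log2(1 / math.prod(p[s] for s in block)))
            for block in product(p, repeat=n)
        ) / n
        print(n, round(L, 3), "vs H(X) =", round(H, 3))
    # bits/symbol fall from 1.3 (n=1) towards H(X) ~= 0.469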

10
Q

How do you quantify the cost if the assumed probability distribution was wrong (was a guess)?

A

Compute L(C,X) under both the assumed distribution q and the correct distribution p; the difference is the relative entropy D(p||q). Equivalently, compute it directly: D(p||q) = Sum(p(x) * log2(p(x)/q(x))). Coding for q when the true distribution is p costs H(X) + D(p||q) bits per symbol on average (before ceiling rounding).
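A direct computation in Python (both distributions are illustrative assumptions):

    import math

    p = [0.5, 0.25, 0.25]  # true distribution
    q = [0.25, 0.25, 0.5]  # assumed (wrong) distribution
    D = sum(pk * math.log2(pk / qk) for pk, qk in zip(p, q))
    print(D)  # 0.25 extra bits per symbol, on average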

11
Q

How do you prove that the expected length when coding with the wrong distribution q is H(X) + D(p||q)?

A

With lengths l(x) = log2(1/q(x)) (ignoring the ceiling), take the expectation under the true distribution p:

E_p[l(X)] = Sum(p(x) * log2(1/q(x)))
          = Sum(p(x) * log2(p(x)/q(x))) + Sum(p(x) * log2(1/p(x)))
          = D(p||q) + H(X).
