Communication Chapter 2 Flashcards

1
Q

Noiseless Communication scheme

A

source -> source encoder (compressor) -> noiseless channel -> source decoder (extractor) -> destination

2
Q

random source model

A

Let p = (p₁,…,pₘ) be a discrete probability distribution. A random source S = {s₁,…,sₘ} has probability distribution p = (p₁,…,pₘ) if

(1) The random source produces letters one at a time
(2) The letter sᵢ in S is produced with probability pᵢ
(3) The letter is produced independently of the past.

3
Q

H(P)=

A

H(P) = -Σᵢ₌₁ᵐ pᵢ log₂ pᵢ = H(S)

4
Q

H(S)=

A

H(S) = Σᵢ₌₁ᵐ pᵢ log₂(1/pᵢ) = H(P)

5
Q

How to find the binary entropy function

A

Substitute the distribution p = (p, 1 - p) into the formula

H(P) = -Σᵢ₌₁ᵐ pᵢ log₂ pᵢ

which gives the binary entropy function h(p) = -p log₂ p - (1 - p) log₂(1 - p).
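
For example, a minimal Python sketch of this substitution (the probability values below are my own illustrative choices, not from the notes):

import math

def entropy(p):
    # H(P) = -sum of p_i * log2(p_i); terms with p_i = 0 contribute 0
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def binary_entropy(p):
    # binary entropy function: the entropy of the distribution (p, 1 - p)
    return entropy([p, 1 - p])

print(binary_entropy(0.5))   # 1.0, the maximum for a two-letter source
print(binary_entropy(0.11))  # roughly 0.5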

6
Q

High entropy =>

A

unpredictable

7
Q

Low entropy =>

A

predictable

8
Q

What range can the entropy be in

A

0 ≤ H(P) ≤ log₂ m

9
Q

When does H(P)=0

A

if p₁ = 1 and pᵢ = 0 for all i ≥ 2 (up to reordering: one letter is produced with certainty)

10
Q

When does H(P) = log₂ m

A

if pᵢ = 1/m for all i (i.e. every letter is equally likely)
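
Both extremes can be checked quickly in Python (illustrative values; the helper is the same entropy formula as above):

import math

def entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

m = 4
print(entropy([1, 0, 0, 0]))   # 0.0: one letter is certain, nothing is unpredictable
print(entropy([1/m] * m))      # 2.0 = log2(4): every letter equally likely, maximal entropy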

11
Q

Gibbs Lemma

A

Let r₁,…,rₘ be positive real numbers with sum at most one, and let p = (p₁,…,pₘ) be a probability distribution. Then
H(P) ≤ Σᵢ₌₁ᵐ pᵢ log₂(1/rᵢ)   (*)
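
A minimal numeric sanity check of (*) in Python, with arbitrary illustrative choices of p and r (not values from the notes):

import math

p = [0.5, 0.3, 0.2]   # a probability distribution
r = [0.4, 0.4, 0.1]   # positive reals with sum <= 1

H = -sum(pi * math.log2(pi) for pi in p)
bound = sum(pi * math.log2(1 / ri) for pi, ri in zip(p, r))
print(H <= bound)      # True, as the lemma guarantees (here roughly 1.49 <= 1.72)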

12
Q

ENCODING

A

Takes the source and assigns each letter a codeword

13
Q

DECODING

A

Takes one of the codewords and maps it to the associated letter

14
Q

𝔼(ℓ(f))

A

𝔼(ℓ(f)) = Σᵢ₌₁ᵐ pᵢ |f(sᵢ)|
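
In Python, with an illustrative three-letter source and code (my own example, not from the notes):

p = {"a": 0.5, "b": 0.25, "c": 0.25}   # probability of each source letter
f = {"a": "0", "b": "10", "c": "11"}   # codeword assigned to each letter

expected_length = sum(p[s] * len(f[s]) for s in p)
print(expected_length)   # 0.5*1 + 0.25*2 + 0.25*2 = 1.5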

15
Q

Shannon's Noiseless Encoding Theorem

A

Let S be a random source and f:S->{0,1}* be an optimal prefix-free encoding of S. Then, H(S)≤𝔼(ℓ(f))≤H(S)+1.

16
Q

How to use Shannon's Theorem to work out an optimal encoding

A
  • Find a prefix-free code f such that 𝔼(ℓ(f)) = H(S); this is the best possible expected length, so f must be optimal (see the sketch below)
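
For instance, a quick check in Python with a dyadic distribution (my own illustrative example): the natural prefix-free code meets the entropy exactly, so it is optimal.

import math

p = [0.5, 0.25, 0.125, 0.125]            # a dyadic distribution
codewords = ["0", "10", "110", "111"]    # prefix-free: no codeword is a prefix of another

H = -sum(pi * math.log2(pi) for pi in p)
EL = sum(pi * len(w) for pi, w in zip(p, codewords))
print(H, EL)   # both equal 1.75, so this encoding achieves the entropy and is optimal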
17
Q

Just because Shannon's inequality holds…

A

doesn’t mean f is an optimal prefix-free encoding

18
Q

Huffman coding creates…

A

a PREFIX-FREE, OPTIMAL encoding

19
Q

How to see if a code is a Huffman code

A
  • check it is prefix-free (see the sketch after this list)
  • check it is optimal (put it into a binary tree and check all internal vertices have exactly 2 children)
  • work out the probabilities and run the Huffman encoding algorithm through
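
A short Python sketch of the first check, prefix-freeness (the codes below are illustrative):

def is_prefix_free(codewords):
    # a code is prefix-free if no codeword is a prefix of a different codeword
    for w in codewords:
        for x in codewords:
            if w != x and x.startswith(w):
                return False
    return True

print(is_prefix_free(["0", "10", "110", "111"]))   # True
print(is_prefix_free(["0", "01", "11"]))           # False: "0" is a prefix of "01"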
20
Q

How to work out the probability distribution from a Huffman code

A
  • draw out the binary tree for the Huffman code
  • find inequalities for the probabilities based on the branching, then solve to find probabilities that satisfy these and add up to 1
21
Q

Huffman encodings always have the lowest possible _______ amongst all _______

A

expected length, prefix-free encodings.

22
Q

a code is more efficient if….

A

it has a shorter expected length

23
Q

ℓ(f)

A

The random variable for the length of the codeword f(s), where s is the letter produced by the source according to the probability distribution P.

24
Q

How to show an encoding is not optimal

A
  • find an encoding with a smaller expected length
25
Q

How to show an encoding is optimal

A

Show it is a Huffman encoding.

26
Q

While loop in Huffman's Procedure

A

WHILE |L| ≥ 2 DO
- pick two vertices vᵢ₁, vᵢ₂ from L with the lowest probabilities
- create a new vertex vᵢ₁ᵢ₂ and assign the probability pᵢ₁ + pᵢ₂ to it
- add the edge vᵢ₁ᵢ₂ -> vᵢ₁ with label 0 and the edge vᵢ₁ᵢ₂ -> vᵢ₂ with label 1 to the tree
- delete vᵢ₁ and vᵢ₂ from L and add vᵢ₁ᵢ₂ to L
END WHILE

27
Q

Set Up of Huffman Procedure

A

Suppose S = {s₁,…,sₘ} is a random source with probability distribution p = (p₁,…,pₘ).
Let L = {v₁,…,vₘ} be a set of vertices where vᵢ corresponds to sᵢ ∈ S.
Add the vertices of L into the tree.

28
Q

End of Huffman Procedure

A

Now L = {v}. We set r = v to be the root of our tree T.
Now construct the encoding from the random source to our code: for each i ∈ [m], we construct fₕ(sᵢ) by concatenating the labels on the edges of the path from r to vᵢ in T.
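
Putting the set-up, the while loop, and the final read-off together, a compact Python sketch of the whole procedure (it tracks codeword strings with a priority queue instead of building the tree explicitly; the example distribution is my own):

import heapq

def huffman_code(p):
    # p: dict mapping each source letter s_i to its probability p_i
    # Each heap entry is (probability, tie-breaker, {letter: codeword built so far}).
    heap = [(prob, i, {s: ""}) for i, (s, prob) in enumerate(p.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) >= 2:
        # pick the two vertices with the lowest probabilities
        p1, _, left = heapq.heappop(heap)
        p2, _, right = heapq.heappop(heap)
        # merge them into a new vertex with probability p1 + p2,
        # labelling the edge to the first part 0 and to the second part 1
        merged = {s: "0" + w for s, w in left.items()}
        merged.update({s: "1" + w for s, w in right.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    # the single remaining vertex is the root; reading the labels off gives the encoding
    return heap[0][2]

print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (0/1 labels may differ; the lengths are optimal)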