Communication Chapter 2 - definitions and theorems Flashcards

1
Q

source alphabet

A

The source alphabet S is a finite set of symbols, known as letters, which we use to write our message.

2
Q

entropy

A

Given a probability distribution P = (p₁,…,pₘ), the binary entropy is defined as H(P) = −∑ᵢ₌₁ᵐ pᵢ log₂ pᵢ.
Given a random source S with probability distribution P, we define the entropy of S as H(S) = H(P).
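
A minimal Python sketch of this definition (the function name is illustrative; terms with pᵢ = 0 are skipped, using the usual convention 0·log₂ 0 = 0):

import math

def entropy(p):
    # H(P) = -sum over i of p_i * log2(p_i); zero-probability terms contribute 0
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))  # 1.0 -- a fair coin carries one bit of entropy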

3
Q

Deterministic

A

You know how the function behaves, i.e. the function is predictable: the same input always produces the same output, with no randomness involved.

4
Q

H(P)

A

H(P) = −∑ᵢ₌₁ᵐ pᵢ log₂ pᵢ, where P = (p₁,…,pₘ) is a probability distribution.

5
Q

Gibbs' lemma

A

Let r₁,…,rₘ be a sequence of positive real numbers whose sum is at most 1, and let P = (p₁,…,pₘ) be a probability distribution. Then
H(P) ≤ ∑ᵢ₌₁ᵐ pᵢ log₂(1/rᵢ) (*)
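
A quick numerical check of the inequality in Python (the specific numbers are illustrative, chosen so the rᵢ are positive with sum at most 1):

import math

def entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]   # a probability distribution
r = [0.4, 0.3, 0.2]     # positive reals with sum <= 1

rhs = sum(pi * math.log2(1 / ri) for pi, ri in zip(p, r))
print(entropy(p) <= rhs)  # True: 1.5 <= 1.675...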

6
Q

Theorem. 0 ≤ H(P) ≤ log₂ m.

A

Let P = (p₁,…,pₘ) be a probability distribution. Then 0 ≤ H(P) ≤ log₂ m.
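
Both bounds are attained, as a short check suggests: a deterministic source has entropy 0, and the uniform distribution has entropy log₂ m.

import math

def entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

m = 8
print(entropy([1.0] + [0.0] * (m - 1)))    # 0.0 -- lower bound attained
print(entropy([1 / m] * m), math.log2(m))  # 3.0 3.0 -- upper bound attained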

7
Q

Source Encoding Scheme

A

A source encoding scheme (or simply an encoding) of S is an injective map f : S → {0,1}*, i.e. a map that assigns each letter of S a unique string in {0,1}*.
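
As a sketch, an encoding can be represented as a Python dict from letters to bit strings (the alphabet and codewords below are made up for illustration); injectivity means no two letters share a codeword:

f = {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

# f is injective precisely when all codewords are distinct
print(len(set(f.values())) == len(f))  # True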

8
Q

Code associated to an encoding

A

The code associated to the encoding f is the image of f in {0,1}*, denoted C = C(f).

9
Q

prefix-free encoding

A

The encoding is prefix-free if the associated code is prefix-free, i.e. if no codeword is a proper prefix of another codeword.
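
A small Python check of the prefix-free property for a code given as a set of codewords (a straightforward quadratic scan, for illustration only):

def is_prefix_free(code):
    # no codeword may be a proper prefix of another codeword
    return not any(u != w and w.startswith(u) for u in code for w in code)

print(is_prefix_free({'0', '10', '110', '111'}))  # True
print(is_prefix_free({'0', '01'}))                # False: '0' is a prefix of '01'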

10
Q

codeword of s

A

For every s ∈ S, f(s) is the codeword of s.

11
Q

Source decoding Scheme

A

A source decoding scheme (or simply a decoding) is a map g : {0,1}* → S such that g(f(s)) = s for all s ∈ S.
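
For a prefix-free encoding, a decoder can scan the bit string left to right and emit a letter whenever the buffer matches a codeword; prefix-freeness guarantees this greedy cut is unambiguous. A sketch (the encoding f is the illustrative one from above, and the function handles whole concatenated messages, slightly more than the single-letter definition requires):

def decode(bits, f):
    inverse = {w: s for s, w in f.items()}  # codeword -> letter
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in inverse:        # a complete codeword has been read
            out.append(inverse[buf])
            buf = ''
    return ''.join(out)

f = {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(decode('010110111', f))  # 'abcd'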

12
Q

Expected length of f

A

Suppose S = {s₁,…,sₘ} is a random source with probability distribution P = (p₁,…,pₘ) and an encoding f : S → {0,1}*. Let ℓ(f) be the random variable for the length of the codeword f(s), where s is the letter produced by the source according to P. The expected length of f, denoted 𝔼(ℓ(f)), is defined as the expected value of ℓ(f). That is,
𝔼(ℓ(f)) = ∑ᵢ₌₁ᵐ pᵢ |f(sᵢ)|
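
In Python this is a one-line weighted sum (probabilities and codewords below are illustrative):

p = [0.5, 0.25, 0.125, 0.125]
codewords = ['0', '10', '110', '111']  # f(s_1), ..., f(s_4)

expected_length = sum(pi * len(w) for pi, w in zip(p, codewords))
print(expected_length)  # 1.75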

13
Q

optimal

A

Given a random source S, a prefix-free encoding f of S is optimal if it minimises the value of 𝔼(ℓ(f)) amongst all prefix-free encodings of S.

14
Q

Shannon's Noiseless Encoding Theorem

A

Let S be a random source and f : S → {0,1}* be an optimal prefix-free encoding of S. Then H(S) ≤ 𝔼(ℓ(f)).
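
A numerical illustration reusing the source and code from the earlier sketches; with dyadic probabilities such as these, the bound happens to hold with equality.

import math

def entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]  # codeword lengths of the code above
print(entropy(p), sum(pi * l for pi, l in zip(p, lengths)))  # 1.75 1.75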

15
Q

Huffman encoding procedure

A

Suppose S = {s₁,…,sₘ} is a random source with probability distribution P = (p₁,…,pₘ).
Let L = {v₁,…,vₘ} be a set of vertices, where vᵢ corresponds to sᵢ ∈ S and carries probability pᵢ.
Add the vertices of L to the tree.
WHILE |L| ≥ 2 DO
- pick two vertices vᵢ₁, vᵢ₂ from L with the lowest probabilities
- create a new vertex vᵢ₁ᵢ₂ and assign the probability pᵢ₁ + pᵢ₂ to it
- add the edge vᵢ₁ᵢ₂ → vᵢ₁ with label 0 and the edge vᵢ₁ᵢ₂ → vᵢ₂ with label 1 to the tree
- delete vᵢ₁ and vᵢ₂ from L, and add vᵢ₁ᵢ₂ to L
END WHILE
Now L = {v}; we set r = v to be the root of our tree T.
Finally, construct the encoding fₕ from the random source to our code: for each i ∈ [m], fₕ(sᵢ) is obtained by concatenating the labels on the edges along the path from r to vᵢ in T.
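
A compact Python sketch of this procedure, using a heap for the "pick the two lowest probabilities" step (names and tie-breaking are illustrative; different tie-breaks give different, equally optimal codes):

import heapq
from itertools import count

def huffman(p):
    # leaves are the symbol indices 0..m-1; the counter breaks probability ties
    tie = count()
    heap = [(pi, next(tie), i) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    while len(heap) >= 2:
        p1, _, v1 = heapq.heappop(heap)   # two lowest probabilities
        p2, _, v2 = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(tie), (v1, v2)))  # merged vertex
    _, _, root = heap[0]

    code = {}
    def walk(node, word):
        if isinstance(node, tuple):       # internal vertex
            walk(node[0], word + '0')     # edge labelled 0
            walk(node[1], word + '1')     # edge labelled 1
        else:                             # leaf v_i: record its codeword
            code[node] = word or '0'      # edge case: one-symbol source
    walk(root, '')
    return code

print(huffman([0.5, 0.25, 0.125, 0.125]))
# {0: '0', 1: '10', 2: '110', 3: '111'} (up to tie-breaking)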

16
Q

Theorem. Huffman Algorithm.

A

Let S = {s₁,…,sₘ} be a random source with probability distribution P = (p₁,…,pₘ). The Huffman encoding fₕ is an optimal prefix-free encoding of S; that is, for all prefix-free encodings f of S we have 𝔼(ℓ(fₕ)) ≤ 𝔼(ℓ(f)).

17
Q

Proposition (used in the proof of the Huffman theorem).

A

Let S = {s₁,…,sₘ} be a random source with probability distribution P = (p₁,…,pₘ), and suppose that p₁ ≥ … ≥ pₘ. Then there exists an optimal prefix-free encoding f* of S with ℓᵢ = |f*(sᵢ)| satisfying:

(a) ℓ₁ ≤ … ≤ ℓₘ
(b) ℓₘ = ℓₘ₋₁
(c) f*(sₘ₋₁) and f*(sₘ) differ only in their last bit