Communication Chapter 3 Flashcards
Symmetric p₁₀ = p₀₁ =
p₁₀ = p₀₁ = p
Symmetric p₀₀ = p₁₁ =
p₀₀ = p₁₁ = 1-p
noisy channel probabilities satisfy
p₀₀ + p₀₁ = 1 = p₁₀ + p₁₁
crossover probability
the probability p that a transmitted 0 is flipped to a 1, which equals the probability that a transmitted 1 is flipped to a 0.
if p = 0 then
the channel is noiseless
if p = 1/2 then
the channel is useless (each received bit is equally likely to be 0 or 1 regardless of what was sent, so it is too noisy to decipher any of the messages)
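A minimal sketch (my own, not from the cards) of a binary symmetric channel with crossover probability p; the function name bsc and the use of Python's random module are my choices. It illustrates the two extreme cases above: at p = 0 the output always equals the input, and at p = 1/2 each output bit is uniformly random and independent of the input.

```python
import random

def bsc(bits, p):
    """Send a list of bits through a binary symmetric channel:
    each bit is flipped independently with crossover probability p."""
    return [b ^ 1 if random.random() < p else b for b in bits]

word = [0, 1, 1, 0, 1]
print(bsc(word, 0.0))  # p = 0: noiseless, the output equals the input
print(bsc(word, 0.5))  # p = 1/2: useless, the output is independent of the input
```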
What is the probability that precisely i errors occur when a word of length n is sent over a channel with crossover probability p?
(n choose i) pⁱ (1-p)ⁿ⁻ⁱ
i.e. the binomial distribution
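A quick numerical check of this formula (my own sketch; the numbers n = 7, p = 0.1 are arbitrary), computing the probability of exactly i errors and verifying that the probabilities over all i sum to 1:

```python
from math import comb

def p_exactly_i_errors(n, i, p):
    """Probability of exactly i bit flips among n transmitted bits,
    each flipped independently with crossover probability p."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

n, p = 7, 0.1
print(p_exactly_i_errors(n, 0, p))                             # probability of an error-free transmission
print(sum(p_exactly_i_errors(n, i, p) for i in range(n + 1)))  # the probabilities sum to 1
```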
Channel Encoding map
f:A->{0,1}ⁿ
Channel decoding maps
g: {0,1}ⁿ->A such that g(f(x))=x for every x∈A.
h: {0,1}ⁿ->C such that h(y)=y for all y∈C (here C = f(A) is the set of codewords).
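For concreteness, a hypothetical example (not from the cards) with A = {0, 1} and the 3-fold repetition code: f sends a symbol to three copies of itself, and g decodes by majority vote, so g(f(x)) = x for every x ∈ A.

```python
def f(x):
    """Encoding map f : A -> {0,1}^3 for the 3-fold repetition code, A = {0, 1}."""
    return (x, x, x)

def g(y):
    """Decoding map g : {0,1}^3 -> A by majority vote, so g(f(x)) = x for every x in A."""
    return 1 if sum(y) >= 2 else 0

assert all(g(f(x)) == x for x in (0, 1))
print(g((0, 1, 0)))  # a single flipped bit is still decoded to the sent symbol 0
```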
length of a code
we say a code C has length n if all its codewords have length n
If a code has length n then the code is automatically…
it is automatically prefix-free (no codeword can be a proper prefix of another codeword of the same length)
A
the alphabet of messages which we encode (and decode back to)
probability of wrong decoding (in words)
Pₑᵣᵣ = the maximum, over all codewords c = f(x), of the sum over all received words y∈{0,1}ⁿ that decode to something other than x of the probability that y is received given that c is sent.
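Continuing the repetition-code illustration above (my own example): for each codeword c = f(x), sum the probabilities of all received words y that decode to something other than x, then take the maximum over codewords. By symmetry both codewords give Pₑᵣᵣ = 3p²(1-p) + p³.

```python
from itertools import product

def p_receive(y, c, p):
    """P(y received | c sent) over a binary symmetric channel with crossover probability p."""
    flips = sum(a != b for a, b in zip(y, c))
    return p**flips * (1 - p)**(len(c) - flips)

def p_err(codewords, decode, p):
    """Maximum, over codewords c = f(x), of the probability that the
    received word decodes to something other than x."""
    n = len(next(iter(codewords.values())))
    return max(
        sum(p_receive(y, c, p) for y in product((0, 1), repeat=n) if decode(y) != x)
        for x, c in codewords.items()
    )

codewords = {0: (0, 0, 0), 1: (1, 1, 1)}    # the 3-fold repetition code
decode = lambda y: 1 if sum(y) >= 2 else 0  # majority-vote decoder
p = 0.1
print(p_err(codewords, decode, p))  # ~0.028
print(3 * p**2 * (1 - p) + p**3)    # agrees with 3p^2(1-p) + p^3
```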
Transmission rate formula
R(C) = log₂|C| / n
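For example (numbers my own): the 3-fold repetition code has |C| = 2 codewords of length n = 3, so R(C) = log₂(2)/3 = 1/3, whereas sending each symbol as a single bit (C = {0, 1}, n = 1) gives R = 1.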
The higher the transmission rate…
the more information each transmitted bit carries, i.e. the more efficiently the messages are sent