Communication Chapter 3 Flashcards
Symmetric p₁₀=p₀₁ =
p₁₀ = p₀₁ = p
Symmetric p₀₀ = p₁₁ =
p₀₀ = p₁₁ = 1-p
noisy channel probabilities satisfy
p₀₁+p₀₀=1=p₁₀ + p₁₁
crossover probability
the probability p that a 0 is flipped to a 1, which is equal to the probability that a 1 is flipped to a 0
if p = 0 then
the channel is noiseless
if p = 1/2 then
the channel is useless (it is too noisy to decipher any of the messages)
What is the probability that precisely i errors occur with crossover probability p
(n choose i) p ^ i (1-p)^ (n-i)
i.e. the binomial distribution Bin(n, p)
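The formula on this card can be checked numerically. A minimal sketch, assuming a binary symmetric channel with crossover probability p (the numbers n = 5, p = 0.1 are made up for illustration):

```python
from math import comb

def prob_exactly_i_errors(n: int, i: int, p: float) -> float:
    """P(exactly i of n transmitted bits are flipped) on a BSC
    with crossover probability p: (n choose i) p^i (1-p)^(n-i)."""
    return comb(n, i) * p**i * (1 - p) ** (n - i)

# Example with made-up numbers: n = 5, p = 0.1
print(prob_exactly_i_errors(5, 1, 0.1))  # 5 * 0.1 * 0.9^4 ≈ 0.328
```

Summing over i = 0, …, n gives 1, as it must for a probability distribution.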
Channel Encoding map
f:A->{0,1}ⁿ
Channel decoding maps
g: {0,1}ⁿ->A such that g(f(x))=x for every x∈A.
h: {0,1}ⁿ->C such that h(y)=y for all y∈C.
length of a code
we say a code C has length n if all its codewords have length n
If a code has length n then the code is automatically…
it is automatically prefix free
A
the alphabet which we are encoding (or decoding to)
probability of wrong decoding (in words)
Pₑᵣᵣ(C) = the maximum over all codewords c∈C of the sum, over all y∈{0,1}ⁿ that are not decoded to c, of the probability that y is received given that c is sent
Transmission rate formula
R(C)=log₂|C|/n
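The rate formula is easy to sanity-check in code. A minimal sketch; the 4-codeword code of length 3 is a made-up example:

```python
from math import log2

def transmission_rate(code_size: int, n: int) -> float:
    """R(C) = log2(|C|) / n for a code C of length n."""
    return log2(code_size) / n

# Made-up example: |C| = 4 codewords of length 3 gives R = 2/3
print(transmission_rate(4, 3))
```

When |C| = 2ⁿ (every string is a codeword) the rate is exactly 1, consistent with the card below that R(C) ≤ 1.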
The higher the transmission rate…
the more efficiently the codewords are sent
Transmission rate is always less than or equal to
1
|A|=
|C|
How to calculate the smallest code length you can achieve for a code C of size |C|
n = ⌈log₂|C|⌉ (log₂|C| rounded up to the nearest integer)
i.e. there are 2ˣ strings of length x
if C is an [n,k] code then R(C)=
R(C)=k/n
How to check a decoding is valid
check g(f(x))=x for every x∈A
Hamming Distance
The Hamming distance dₕ(x,y) between x=x₁…xₙ∈{0,1}ⁿ and y=y₁…yₙ∈{0,1}ⁿ is the number of places where the two strings differ
dₕ(x,y) = |{i∈[n]: xᵢ≠yᵢ}|
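The definition above translates directly into code. A minimal sketch, representing bit strings as Python strings of '0' and '1':

```python
def hamming_distance(x: str, y: str) -> int:
    """dH(x, y): the number of positions i with x_i != y_i."""
    assert len(x) == len(y), "strings must have equal length"
    return sum(xi != yi for xi, yi in zip(x, y))

print(hamming_distance("10110", "11010"))  # 2 (positions 2 and 3 differ)
```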
metric axioms
- Positive
- Symmetric
- Triangle inequality
The bigger the minimum distance…
The further apart the codewords are from each other, so the less likely decoding errors become
minimum distance decoding informal
A decoding h is a minimum distance decoding if, for every string x∈{0,1}ⁿ, h(x) is a codeword closest to x
if two codewords are equally close to c in a minimum distance decoding…
choose either
How to work out a minimum distance decoding
- work out the hamming distance between the encoded word and all the codewords
- choose the codeword with the smallest hamming distance
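The two steps above can be sketched as follows; the two-codeword code C is a made-up example, and ties are broken by taking the first closest codeword (either choice is valid, per the card above):

```python
def hamming_distance(x, y):
    return sum(xi != yi for xi, yi in zip(x, y))

def min_distance_decode(received, code):
    """Return a codeword at minimum Hamming distance from `received`.
    min() breaks ties by taking the first such codeword in `code`."""
    return min(code, key=lambda c: hamming_distance(received, c))

# Made-up two-codeword code
C = ["000", "111"]
print(min_distance_decode("010", C))  # "000" (distance 1 vs distance 2)
```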
We want Pₑᵣᵣ (C) to be…
as small as possible
How to tell if a code is t-error-detecting
C is t-error-detecting if, whenever between 1 and t errors are made in a codeword, the result is never another codeword of C
How to tell is a code is t-error-correcting
C is t-error-correcting if, whenever a codeword is received with at most t errors, minimum distance decoding returns the codeword that was sent
A (repetition code) string is wrongly decoded via a minimum distance decoding if…
it contains at least (n+1)/2 errors, where n (odd) is the length of the codewords
how many strings of length x
2ˣ
number of strings of length n that differ from x in exactly i positions
n choose i
Probability of erroneous decoding for the repetition code
Pₑᵣᵣ(C) = P(Bin(n,p) ≥ (n+1)/2)
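This tail probability can be computed directly from the binomial formula. A minimal sketch, assuming odd n; the numbers n = 3, p = 0.1 are made up for illustration:

```python
from math import comb

def repetition_perr(n: int, p: float) -> float:
    """P(Bin(n, p) >= (n+1)/2): probability that a majority of the
    n repeated bits are flipped, for odd n."""
    threshold = (n + 1) // 2
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i)
               for i in range(threshold, n + 1))

# Made-up numbers: length-3 repetition code, p = 0.1
print(repetition_perr(3, 0.1))  # 3*p^2*(1-p) + p^3 = 0.028
```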
Hamming Distance Equation
dₕ(x,y) = |{i∈[n]: xᵢ≠yᵢ}| = ∑ᵢ₌₁ⁿ |xᵢ-yᵢ|
Probability of erroneous decoding for a minimum distance decoding
Pₑᵣᵣ(C) = max over c∈C of ∑ over y∈{0,1}ⁿ with h(y)≠c of p^dₕ(c,y) · (1-p)^(n-dₕ(c,y))
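For small n this maximum can be computed by brute force over all 2ⁿ received strings. A minimal sketch, assuming minimum distance decoding with ties broken by codeword order; the length-3 repetition code and p = 0.1 are made-up example values:

```python
from itertools import product

def hamming_distance(x, y):
    return sum(xi != yi for xi, yi in zip(x, y))

def perr(code, p):
    """Max over codewords c of the total probability of receiving a
    string y with h(y) != c, given that c was sent, where h is
    minimum distance decoding (ties broken by order in `code`)."""
    n = len(code[0])

    def h(y):
        return min(code, key=lambda c: hamming_distance(y, c))

    def prob_received_given_sent(c, y):
        d = hamming_distance(c, y)
        return p**d * (1 - p) ** (n - d)

    all_strings = ["".join(bits) for bits in product("01", repeat=n)]
    return max(
        sum(prob_received_given_sent(c, y) for y in all_strings if h(y) != c)
        for c in code
    )

# Made-up example: length-3 repetition code, p = 0.1
print(perr(["000", "111"], 0.1))  # matches P(Bin(3, 0.1) >= 2) = 0.028
```

For the repetition code this agrees with the earlier card, since decoding fails exactly when a majority of bits are flipped.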