Communication - Chapter 4 Flashcards
|Bⁿₖ(x)| =
bⁿₖ — the size of the ball is independent of its centre x: bⁿₖ = Σᵢ₌₀ᵏ C(n, i)
Informal definition of a Hamming ball
all words (binary strings of length n) with Hamming distance at most k from x
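The two cards above can be checked concretely: the ball's size is a sum of binomial coefficients and does not depend on the centre. A minimal sketch (the function name `hamming_ball_size` is my own, not from the cards):

```python
from math import comb

def hamming_ball_size(n: int, k: int) -> int:
    """Size b^n_k of a Hamming ball of radius k in {0,1}^n.

    Counts all length-n binary words at Hamming distance at most k
    from any fixed centre x; the count is the same for every x.
    """
    return sum(comb(n, i) for i in range(k + 1))

# hamming_ball_size(7, 1) -> 1 + 7 = 8
```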
H(P)=
−Σᵢ₌₁ᵐ pᵢ log pᵢ
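The entropy card translates directly into a short computation; terms with pᵢ = 0 are dropped by the usual convention 0 log 0 = 0. A sketch (the helper name `entropy` is mine, and I assume log base 2 as is standard in coding theory):

```python
from math import log2

def entropy(probs) -> float:
    """H(P) = -sum_{i=1}^{m} p_i * log2(p_i).

    Skips zero probabilities, using the convention 0 * log 0 = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# entropy([0.5, 0.5]) -> 1.0  (one fair coin flip carries one bit)
```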
What does Shannon's noisy encoding theorem tell you?
Given a noisy channel with crossover probability p and any small ε > 0, the theorem says you can construct a code with a low probability of decoding error and a high transmission rate, as long as the code is long enough.
Shannon's noisy encoding theorem is…
best possible.
length of a code C
the common length n of its codewords: |c| = n for every c ∈ C
size of a code C
the number of codewords in C
top number in Hamming ball notation
the length n of the words
bottom number in Hamming ball notation
the radius k of the ball (the maximum Hamming distance from the centre)
number in brackets in Hamming ball notation
the centre x of the Hamming ball
What does Shannon's noisy encoding theorem produce?
There exists n₀ = n₀(p, ε) such that for every n ≥ n₀ there is a code C of length n whose probability of wrong decoding is less than ε and whose rate satisfies R(C) ≥ 1 − H(p + ε).
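The rate guarantee R(C) ≥ 1 − H(p + ε) is easy to evaluate numerically, where H here is the binary entropy function H(q) = −q log₂ q − (1 − q) log₂(1 − q). A sketch under that assumption (both function names below are mine):

```python
from math import log2

def binary_entropy(q: float) -> float:
    """H(q) = -q*log2(q) - (1-q)*log2(1-q), with H(0) = H(1) = 0."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * log2(q) - (1 - q) * log2(1 - q)

def shannon_rate_bound(p: float, eps: float) -> float:
    """Lower bound 1 - H(p + eps) on the achievable rate R(C)."""
    return 1 - binary_entropy(p + eps)

# For a fairly noisy channel, e.g. p = 0.1 and eps = 0.01,
# the guaranteed rate is roughly one half.
```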
Hypothesis of Shannon's noisy encoding theorem
Given any p ∈ [0, 1/2) and ε ∈ (0, 1/2 − p]