9. Probability of error correction and detection: Flashcards

1
Q

considering probabilities with coset decoding: intro

A

Suppose we transmit a linear code down a symmetric channel with symbol error probability p, then use coset decoding. We decode any received word y correctly if and only if our error correction is right; in coset decoding this happens if and only if the actual error e = y − x is a coset leader.

2
Q

considering probabilities with coset decoding:
probabilities example

A

P_corr(x) = Prob(error e is one of the coset leaders),
which is independent of x.

P_corr(x) = P(e=0000) + P(e=1000) + P(e=0100) + P(e=0010)
= (1 − p)⁴ + 3p(1 − p)³

P_err(x) = 1 − P_corr(x)
It depends on C, not on x.
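As a numerical sanity check, the two formulas above can be evaluated directly. A minimal sketch, assuming the binary length-4 code of this example, whose coset leaders are 0000, 1000, 0100, 0010 (function names are mine):

```python
def p_corr(p: float) -> float:
    """Probability that the error pattern is one of the coset leaders
    0000, 1000, 0100, 0010: one weight-0 leader plus three weight-1 leaders."""
    return (1 - p) ** 4 + 3 * p * (1 - p) ** 3

def p_err(p: float) -> float:
    """Word error rate: the complement of P_corr."""
    return 1 - p_corr(p)

print(p_corr(0.01))  # ≈ 0.9897
```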

3
Q

word error rate of a code

A

P_err(C) = 1 − P_corr(C)

4
Q

considering probabilities with coset decoding:
probabilities formula

A

Let C be any q-ary linear code whose coset leaders are a₀ = 00..0, a₁, a₂, …, a_l
We have

P_corr(C) = Σ P(e = aᵣ)   for r = 0, 1, …, l

= Σ (p/(q−1))^w(aᵣ) · (1 − p)^(n − w(aᵣ))   for r = 0, 1, …, l

= Σ γₛ (p/(q−1))ˢ · (1 − p)ⁿ⁻ˢ   for s = 0, 1, …, n

where γₛ is the number of coset leaders of weight s, i.e. the number of cosets whose minimum weight is s. (The sum runs over all the cosets.)

(Here w(e) = #errors. There are q − 1 possible wrong symbols for each digit, each with probability p/(q−1), so transmission of a particular incorrect digit has probability p/(q−1); w(aᵣ) is the coset leader's weight, i.e. its number of error digits.)
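The closed-form sum translates directly into code. A hedged sketch (the function name and argument layout are my own):

```python
def p_corr(gammas, n, q, p):
    """P_corr(C) = sum_s gamma_s * (p/(q-1))**s * (1-p)**(n-s),
    where gammas[s] is the number of coset leaders of weight s."""
    return sum(g * (p / (q - 1)) ** s * (1 - p) ** (n - s)
               for s, g in enumerate(gammas))

# The earlier binary length-4 example: one weight-0 leader, three weight-1 leaders.
print(p_corr([1, 3], n=4, q=2, p=0.01))  # ≈ 0.9897
```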

5
Q

cosets

A

As the cosets form a partition of F_qⁿ, the number of cosets in total is
qⁿ/qᵏ = qⁿ⁻ᵏ
= the number of vectors in F_qⁿ⁻ᵏ.
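This count can be checked by brute force. A sketch using the binary [4, 2]-code of Example 9.3, for which qⁿ⁻ᵏ = 2⁴⁻² = 4:

```python
from itertools import product

# The code of Example 9.3, over Z_2
C = [(0, 0, 0, 0), (1, 0, 1, 1), (0, 1, 0, 1), (1, 1, 1, 0)]

def coset(y):
    """The coset y + C under component-wise mod-2 addition."""
    return frozenset(tuple((yi + ci) % 2 for yi, ci in zip(y, c)) for c in C)

# Distinct cosets partition F_2^4, so there are q^(n-k) = 2^(4-2) = 4 of them.
cosets = {coset(y) for y in product((0, 1), repeat=4)}
print(len(cosets))  # 4
```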

6
Q

γₛ
thm 9.1

A

*If d(C) ≥ 2t + 1, then every vector of weight ≤ t is unique of minimal weight in its coset and therefore a coset leader for C.
Hence

γₛ = nCs (q-1)ˢ
=|Sₛ(00…0)|

for 0 ≤ s ≤ t

*If C is a perfect [n, k, 2t + 1]-code then its coset leaders are precisely the vectors of weight ≤ t.
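A quick numerical check of the first bullet, assuming d(C) ≥ 2t + 1 so that γₛ = nCs (q−1)ˢ for s ≤ t (the function name is mine):

```python
from math import comb

def gamma(n: int, q: int, s: int) -> int:
    """gamma_s = C(n, s) * (q-1)**s: the number of weight-s vectors,
    i.e. |S_s(00...0)|, all of which are coset leaders when s <= t."""
    return comb(n, s) * (q - 1) ** s

# For the 3-ary [11, 6, 5]-code of Example 9.2 (so t = 2):
print([gamma(11, 3, s) for s in (0, 1, 2)])  # [1, 22, 220]
```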

7
Q

thm for coset leaders proof

A

Proof (i): Consider the vectors in B_t(00…0). If y ∈ B_t(00…0) is
not unique of minimal weight in its coset, then there exists z with
w(z) ≤ w(y) and x ∈ C (x ≠ 0) such that
y = z + x.
But then
d(C) ≤ w(x) = w(y − z) = d(y, z) ≤ d(y, 0) + d(0, z)
= w(y) + w(z) ≤ 2w(y) ≤ 2t.
This contradicts the hypothesis d(C) ≥ 2t + 1, so y is a coset leader.

8
Q

9.2. Example. Let C be a 3-ary [11, 6, 5]-code. What is P_err(C)?

A

By thm 9.1, d = 5 implies that all vectors of weight ≤ 2 are coset leaders. Therefore
γ₀ = 1
γ₁ = 11C1 · 2 = 22
γ₂ = 11C2 · 2² = 220

(For w > 2, what is γ_w? We don't know.)
Therefore

P_corr(C) = Σ γ_w (p/(q−1))ʷ (1 − p)ⁿ⁻ʷ   (w = 0, …, n)

≥ Σ γ_w (p/2)ʷ (1 − p)¹¹⁻ʷ   (w = 0, 1, 2)

= (1 − p)¹¹ + 22(p/2)(1 − p)¹⁰ + 220(p/2)²(1 − p)⁹
= α

so P_err(C) = 1 − P_corr(C) ≤ 1 − α.

In fact we have equality, because this code is perfect. We know the weights of 1 + 22 + 220 = 243 of the coset leaders, but the number of cosets is |F_qⁿ|/|C|
= qⁿ/qᵏ = 3¹¹/3⁶ = 3⁵ = 243,
so there are no more cosets.
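Putting the pieces together numerically; a sketch that also verifies the perfect-code coset count (variable names are mine, and p = 0.01 is an assumed illustration value):

```python
from math import comb

n, k, q, t, p = 11, 6, 3, 2, 0.01

# gamma_s = C(n, s)(q-1)^s for s <= t; since the code is perfect these
# 1 + 22 + 220 = 243 vectors are ALL the coset leaders, so alpha is exact.
gammas = [comb(n, s) * (q - 1) ** s for s in range(t + 1)]
assert sum(gammas) == q ** (n - k)  # 243 cosets in total

alpha = sum(g * (p / (q - 1)) ** s * (1 - p) ** (n - s)
            for s, g in enumerate(gammas))
p_err = 1 - alpha
print(p_err)  # ≈ 0.000155
```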

9
Q

considering probabilities with coset decoding:
If we use C for error detection, rather than error correction, the
analogue of P_err(C)
is P_undetec

A

P_undetec(C):
the probability that a word is received with undetected errors.

We transmit x ∈ C and receive y ∈ F_qⁿ. The received vector has undetected errors iff y ≠ x but y ∈ C; that is, iff
e ∈ C \ {00…0}

The probability of this is again independent of x:

P_undetec(C) = Σ δ_w (p/(q−1))ʷ (1 − p)ⁿ⁻ʷ   for w = 1, …, n

where δ_w is the number of codewords of weight w.
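This sum translates directly into code. A hedged sketch (the function name and the deltas-as-list layout are my own):

```python
def p_undetec(deltas, n, q, p):
    """P_undetec(C) = sum over w >= 1 of delta_w (p/(q-1))**w (1-p)**(n-w),
    where deltas[w] is the number of codewords of weight w (deltas[0] is
    the zero codeword and is skipped)."""
    return sum(d * (p / (q - 1)) ** w * (1 - p) ** (n - w)
               for w, d in enumerate(deltas) if w >= 1)

# The binary [4, 2]-code of Example 9.3: delta = 1, 0, 1, 2, 0 for w = 0..4
print(p_undetec([1, 0, 1, 2, 0], n=4, q=2, p=0.01))  # ≈ 9.999e-05
```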

10
Q

9.3. Example. C = {0000, 1011, 0101, 1110} over Z_2

P_undetec ?

A

δ₁ = 0,
δ₂ = 1,
δ₃ = 2,
δ_w = 0 (w ≥ 4). Thus

P_undetec(C)
= 0·p(1 − p)³ + 1·p²(1 − p)² + 2·p³(1 − p)
= p²(1 − p)(1 − p + 2p)
= p²(1 − p)(1 + p)
= p²(1 − p²)
= 0.00009999 if p = 0.01
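The weight counts δ_w can also be read off from the codewords programmatically; a sketch of the same computation:

```python
from collections import Counter

C = ["0000", "1011", "0101", "1110"]
n, q, p = 4, 2, 0.01

# delta_w = number of codewords of weight w, counted from C itself
delta = Counter(word.count("1") for word in C)  # weights 0, 3, 2, 3

p_undetec = sum(delta[w] * (p / (q - 1)) ** w * (1 - p) ** (n - w)
                for w in delta if w >= 1)
print(p_undetec)  # ≈ 9.999e-05, i.e. p^2 (1 - p^2)
```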

11
Q

p_retrans

A

If y is received and y ∉ C, we detect an error and request retransmission. How likely is this?
P_retrans = 1 − P(no error detected)

12
Q

9.3. Example. C = {0000, 1011, 0101, 1110} over Z_2

P_retrans?

A

P_retrans(C) = 1 − (P(no errors) + P(error occurs but is not detected))
= 1 − (1 − p)ⁿ − P_undetec(C)

In our example (n = 4)
P_retrans(C) = 1 − (0.99)⁴ − 0.00009999 = 0.039304 if p = 0.01,
which is about 4%.
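A one-line numeric check of the figure above, reusing P_undetec from Example 9.3:

```python
n, p = 4, 0.01

p_undetec = p ** 2 * (1 - p ** 2)          # from Example 9.3
p_retrans = 1 - (1 - p) ** n - p_undetec   # 1 - P(no errors) - P(undetected)
print(p_retrans)  # ≈ 0.039304, about 4%
```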
