Communication chapter 3 - definitions and theorems Flashcards

1
Q

Noisy Channel

A

A (binary) noisy channel consists of a set of probabilities {p₀₀, p₀₁, p₁₀, p₁₁}, where pᵢⱼ is the probability that j is received when i is sent. They must satisfy p₀₀ + p₀₁ = 1 = p₁₀ + p₁₁

2
Q

symmetric

A

A noisy channel is symmetric if the probability of flipping a bit is the same in both directions, i.e. p₀₁ = p₁₀ = p for some p∈[0,1/2]
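Illustrative sketch (not part of the definition): a memoryless binary symmetric channel can be simulated directly, flipping each bit independently with probability p. The function name `bsc_transmit` is my own.

```python
import random

def bsc_transmit(bits, p, rng=random.Random(0)):
    """Send a bit string through a binary symmetric channel:
    each bit is flipped independently with probability p.
    A fixed seed is used here only for reproducibility."""
    return "".join(b if rng.random() >= p else "10"[int(b)] for b in bits)
```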

3
Q

memoryless

A

A noisy channel is memoryless if the outcome of each transmission is independent of the outcome of previous ones

4
Q

Channel encoding scheme

A

Given a finite set A and n∈ℕ, a channel encoding scheme (encoding) of A into strings of length n is an injective function f: A → {0,1}ⁿ. The code C = C(f) is the image of f
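A concrete example of such an f (my own sketch, not from the card): the triple repetition code, encoding A = {0,1} into strings of length n = 3.

```python
def encode(bit):
    """Triple repetition encoding f: {0,1} -> {0,1}^3.
    Injective: distinct bits map to distinct strings."""
    return bit * 3

# C = C(f), the image of f
CODE = {encode(a) for a in ("0", "1")}
```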

5
Q

Channel decoding scheme

A

Given a finite set A, n∈ℕ and an encoding f: A → {0,1}ⁿ, a channel decoding scheme (decoding) is a map g: {0,1}ⁿ → A such that g(f(x)) = x for every x∈A.
Equivalently, we can regard the decoding as a map h: {0,1}ⁿ → C such that h(y) = y for all y∈C.
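For the triple repetition code above, majority vote gives one such decoding (an illustrative sketch; the name `decode` is my own):

```python
def decode(word):
    """A decoding g: {0,1}^3 -> {0,1} for the triple repetition code
    via majority vote. Satisfies g(f(x)) = x for x in {0,1}."""
    return "1" if word.count("1") >= 2 else "0"
```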

6
Q

block

A

A code C of length n is called a block [n,k]-code if A = {0,1}ᵏ for some k∈[n]

7
Q

Probability of wrong decoding

A

For a code C of length n and a decoding h, the probability of wrong decoding is defined as follows
Pₑᵣᵣ(C, h) = max_{c∈C} Σ_{y∈{0,1}ⁿ : h(y)≠c} P(y is received | c is sent)

8
Q

Transmission rate

A

The transmission rate of a code C of length n is R(C) = log₂|C| / n
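A one-line sketch of the formula (my own helper name). For the triple repetition code, |C| = 2 and n = 3, so R(C) = 1/3.

```python
import math

def transmission_rate(code, n):
    """R(C) = log2|C| / n for a code C of length n."""
    return math.log2(len(code)) / n
```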

9
Q

Hamming Distance

A

The Hamming distance, denoted dₕ(x,y), between x = x₁…xₙ ∈ {0,1}ⁿ and y = y₁…yₙ ∈ {0,1}ⁿ is the number of positions where the two strings differ
dₕ(x,y) = |{i∈[n]: xᵢ≠yᵢ}|
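The definition translates directly into code (an illustrative sketch; `hamming_distance` is my own name):

```python
def hamming_distance(x, y):
    """d_H(x, y) = |{i : x_i != y_i}| for strings of equal length."""
    assert len(x) == len(y)
    return sum(xi != yi for xi, yi in zip(x, y))
```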

10
Q

minimum distance

A

The minimum distance of a code C, denoted by dₕ(C), is the minimum Hamming distance among all pairs of distinct codewords
dₕ(C) = min_{c≠c'∈C} dₕ(c, c')
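A brute-force sketch over all pairs of distinct codewords (my own helper name):

```python
from itertools import combinations

def min_distance(code):
    """d_H(C) = min over pairs of distinct codewords c, c' of d_H(c, c')."""
    return min(sum(a != b for a, b in zip(c, cp))
               for c, cp in combinations(code, 2))
```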

11
Q

minimum distance decoding

A

A decoding h: {0,1}ⁿ → C is a minimum distance decoding for C if for every x∈{0,1}ⁿ we have dₕ(x, h(x)) = min_{c∈C} dₕ(x, c)
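One way to realise such an h is to scan the codewords for a nearest one (an illustrative sketch; ties are broken arbitrarily, here by sorted order):

```python
def min_distance_decode(x, code):
    """Return a codeword c minimizing d_H(x, c).
    Ties are broken by lexicographic order of the codewords."""
    return min(sorted(code),
               key=lambda c: sum(a != b for a, b in zip(x, c)))
```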

12
Q

probability of wrong decoding for a minimum distance decoding

A

Let C be a code of length n, used over a memoryless symmetric channel with flip probability p, and let h be a minimum distance decoding for C. The probability of wrong decoding for C is
Pₑᵣᵣ(C) = max_{c∈C} Σ_{y∈{0,1}ⁿ : h(y)≠c} p^(dₕ(y,c)) (1−p)^(n−dₕ(y,c))
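The formula can be evaluated by brute force over all received words (a sketch assuming a memoryless symmetric channel with flip probability p; the function name is my own). For the triple repetition code with majority decoding this reproduces 3p²(1−p) + p³.

```python
from itertools import product

def p_err(code, h, p):
    """P_err(C) = max over c in C of the sum, over received words y
    with h(y) != c, of p^d_H(y,c) * (1-p)^(n - d_H(y,c))."""
    n = len(next(iter(code)))

    def d(x, y):
        return sum(a != b for a, b in zip(x, y))

    return max(
        sum(p ** d(y, c) * (1 - p) ** (n - d(y, c))
            for y in ("".join(bits) for bits in product("01", repeat=n))
            if h(y) != c)
        for c in code
    )
```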

13
Q

t-error-detecting

A

A code is t-error-detecting if a minimum distance decoding can detect up to t errors. That is, if at most t errors occur, then one can identify that an error has been made.

14
Q

t-error-correcting

A

A code is t-error-correcting if a minimum distance decoding can correct up to t errors. That is, if at most t errors are made, then one can still identify the correct codeword which was sent.

15
Q

Proposition: t-error-detecting/correcting and minimum distance

A

A code C of length n is

(i) t-error-detecting iff dₕ(C)≥t+1
(ii) t-error-correcting iff dₕ(C)≥2t+1
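The proposition can be sanity-checked by brute force on small codes (my own sketch and helper names): detection means no corruption of at most t flips turns a codeword into a different codeword, and correction means minimum distance decoding recovers the sent codeword from any such corruption. The triple repetition code has dₕ = 3, so it should be 2-error-detecting and 1-error-correcting but no better.

```python
from itertools import combinations

def flips_upto(word, t):
    """All words obtainable from `word` by flipping at most t bits."""
    out = set()
    for k in range(t + 1):
        for pos in combinations(range(len(word)), k):
            w = list(word)
            for i in pos:
                w[i] = "10"[int(w[i])]
            out.add("".join(w))
    return out

def detects(code, t):
    """t-error-detecting: <= t flips never turn a codeword into a
    different codeword, so the error is always noticed."""
    return all(flips_upto(c, t) & code == {c} for c in code)

def corrects(code, t):
    """t-error-correcting: minimum distance decoding recovers c from
    any corruption of c by at most t flips."""
    def mdd(x):
        return min(sorted(code),
                   key=lambda c: sum(a != b for a, b in zip(x, c)))
    return all(mdd(y) == c for c in code for y in flips_upto(c, t))

REP3 = {"000", "111"}  # minimum distance 3
```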
