Communication Chapter 3 Flashcards

1
Q

Symmetric channel: p₁₀ = p₀₁ =

A

p₁₀ = p₀₁ = p

2
Q

Symmetric channel: p₀₀ = p₁₁ =

A

p₀₀ = p₁₁ = 1-p

3
Q

noisy channel probabilities satisfy

A

p₀₀ + p₀₁ = 1 = p₁₀ + p₁₁

4
Q

crossover probability

A

the probability p that a 0 is flipped to a 1, which is equal to the probability that a 1 is flipped to a 0

5
Q

if p = 0 then

A

the channel is noiseless

6
Q

if p = 1/2 then

A

the channel is useless (it is too noisy to decipher any of the messages)

7
Q

What is the probability that precisely i errors occur in n transmitted bits, with crossover probability p

A

(n choose i) pⁱ (1-p)ⁿ⁻ⁱ

i.e. the binomial distribution
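
A quick sanity check of this formula in Python (the function name is my own):

```python
from math import comb

def prob_exactly_i_errors(n: int, i: int, p: float) -> float:
    """Probability that exactly i of n transmitted bits are flipped
    on a binary symmetric channel with crossover probability p."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

# e.g. n = 3, p = 0.1: exactly one error occurs with
# probability 3 * 0.1 * 0.9**2 = 0.243
```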

8
Q

Channel Encoding map

A

f:A->{0,1}ⁿ

9
Q

Channel decoding maps

A

g: {0,1}ⁿ -> A such that g(f(x))=x for every x∈A.
h: {0,1}ⁿ -> C such that h(y)=y for all y∈C.

10
Q

length of a code

A

we say a code C has length n if all its codewords have length n

11
Q

If a code has length n then the code is automatically…

A

it is automatically prefix-free (no codeword can be a proper prefix of another, since all codewords have equal length)

12
Q

A

A

the alphabet which we are encoding (or decoding to)

13
Q

probability of wrong decoding (in words)

A

Pₑᵣᵣ(C) = the maximum, over all codewords c∈C, of the sum over all y∈{0,1}ⁿ with h(y)≠c of the probability that y is received given that c is sent

14
Q

Transmission rate formula

A

R(C) = log₂|C|/n
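
A minimal Python sketch of this formula (the function name is illustrative):

```python
from math import log2

def transmission_rate(C: set) -> float:
    """R(C) = log2|C| / n for a code C whose codewords all have length n."""
    n = len(next(iter(C)))              # length shared by every codeword
    assert all(len(c) == n for c in C)  # C must have fixed length n
    return log2(len(C)) / n

# The repetition code {'000', '111'} has rate log2(2)/3 = 1/3.
```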

15
Q

The higher the transmission rate…

A

the more efficiently the codewords are sent

16
Q

Transmission rate is always less than or equal to

A

1

17
Q

|A|=

A

|C|

18
Q

How to calculate the smallest code length you can achieve for a code C of size |C|

A

⌈log₂|C|⌉

i.e. there are 2ˣ binary strings of length x, so we need the smallest x with 2ˣ ≥ |C|
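
In Python this is the ceiling of log₂|C| (helper name is my own):

```python
from math import ceil, log2

def smallest_length(size: int) -> int:
    """Smallest n with 2**n >= size: the shortest fixed length that
    gives enough binary strings for `size` codewords."""
    return ceil(log2(size))

# 5 codewords need length 3 (2**2 = 4 is too few, 2**3 = 8 suffices)
```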

19
Q

if C is an [n,k] code then R(C)=

A

R(C)=k/n

20
Q

How to check a decoding is valid

A

check that g(f(x))=x for every x∈A

21
Q

Hamming Distance

A

The Hamming distance, denoted dₕ(x,y), between x = x₁…xₙ ∈ {0,1}ⁿ and y = y₁…yₙ ∈ {0,1}ⁿ is the number of places where the two strings differ:
dₕ(x,y) = |{i∈[n]: xᵢ≠yᵢ}|
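
The definition translates directly into Python (function name is my own):

```python
def hamming_distance(x: str, y: str) -> int:
    """d_H(x, y): the number of positions where x and y differ."""
    assert len(x) == len(y)  # only defined for strings of equal length
    return sum(xi != yi for xi, yi in zip(x, y))

# hamming_distance('0110', '0011') == 2: the strings differ in two places
```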

22
Q

metric axioms

A
  1. Positivity: dₕ(x,y) ≥ 0, with equality if and only if x = y
  2. Symmetry: dₕ(x,y) = dₕ(y,x)
  3. Triangle inequality: dₕ(x,z) ≤ dₕ(x,y) + dₕ(y,z)
23
Q

The bigger the minimum distance…

A

the further apart the codewords are from each other, which makes wrong decoding less likely

24
Q

minimum distance decoding informal

A

A decoding h is a minimum distance decoding if, for every string x∈{0,1}ⁿ, h(x) is a codeword that is closest to x

25
Q

if two codewords are equally close to c in a minimum distance decoding…

A

choose either

26
Q

How to work out a minimum distance decoding

A
  • work out the Hamming distance between the received word and every codeword of C
  • decode to the codeword with the smallest Hamming distance
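
The two steps above, sketched in Python (ties are broken arbitrarily by min, which the definition permits):

```python
def min_distance_decode(y: str, C: set) -> str:
    """Return a codeword of C closest to y in Hamming distance."""
    def d(a, b):
        return sum(ai != bi for ai, bi in zip(a, b))
    # min picks a codeword attaining the smallest distance; when two
    # codewords are equally close, either choice is valid
    return min(C, key=lambda c: d(y, c))

# With the repetition code {'000', '111'}, the received word '010'
# decodes to '000' (distance 1 versus distance 2).
```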
27
Q

We want Pₑᵣᵣ (C) to be…

A

as small as possible

28
Q

How to tell if a code is t-error-detecting

A

C is t-error-detecting if changing any codeword in at least one and at most t positions never produces another codeword of C

29
Q

How to tell if a code is t-error-correcting

A

C is t-error-correcting if, whenever a codeword is received with at most t errors, minimum distance decoding returns the codeword that was sent

30
Q

A (repetition code) string is wrongly decoded via a minimum distance decoding if…

A

it contains at least (n+1)/2 errors, where n is the (odd) length of the codewords

31
Q

how many strings of length x

A

2ˣ

32
Q

number of strings of length n that differ from x in exactly i positions

A

n choose i

33
Q

Probability of erroneous decoding for the repetition code

A

Pₑᵣᵣ(C) = P(Bin(n,p) ≥ (n+1)/2)
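
This tail probability can be computed directly from the binomial formula (a sketch, assuming n is odd):

```python
from math import comb

def p_err_repetition(n: int, p: float) -> float:
    """P(Bin(n, p) >= (n+1)/2): decoding fails when at least (n+1)/2
    of the n repeated bits are flipped (n odd)."""
    t = (n + 1) // 2
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(t, n + 1))

# For n = 3 and p = 0.1: 3 * 0.1**2 * 0.9 + 0.1**3 = 0.028
```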

34
Q

Hamming Distance Equation

A

dₕ(x,y) = |{i∈[n]: xᵢ≠yᵢ}| = Σᵢ₌₁ⁿ |xᵢ-yᵢ|

35
Q

Probability of erroneous decoding for a minimum distance decoding

A

Pₑᵣᵣ(C) = maximum over all c∈C of the sum, over all y∈{0,1}ⁿ with h(y)≠c, of p^(dₕ(c,y)) · (1-p)^(n - dₕ(c,y))
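
For small n this can be evaluated by brute force over all 2ⁿ received strings (an illustrative sketch; the names are my own):

```python
from itertools import product

def p_err(C: set, h, p: float) -> float:
    """P_err(C): the worst case, over sent codewords c, of the
    probability that the decoder h returns something other than c."""
    n = len(next(iter(C)))

    def d(a, b):
        return sum(ai != bi for ai, bi in zip(a, b))

    def wrong(c):
        # sum P(y received | c sent) over all y decoded to a wrong codeword
        return sum(p ** d(c, y) * (1 - p) ** (n - d(c, y))
                   for y in map(''.join, product('01', repeat=n))
                   if h(y) != c)

    return max(wrong(c) for c in C)
```

With C = {'000', '111'} and minimum distance decoding, this reproduces P(Bin(3,p) ≥ 2) from the repetition-code card.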