Communication - Chapter 4 Flashcards

1
Q

|Bⁿₖ(x)| =

A

bⁿₖ(x)
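
A minimal sketch of how this size can be computed (assuming the standard convention that Bⁿₖ(x) is the set of binary words of length n within Hamming distance k of x, so bⁿₖ is the sum of the binomial coefficients C(n,i) for i=0,…,k):

from math import comb

def hamming_ball_size(n: int, k: int) -> int:
    # |B^n_k(x)| does not depend on the centre x: it counts the words
    # obtained by flipping at most k of the n positions of x.
    return sum(comb(n, i) for i in range(k + 1))

print(hamming_ball_size(7, 1))  # 8 = C(7,0) + C(7,1)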

2
Q

Informal definition of a Hamming ball

A

all the words with Hamming distance at most k from x
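
A brute-force sketch of this informal definition (hypothetical helper names; it enumerates every binary word of length n and keeps those within distance k of the centre x):

from itertools import product

def hamming_distance(u, v):
    # number of positions in which the two words differ
    return sum(a != b for a, b in zip(u, v))

def hamming_ball(x, k):
    # all binary words of length len(x) within Hamming distance k of x
    n = len(x)
    return [w for w in product("01", repeat=n)
            if hamming_distance(w, tuple(x)) <= k]

print(len(hamming_ball("000", 1)))  # 4 words: 000, 100, 010, 001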

3
Q

H(P)=

A

-∑ᵢ₌₁ᵐ pᵢ log pᵢ
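
A minimal sketch of this formula (assuming log base 2, so entropy is measured in bits, and the usual convention that terms with pᵢ = 0 contribute 0):

from math import log2

def entropy(p):
    # H(P) = - sum_{i=1}^{m} p_i * log2(p_i), skipping zero probabilities
    return -sum(pi * log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit for a fair coin
print(entropy([0.9, 0.1]))  # about 0.469 bits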

4
Q

What does Shannon's noisy encoding theorem tell you?

A

Given a noisy channel with crossover probability p and a very small number ε, the theorem says you can construct a code with a low probability of decoding error and a high transmission rate, as long as the code is long enough.

5
Q

Shannon's noisy encoding theorem is…

A

best possible.

6
Q

length of a code C

A

for c∈C, |c| is the common length of the codewords

7
Q

size of a code C

A

the number of codewords in a code C
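
A toy illustration of length and size (the rate R(C) = log₂|C| / n is an assumed definition, included to match the R(C) that appears in the Shannon theorem cards):

from math import log2

code = {"000", "011", "101", "110"}   # a code with 4 codewords of length 3

length = len(next(iter(code)))        # length of the code: length of its codewords
size = len(code)                      # size of the code: number of codewords
rate = log2(size) / length            # R(C) = log2|C| / n (assumed definition)

print(length, size, round(rate, 3))   # 3 4 0.667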

8
Q

top number in the Hamming ball notation

A

the length n of the codewords

9
Q

bottom number in the Hamming ball notation

A

the radius k of the ball, i.e. the maximum Hamming distance from the centre

10
Q

the word in brackets in the Hamming ball notation

A

the centre x of the Hamming ball

11
Q

What does Shannon's noisy encoding theorem produce?

A

There exists n₀=n₀(p,ε) such that for every n≥n₀ there is a code C of length n with probability of wrong decoding less than ε and rate R(C)≥1-H(p+ε).
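
A numeric sketch of the guarantee (H is the binary entropy function; the values of p and ε below are illustrative choices, not from the source):

from math import log2

def binary_entropy(q):
    # H(q) = -q*log2(q) - (1-q)*log2(1-q), with H(0) = H(1) = 0
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

p, eps = 0.1, 0.01
print(1 - binary_entropy(p + eps))  # about 0.500: a rate the theorem guarantees is achievable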

12
Q

Hypothesis of Shannon's noisy encoding theorem

A

Given any p∈[0,1/2) and ε∈(0,1/2-p]
