Information Theory Flashcards

1
Q

Information Theory

A

the scientific study of the quantification, storage, transmission, and processing of digital information

2
Q

Entropy

A

the amount of uncertainty in the value of a random variable or the outcome of a random process
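
Shannon's formula makes this concrete: for a discrete random variable, H = -Σ p·log₂(p) over the possible outcomes. A minimal sketch in Python (the function name and example distributions are illustrative):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum of p * log2(p) over outcomes with p > 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit: a fair coin flip
print(entropy([1/6] * 6))   # ~2.585 bits: a fair six-sided die
```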

3
Q

Information

A

the resolution of uncertainty

4
Q

Bit

A

a binary digit (1 or 0) that is the most basic unit of information

*1 = on/yes, 0 = off/no
*any communication signal can be reduced to a series of yes/no questions, as the sketch below illustrates
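
A small sketch of the yes/no-question idea: singling out one of N equally likely alternatives takes about log₂(N) binary questions, i.e. log₂(N) bits (the function name and values are illustrative):

```python
from math import ceil, log2

def bits_needed(n_alternatives):
    """Minimum number of yes/no questions (bits) to single out one of n alternatives."""
    return ceil(log2(n_alternatives))

print(bits_needed(2))   # 1: one yes/no question settles a coin flip
print(bits_needed(26))  # 5: five questions pick out one letter of the alphabet
```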

5
Q

Noise

A

anything that stands between the message and its destination and can cause distortion of the signal

6
Q

Channel Capacity

A

the maximum rate at which information can be reliably sent over a channel

*if the transmission rate is increased too far, the channel capacity can be exceeded
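
One standard concrete case, going slightly beyond what the card names: a binary symmetric channel that flips each bit with probability p has capacity C = 1 - H(p), so the noise entropy eats into the 1 bit per use a clean channel would carry. A sketch (function names are illustrative):

```python
from math import log2

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p): the uncertainty added per bit."""
    if p in (0, 1):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p_flip):
    """Capacity of a binary symmetric channel: 1 bit/use minus the noise entropy."""
    return 1 - binary_entropy(p_flip)

print(bsc_capacity(0.0))   # 1.0 bit/use: noiseless
print(bsc_capacity(0.11))  # ~0.5 bits/use: noise halves the usable rate
```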

7
Q

Symbol Rate

A

the number of symbols transmitted per second

8
Q

Fundamental Theorem for a Noiseless Channel

A

1. The amount of scatter in a symbol set determines the minimum channel capacity required, assuming maximally efficient coding
2. The amount of scatter in a symbol set determines the minimum coding efficiency required to transmit a signal drawn from that symbol set over a channel with a given maximum capacity

9
Q

Scatter

A

how many different symbols there are in the set

e.g. binary code has a scatter of 2; the English alphabet has a scatter of 26
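
Tying scatter back to the noiseless-channel theorem above: for equally likely symbols, the minimum capacity required is roughly the symbol rate times log₂(scatter). A toy calculation (the rates are made-up examples):

```python
from math import log2

def min_capacity(symbol_rate, scatter):
    """Minimum channel capacity (bits/second) for equiprobable symbols, losslessly coded."""
    return symbol_rate * log2(scatter)

print(min_capacity(100, 2))   # 100.0 bits/s: binary code at 100 symbols/s
print(min_capacity(100, 26))  # ~470.04 bits/s: 26-letter alphabet at 100 symbols/s
```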

10
Q

According to the Fundamental Theorem for a Noiseless Channel, how is it possible to create a code that enables nearly error-free transmission?

A

R (transmission rate) < C (channel capacity)

11
Q

Maximum Efficiency

A

using the fewest bits possible to transmit information without losing any of it

12
Q

What does higher entropy indicate? Lower entropy?

A

higher entropy = more uncertainty
lower entropy = more predictability
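
A quick numeric contrast using Shannon's formula (the distributions are illustrative):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # 1.0 bit: a fair coin, maximum uncertainty
print(entropy([0.99, 0.01]))  # ~0.081 bits: a heavily biased coin, very predictable
```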

13
Q

According to the Fundamental Theorem for a Discrete Channel with Noise, inaccuracy can be brought close to zero so long as what two conditions are met?

A

1. The noise entropy in the channel must be less than the message entropy
   key terms: noise entropy and message entropy
2. The sum of the source entropy + noise entropy in the channel cannot exceed the channel capacity
   key terms: source entropy, noise entropy, channel capacity

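
A toy check of these two conditions exactly as the card states them (the function name and all numbers are made up for illustration):

```python
def near_error_free_possible(source_entropy, message_entropy, noise_entropy, capacity):
    """Apply the card's two conditions for a discrete channel with noise."""
    cond1 = noise_entropy < message_entropy             # condition 1
    cond2 = source_entropy + noise_entropy <= capacity  # condition 2
    return cond1 and cond2

# Illustrative values, in bits per symbol
print(near_error_free_possible(1.2, 1.5, 0.4, 2.0))  # True: both conditions hold
print(near_error_free_possible(1.8, 1.5, 0.5, 2.0))  # False: 1.8 + 0.5 exceeds 2.0
```
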
14
Q

Noise Entropy

A

the amount of uncertainty introduced by the noise in the communication channel

*not a measure of the noise itself, but the uncertainty caused by the noise

15
Q

Message Entropy

A

the amount of information or complexity in the actual message being sent

16
Q

Source Entropy

A

the amount of uncertainty in the message being sent

17
Q

How does a discrete channel transmit information?

A

it transmits information in distinct, separate units or symbols