Information Theory Flashcards
Information Theory
the scientific study of the quantification, storage, transmission, and processing of information
Entropy
the amount of uncertainty in the value of a random variable or the outcome of a random process
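*quantitatively (Shannon's formula): for outcomes with probabilities p(x), H = -Σ p(x) · log2 p(x), measured in bits
*e.g. a fair coin flip has H = -(0.5 · log2 0.5 + 0.5 · log2 0.5) = 1 bit of uncertainty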
Information
the resolution of uncertainty
Bit
binary digit (1 or 0) that is the most basic unit of information
*1 = on/yes, 0 = off/no
*any communication signal can be broken down into a series of yes/no questions
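*e.g. identifying one item out of 8 equally likely options takes log2 8 = 3 yes/no questions, so the answer carries 3 bits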
Noise
anything that stands between the message and its destination and can cause distortion of the signal
Channel Capacity
the maximum rate of information that can be sent over a channel
*if the rate of transmission is pushed above the channel capacity, errors can no longer be avoided
Symbol Rate
the number of symbol transfers per second
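*not the same as bit rate: if each symbol can take one of 4 values, a symbol carries log2 4 = 2 bits, so the bit rate is twice the symbol rate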
Fundamental Theorem for a Noiseless Channel
- The amount of scatter in a symbol set determines the minimum channel capacity required, under conditions of maximally efficient coding
- The amount of scatter in a symbol set determines the minimum coding efficiency required to transmit a signal drawn from that symbol set over a channel with a given maximum capacity
Scatter
how many different symbols there are in the set
e.g. binary code has a scatter of 2, the alphabet has a scatter of 26
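*rough link to the theorem: with maximally efficient coding, a set of N equally likely symbols needs about log2 N bits per symbol, e.g. log2 2 = 1 bit for binary code and log2 26 ≈ 4.7 bits for the alphabet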
According to the fundamental theorem for a noiseless channel, how is it possible to create a code that enables nearly error-free transmission?
R (transmission rate) < C (channel capacity)
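*in Shannon's terms: with source entropy H (bits per symbol) and channel capacity C (bits per second), symbols can be sent at any rate below C/H symbols per second with arbitrarily few errors, but not faster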
Maximum Efficiency
using the least number of bits to transmit information without losing any of it
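*e.g. when some symbols occur far more often than others, an efficient code assigns the frequent symbols shorter codewords, so the average number of bits per symbol approaches the source entropy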
What does higher entropy indicate? Lower entropy?
higher entropy = more uncertainty
lower entropy = more predictability
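*e.g. a heavily biased coin (90% heads) has H ≈ 0.47 bits, lower than a fair coin's 1 bit, because its outcome is more predictable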
According to the Fundamental Theorem for a discrete channel with noise, inaccuracy can be brought close to zero so long as what two conditions are met?
- The noise entropy in the channel must be less than the message entropy
key terms: noise entropy and message entropy
- The sum of the source entropy + noise entropy in the channel cannot exceed the channel capacity
key terms: source entropy, noise entropy, channel capacity
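*in symbols (H for entropy, C for channel capacity): H(noise) < H(message), and H(source) + H(noise) ≤ C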
Noise Entropy
the amount of uncertainty introduced by the noise in the communication channel
*not a measure of the noise itself, but the uncertainty caused by the noise
Message Entropy
the amount of information or complexity in the actual message being sent