Quantifying information Flashcards
How is information measured?
Information is measured by unexpectedness, or surprise: the lower the probability of an event,
the more unexpected its occurrence is and the more information it conveys.
The equation for the relationship between the probability of an event and information?
The equation for the measure of information?
The equation for information when there are different probabilities?
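A common textbook form answering the three preceding questions, writing $P_k$ for the probability of the $k$-th symbol (notation assumed here): the information conveyed by a symbol varies inversely with its probability, and when the probabilities differ each symbol carries

$$I_k = \log_2\!\left(\frac{1}{P_k}\right) = -\log_2 P_k \ \text{bits},$$

so a less probable (more surprising) symbol conveys more information.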
The equation for average information (entropy)?
Entropy equation?
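A standard form for both of the entropy questions above, for a source with $M$ symbols of probabilities $P_1, \dots, P_M$ (notation assumed here):

$$H = \sum_{k=1}^{M} P_k \log_2\!\left(\frac{1}{P_k}\right) = -\sum_{k=1}^{M} P_k \log_2 P_k \ \text{bits/symbol}.$$

Entropy is greatest, at $H_{\max} = \log_2 M$ bits/symbol, when all $M$ symbols are equiprobable.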
Define information rate?
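A common definition, writing $r$ for the symbol (signalling) rate in symbols per second and $H$ for the source entropy (notation assumed here): the information rate is the average information produced per unit time,

$$R = rH \ \text{bits per second (b/s)}.$$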
When will the information rate be at a maximum?
When a binary signal (with equiprobable symbols) is transmitted. The information rate is then measured in bits per second (b/s).
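As a quick check of this card: a binary source has maximum entropy when its two symbols are equiprobable,

$$H_{\max} = \tfrac{1}{2}\log_2 2 + \tfrac{1}{2}\log_2 2 = 1 \ \text{bit/symbol}, \qquad R_{\max} = r \times 1 = r \ \text{b/s},$$

so the information rate then equals the symbol rate $r$.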
The unit Shannon?
A unit of information content equivalent to the bit: one shannon (Sh) is the information gained when an event of probability 1/2 occurs.
What is an anti-aliasing filter?
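A standard definition, with $f_s$ denoting the sampling frequency (symbol assumed here): an anti-aliasing filter is a low-pass filter applied to the analogue signal before it is sampled, restricting the signal's highest frequency component so that

$$f_{\max} \le \frac{f_s}{2},$$

which prevents components above half the sampling rate from folding back (aliasing) into the sampled signal.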
Describe the channel capacity?
Given an ideal low-pass channel with bandwidth B, the maximum rate at which pulses
can be transmitted is 2B pulses per second. If Q equiprobable pulse levels (symbols)
are discernible at the channel output, then information can be transmitted through the
channel at a rate given by the channel capacity.
The equation for channel capacity?
C = 2B log2 Q binits/s
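A quick illustrative calculation (the figures are assumed, not from the card): a channel with bandwidth $B = 3\ \text{kHz}$ and $Q = 4$ discernible levels gives

$$C = 2B\log_2 Q = 2 \times 3000 \times \log_2 4 = 12\,000 \ \text{binits/s}.$$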
The equation for the Shannon-Hartley law?
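The usual statement, for a channel of bandwidth $B$ with signal-to-noise power ratio $S/N$ (expressed as a ratio, not in dB):

$$C = B\log_2\!\left(1 + \frac{S}{N}\right) \ \text{b/s}.$$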
The equation for code efficiency?
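Definitions vary between texts, so this is one common form rather than the only one: code efficiency compares the actual average information per symbol with the maximum possible,

$$\eta = \frac{H}{H_{\max}},$$

while source-coding treatments often write the equivalent idea as $\eta = H/\bar{L}$, with $\bar{L}$ the average codeword length in binary digits.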
The equation for redundancy?
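With $\eta$ the code efficiency from the previous card, redundancy is usually the complementary fraction,

$$\text{Redundancy} = 1 - \eta,$$

often quoted as a percentage, $(1 - \eta) \times 100\%$.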