Week 6 Flashcards
What is Information Theory?
It answers the question: how can information be quantified?
How is information quantified in Information Theory?
In terms of probability distributions and “uncertainty” (surprise), disregarding semantics.
What are the three axioms on which Shannon proposed Self-Information?
- An event with probability 100% is perfectly unsurprising, yielding no information.
- The smaller the probability, the greater the surprise/information of an event.
- If two independent events are measured separately, the total information is the sum of their individual self-informations.
What is the equation for Self-Information?
Given random variable X:
$$I_X(x) = -\log_b\!\left[P_X(x)\right] = \log_b \frac{1}{P_X(x)}$$
for a choice of base b.
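As a quick worked example, a fair coin flip (probability 1/2) carries
$$I_X(\text{heads}) = -\log_2 \tfrac{1}{2} = 1 \text{ bit.}$$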
What are the different bases possible for Self-Information?
2: bits
e: nats (natural units)
10: dits (also called hartleys or bans)
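A minimal Python sketch of self-information in the three units (the function name and example probability are illustrative, not from the course):

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Self-information -log_b(p) of an event with probability p."""
    return -math.log(p, base)

p = 0.25  # illustrative event probability
print(self_information(p, 2))        # 2.0 bits
print(self_information(p, math.e))   # ~1.386 nats
print(self_information(p, 10))       # ~0.602 dits
```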
What does Entropy quantify?
The average uncertainty (expected self-information) of a random variable X.
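In the notation above, the standard definition is
$$H(X) = -\sum_x P_X(x) \log_b P_X(x) = \mathbb{E}\left[I_X(X)\right].$$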
What is Joint Entropy?
A measure of uncertainty associated with a set of variables.
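For a pair of variables, the standard definition is
$$H(X, Y) = -\sum_{x, y} P_{X,Y}(x, y) \log_b P_{X,Y}(x, y).$$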
What is Conditional Entropy?
Quantifies the remaining uncertainty in the outcome of a random variable Y given that the outcome of another random variable X is known.
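In the same notation, the standard definition is
$$H(Y \mid X) = -\sum_{x, y} P_{X,Y}(x, y) \log_b P_{Y \mid X}(y \mid x) = H(X, Y) - H(X).$$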
What is KL Divergence?
Relative entropy: quantifies how one probability distribution differs from another (not a true distance, since it is not symmetric).
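For distributions P and Q over the same space, the standard definition is
$$D_{\mathrm{KL}}(P \,\|\, Q) = \sum_x P(x) \log_b \frac{P(x)}{Q(x)}.$$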
What is Mutual Information?
Measures how much knowing one variable reduces the uncertainty about the other.
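Equivalently, in the notation of the earlier cards,
$$I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = D_{\mathrm{KL}}\big(P_{X,Y} \,\|\, P_X P_Y\big).$$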