Week 6 Flashcards

1
Q

What is Information Theory?

A

It answers “how to quantify information?”

2
Q

How is information quantified in Information Theory?

A

In terms of probability distributions and “uncertainty” (surprise), disregarding semantics.

3
Q

What are the three axioms on which Shannon based Self-Information?

A
  • An event with probability 100% is perfectly unsurprising and yields no information.
  • The smaller an event's probability, the greater its surprise and the more information it carries.
  • If two independent events are measured separately, the total information is the sum of the individual self-informations (illustrated below).
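
A brief illustration of the additivity axiom (not part of the original card; it assumes the logarithmic definition given on the next card). For independent events x and y, P(x, y) = P(x) P(y), so

    I(x, y) = -\log_b\big[P(x)\,P(y)\big] = -\log_b P(x) - \log_b P(y) = I(x) + I(y)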
4
Q

What is the equation for Self-Information?

A

Given random variable X:

I_X(x) = -\log_b\big[P_X(x)\big] = \log_b \frac{1}{P_X(x)}

with different bases b.
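
A quick worked example (added for reference, taking base b = 2): for a fair coin toss,

    P_X(\text{heads}) = \tfrac{1}{2} \;\Rightarrow\; I_X(\text{heads}) = -\log_2 \tfrac{1}{2} = 1 \text{ bit}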

5
Q

What are the different bases possible for Self-Information?

A

b = 2: bits
b = e: nats (natural units)
b = 10: dits (also called hartleys)
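
Since \log_b x = \ln x / \ln b, the units differ only by constant factors; two conversions added here as a reminder (not part of the original card):

    1 \text{ nat} = \log_2 e \approx 1.443 \text{ bits}, \qquad 1 \text{ dit} = \log_2 10 \approx 3.322 \text{ bits}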

6
Q

What does Entropy quantify?

A

The average uncertainty in a random variable X.
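
The card states only the idea; the standard formula, written to match the self-information notation above, is the expected self-information:

    H(X) = \mathbb{E}\big[I_X(x)\big] = -\sum_{x} P_X(x)\,\log_b P_X(x)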

7
Q

What is Joint Entropy?

A

A measure of the uncertainty associated with a set of random variables considered jointly.
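
For reference (not on the original card), the standard definition for a pair of discrete random variables X and Y, consistent with the entropy formula above:

    H(X, Y) = -\sum_{x}\sum_{y} P_{X,Y}(x, y)\,\log_b P_{X,Y}(x, y)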

8
Q

What is Conditional Entropy?

A

It quantifies the uncertainty remaining in the outcome of a random variable Y once the outcome of another random variable X is known.
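
The standard definition and its chain-rule form, added for reference and consistent with the joint entropy above:

    H(Y \mid X) = -\sum_{x}\sum_{y} P_{X,Y}(x, y)\,\log_b P_{Y \mid X}(y \mid x) = H(X, Y) - H(X)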

9
Q

What is KL Divergence?

A

Also known as relative entropy; it quantifies how much one probability distribution differs from another (a "distance" only in a loose sense, since it is not symmetric).
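
For discrete distributions P and Q over the same alphabet, the standard formula (added for reference):

    D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x)\,\log_b \frac{P(x)}{Q(x)}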

10
Q

What is Mutual Information?

A

Measures how much knowing one variable reduces the uncertainty about the other.
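
The usual identities, added for reference and consistent with the entropies defined above:

    I(X; Y) = H(X) - H(X \mid Y) = H(X) + H(Y) - H(X, Y) = \sum_{x}\sum_{y} P_{X,Y}(x, y)\,\log_b \frac{P_{X,Y}(x, y)}{P_X(x)\,P_Y(y)}

A minimal NumPy sketch tying these quantities together; the joint table is made up purely for illustration and is not from the course material:

    import numpy as np

    # Hypothetical joint distribution P(X, Y) over two binary variables.
    p_xy = np.array([[0.30, 0.20],
                     [0.10, 0.40]])

    p_x = p_xy.sum(axis=1)   # marginal P(X)
    p_y = p_xy.sum(axis=0)   # marginal P(Y)

    def entropy(p):
        """Shannon entropy in bits, skipping zero-probability entries."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    h_x, h_y = entropy(p_x), entropy(p_y)
    h_xy = entropy(p_xy.ravel())   # joint entropy H(X, Y)
    h_y_given_x = h_xy - h_x       # conditional entropy H(Y | X)
    mi = h_x + h_y - h_xy          # mutual information I(X; Y)

    print(f"H(X)={h_x:.3f}  H(Y)={h_y:.3f}  H(X,Y)={h_xy:.3f}")
    print(f"H(Y|X)={h_y_given_x:.3f}  I(X;Y)={mi:.3f}")

Running it prints the entropies in bits and a small positive I(X; Y), showing that the two variables in this made-up table are dependent.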
