Basic Info Theory Flashcards

1
Q

What is the amount of information yielded (entropy) by the outcome of a future event if you already know what the outcome will be?

A

0
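A minimal sketch of why (assuming Shannon entropy in bits, with the usual convention 0 · log 0 = 0): a distribution that puts all of its probability mass on one already-known outcome has entropy exactly 0.

```python
import math

def entropy(p):
    # Shannon entropy in bits; skip zero-probability terms (0 * log 0 = 0).
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([1.0, 0.0, 0.0]))  # 0.0 -- a certain outcome yields no information
print(entropy([0.5, 0.5]))       # 1.0 -- a fair coin flip yields 1 bit
```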

2
Q

The information content of an event E is what?

A

A function that increases as the probability p(E) of the event decreases
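One standard choice of such a function (an assumption here; the card only requires that it decrease in p(E)) is the self-information I(E) = -log2 p(E):

```python
import math

def information_content(p_event):
    # Self-information in bits: -log2 p(E). Grows as p(E) shrinks.
    return -math.log2(p_event)

for p in (0.9, 0.5, 0.1, 0.01):
    print(f"p(E) = {p:<5} I(E) = {information_content(p):.2f} bits")
# p(E) = 0.9   I(E) = 0.15 bits
# p(E) = 0.5   I(E) = 1.00 bits
# p(E) = 0.1   I(E) = 3.32 bits
# p(E) = 0.01  I(E) = 6.64 bits
```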

3
Q

If p(E) is close to zero then what do we know about the information content of that event?

A

It is high. We learned a lot from it [“wow, that was super surprising; we need to update our models of the world a LOT”]
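A concrete comparison, using the same (assumed) -log2 p(E) measure as in the previous card:

```python
import math

# A near-impossible event is very surprising; a near-certain one barely registers.
print(-math.log2(0.001))  # ~9.97 bits: rare event, big model update
print(-math.log2(0.999))  # ~0.001 bits: expected event, almost nothing learned
```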

4
Q

The smaller the cross entropy, the ______ two probability distributions are

A

more similar
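A minimal sketch (assuming the standard definition H(p, q) = -sum_i p_i log2 q_i): cross entropy is smallest, and equal to the entropy H(p), exactly when q matches p.

```python
import math

def cross_entropy(p, q):
    # H(p, q) = -sum_i p_i * log2(q_i), in bits; skip terms where p_i = 0.
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]
print(cross_entropy(p, p))                # ~1.16 -- minimum: equals H(p) when q = p
print(cross_entropy(p, [0.6, 0.3, 0.1]))  # ~1.20 -- q close to p: slightly larger
print(cross_entropy(p, [0.1, 0.2, 0.7]))  # ~2.84 -- q far from p: much larger
```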

5
Q

What is a discrete probability distribution?

A

A vector whose elements lie in [0, 1] and sum to 1
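A quick validity check matching that definition (the floating-point tolerance is an added assumption):

```python
def is_distribution(v, tol=1e-9):
    # All elements in [0, 1] and the total mass sums to 1 (within tolerance).
    return all(0.0 <= x <= 1.0 for x in v) and abs(sum(v) - 1.0) <= tol

print(is_distribution([0.2, 0.5, 0.3]))  # True
print(is_distribution([0.2, 0.9]))       # False: sums to 1.1
print(is_distribution([1.2, -0.2]))      # False: elements outside [0, 1]
```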
