Basic Info Theory Flashcards
1
Q
What is the amount of information (entropy) yielded by the outcome of a future event if you already know what the outcome will be?
A
0
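A one-line check using the standard Shannon self-information (base-2 logs assumed, giving bits): a known outcome has p(E) = 1, so observing it yields

$$I(E) = -\log_2 p(E) = -\log_2 1 = 0 \text{ bits.}$$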
2
Q
The information content of an event E is what?
A
A function that increases as the probability p(E) of the event decreases
3
Q
If p(E) is close to zero then what do we know about the information content of that event?
A
It is high. We learned a lot from it [“wow, that was super surprising. We need to update our models of the world a LOT”]
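A minimal sketch of cards 2 and 3, assuming the standard base-2 self-information I(E) = -log2 p(E) (the function name is mine): as p(E) shrinks, the information content grows.

```python
import math

def information_content(p: float) -> float:
    """Shannon self-information in bits: -log2 p."""
    return -math.log2(p)

# Rarer events carry more information: as p(E) shrinks, -log2 p(E) grows.
for p in (0.99, 0.5, 0.01):
    print(f"p(E) = {p:<5} -> {information_content(p):.2f} bits")
```

The near-certain event (p = 0.99) yields about 0.01 bits, while the surprising one (p = 0.01) yields about 6.64 bits.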
4
Q
The smaller the cross-entropy, the ______ two probability distributions are
A
more similar
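A minimal sketch of the discrete cross-entropy H(p, q) = -Σ p_i log2 q_i (function and variable names are mine): it is smallest, equal to the entropy H(p), when q matches p, and grows as q drifts away.

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log2(q_i), in bits."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q))

p = [0.5, 0.5]
q_similar = [0.6, 0.4]
q_different = [0.9, 0.1]
print(cross_entropy(p, q_similar))    # ~1.03 bits, close to H(p) = 1 bit
print(cross_entropy(p, q_different))  # ~1.74 bits, the distributions differ more
```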
5
Q
What is a discrete probability distribution?
A
A vector whose elements lie in [0, 1] and sum to 1
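A small checker for that definition (the name and the float-round-off tolerance are my assumptions): every element must lie in [0, 1] and the elements must sum to 1.

```python
def is_discrete_distribution(v, tol=1e-9):
    """True if v is a valid discrete probability distribution:
    every element in [0, 1] and the elements summing to 1."""
    return all(0.0 <= x <= 1.0 for x in v) and abs(sum(v) - 1.0) <= tol

print(is_discrete_distribution([0.2, 0.3, 0.5]))  # True
print(is_discrete_distribution([0.2, 0.9]))       # False: sums to 1.1
```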