Lecture 12: Information Theory Flashcards

1
Q

What is information?

A

Information is surprise.

E.g., if someone tells you something unexpected, you have gained information.

2
Q

Information Content. Explain.

A

Information content is usually expressed as the number of bits needed to communicate a message.
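A quick illustration (my own sketch, not from the lecture): N equally likely messages need about log2(N) bits.

```python
import math

def bits_needed(num_messages):
    """Whole bits needed to single out one of `num_messages` equally likely messages."""
    return math.ceil(math.log2(num_messages))

print(bits_needed(2))    # 1 bit distinguishes two messages
print(bits_needed(26))   # 5 bits are enough for 26 letters
print(bits_needed(256))  # 8 bits cover 256 byte values
```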

3
Q

Information + Probabilities

A

The amount of information associated with an event e occurring is defined by its probability of occurrence p(e):

I(e) = -log2(p(e)) = log2(1 / p(e))

where I(e) is the information (in bits) gained when event e occurs.
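A minimal Python sketch of this formula (the function name self_information is my own):

```python
import math

def self_information(p):
    """Information, in bits, of an event with probability p: I = -log2(p)."""
    return -math.log2(p)

print(self_information(0.5))  # 1.0 bit    (a fair coin flip)
print(self_information(1/6))  # ~2.58 bits (one face of a fair die)
print(self_information(1.0))  # 0.0 bits   (a certain event carries no surprise)
```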

4
Q

Three properties of information.

A
  1. An event cannot have negative information.
  2. More likely events carry less information.
  3. More (independent) events provide more information; their information adds up (checked numerically in the sketch below).
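A small numeric check of the three properties, assuming the self-information formula I(p) = -log2(p) from the previous card:

```python
import math

def I(p):
    """Self-information, in bits, of an event with probability p."""
    return -math.log2(p)

# 1. Information is never negative for any probability 0 < p <= 1.
assert all(I(p) >= 0 for p in (0.01, 0.25, 0.5, 1.0))

# 2. More likely events carry less information.
assert I(0.9) < I(0.1)

# 3. Independent events add up: I(p * q) == I(p) + I(q).
p, q = 0.5, 0.25
assert math.isclose(I(p * q), I(p) + I(q))

print("all three properties hold for these examples")
```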
5
Q

What is Entropy?

A

A source emits a stream of symbols drawn from a finite set called the alphabet. Entropy is the average information content per symbol of that source:

H = -sum over symbols s of p(s) * log2(p(s)) bits per symbol.
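A minimal sketch of the entropy formula above (the example distributions are my own):

```python
import math

def entropy(probs):
    """Average information per symbol, H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit    (fair coin)
print(entropy([0.25] * 4))   # 2.0 bits   (four equally likely symbols)
print(entropy([0.9, 0.1]))   # ~0.47 bits (a biased source is more predictable)
```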

6
Q

What is a variable-length code?

A

A mapping from strings of source symbols to strings of code symbols, called “code words” (see the encoding sketch after the examples below).

Examples of codes:
1) binary
2) ASCII
3) Morse
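A toy variable-length code (the code words here are a hypothetical example, not from the lecture); more frequent symbols get shorter code words:

```python
# Hypothetical variable-length code over a four-symbol alphabet.
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

def encode(source, code):
    """Replace each source symbol by its code word and concatenate."""
    return "".join(code[s] for s in source)

print(encode("abad", code))  # -> "0100111"
```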

7
Q

What are several desirable code properties?

A
  1. Uniquely decodable - every encoded string corresponds to only one source string.

  2. Instantaneous (prefix-free) - each code word can be decoded as soon as it has been received, without waiting for later symbols; equivalently, no code word is a prefix of another (a quick check is sketched below).
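A small sketch of the prefix check (the function name is_prefix_free and the code word sets are my own examples):

```python
def is_prefix_free(code_words):
    """True if no code word is a prefix of another (the instantaneous/prefix condition)."""
    for w in code_words:
        for v in code_words:
            if w != v and v.startswith(w):
                return False
    return True

print(is_prefix_free(["0", "10", "110", "111"]))  # True  -> decodable symbol by symbol
print(is_prefix_free(["0", "01", "11"]))          # False -> "0" is a prefix of "01"
```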

8
Q

Codes as Trees. Explain

A
  1. A tree is a graph with one distinguished node called the root.
  2. Every other node is connected to a unique parent node and may be connected to child nodes.
  3. A node with no child nodes is called a leaf.
  4. The edges leaving a node are each labelled with a different symbol of the coding alphabet, so each code word is the path of edge labels from the root to a leaf (see the decoding sketch below).
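A sketch of decoding with a code tree, reusing the hypothetical code from the variable-length-code card (the nested-dict tree representation is my own choice):

```python
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

def build_tree(code):
    """Insert each code word as a root-to-leaf path; the leaf stores the source symbol."""
    root = {}
    for symbol, word in code.items():
        node = root
        for bit in word[:-1]:
            node = node.setdefault(bit, {})  # follow/create the edge for this bit
        node[word[-1]] = symbol              # the last edge leads to a leaf
    return root

def decode(bits, tree):
    """Walk the tree edge by edge; emit a symbol and restart at the root at each leaf."""
    out, node = [], tree
    for bit in bits:
        node = node[bit]
        if isinstance(node, str):  # reached a leaf
            out.append(node)
            node = tree
    return "".join(out)

tree = build_tree(code)
print(decode("0100111", tree))  # -> "abad"
```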