Information Theory in Neuroscience Flashcards
What is the aim of information theory?
It helps us understand how to efficiently encode, transmit, and decode data.
Define Entropy
Measure of uncertainty or unpredictability in a set of possible outcomes
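For a discrete variable with outcome probabilities p_i, this is quantified as H = -Σ_i p_i log2(p_i), measured in bits. A minimal sketch in Python (the function name entropy and the example distributions are illustrative):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin is more predictable
```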
Define Information
The reduction of uncertainty: when a message is received, it provides information by reducing the set of possible unknowns.
Define “Compression”
Technique to represent information with fewer bits by eliminating redundancy.
Define “Channel capacity”
The maximal amount of information that can be transmitted over a communication channel without error.
Define “Error correction”
Methods to detect and correct errors in data transmission or storage.
In which unit is information measured?
In bits.
How much information is contained in a heads-or-tails experiment?
1 bit (50% chance of heads, 50% chance of tails: H = -(0.5 log2(0.5) + 0.5 log2(0.5)) = 1)
Define what “bits” refers to in information theory
The average number of yes/no questions required to ascertain the value of a variable.
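For N equally likely states, this works out to log2(N) questions, since each well-chosen question halves the remaining possibilities. A minimal sketch in Python (the function name questions_needed is illustrative):

```python
import math

def questions_needed(n_states):
    """Yes/no questions needed to identify one of n equally likely states."""
    return math.ceil(math.log2(n_states))

print(questions_needed(2))  # 1 question: a fair coin flip carries 1 bit
print(questions_needed(8))  # 3 questions: halve the candidates three times
```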
Cite advantages of information theory (5)
- Model-free (no need to hypothesize a specific structure for the interactions between variables in a data set in order to use information theory).
- You can use any mixture of data types.
- Detects linear and nonlinear interactions.
- Multivariate (possesses several metrics designed to quantify the behavior of systems with any number of variables).
- Results in bits (facilitates straightforward comparison).
What can information theory tell you?
It can quantify the uncertainty of one or more variables, as well as the influence of one or more variables on one or more other variables.
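The standard metric for such influence is mutual information, I(X;Y) = H(X) + H(Y) - H(X,Y): the reduction in uncertainty about X gained by observing Y. A minimal sketch, assuming paired discrete samples (function names and data are illustrative):

```python
import math
from collections import Counter

def h(counts, n):
    """Entropy, in bits, from a Counter of outcome counts over n samples."""
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for paired discrete samples."""
    n = len(xs)
    return h(Counter(xs), n) + h(Counter(ys), n) - h(Counter(zip(xs, ys)), n)

# Here Y copies X exactly, so observing Y removes all uncertainty about X:
# I(X;Y) = H(X) = 1 bit.
xs = [0, 0, 1, 1]
ys = [0, 0, 1, 1]
print(mutual_information(xs, ys))  # 1.0
```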
What is the main limitation of information theory analysis?
It can't produce a model that describes how a system works; it can only be used to restrict the space of possible models.
Define what a probability distribution is
A distribution that describes the likelihood of the possible outcomes of a random variable or group of variables.
How is a probability distribution denoted?
p(A)
Is p(A) discrete or continuous?
It can be either.
To what value must the probabilities of a discrete distribution sum, and the density of a continuous distribution integrate?
1
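Both normalization conditions can be checked numerically. A quick sketch (the example distributions are illustrative; the integral is approximated by a Riemann sum):

```python
import math

# Discrete: the probabilities of the possible states must sum to 1
fair_die = [1 / 6] * 6
assert math.isclose(sum(fair_die), 1.0)

# Continuous: the density must integrate to 1 (uniform density on [0, 2])
dx = 0.001
density = lambda x: 0.5 if 0 <= x <= 2 else 0.0
riemann_sum = sum(density(i * dx) * dx for i in range(int(2 / dx)))
assert math.isclose(riemann_sum, 1.0, rel_tol=1e-3)
```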
With what type of distribution can we describe a system of more than one variable?
A joint probability distribution, written p(c1, c2).
If the variables are independent, it factorizes: p(c1, c2) = p(c1)p(c2) (e.g., two independent coins).
What is a joint probability distribution?
A distribution that describes a system with more than one variable.
For independent variables, p(c1, c2) = p(c1)p(c2).
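A minimal sketch of this for two independent fair coins, verifying that the joint distribution built from the factorization is still normalized (variable names are illustrative):

```python
import math
from itertools import product

p_c1 = {"H": 0.5, "T": 0.5}  # marginal distribution of coin 1
p_c2 = {"H": 0.5, "T": 0.5}  # marginal distribution of coin 2

# Independence lets the joint factorize: p(c1, c2) = p(c1)p(c2)
p_joint = {(a, b): p_c1[a] * p_c2[b] for a, b in product(p_c1, p_c2)}

print(p_joint)  # each of the 4 joint outcomes has probability 0.25
assert math.isclose(sum(p_joint.values()), 1.0)  # joint still sums to 1
```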