Information Theory in Neuroscience Flashcards

1
Q

What is the aim of information theory?

A

It helps us understand how to efficiently encode, transmit, and decode data.

2
Q

Define entropy

A

A measure of uncertainty or unpredictability in a set of possible outcomes.

3
Q

Define Information

A

The reduction of uncertainty: when a message is received, it provides information by reducing the possible unknowns.

4
Q

Define “Compression”

A

Technique to represent information with fewer bits by eliminating redundancy.

5
Q

Define “Channel capacity”

A

The maximum rate at which information can be transmitted over a communication channel with an arbitrarily small probability of error.

6
Q

Define “Error correction”

A

Methods to detect and correct errors in data transmission or storage.

7
Q

In which unit is information measured?

A

In bits.

8
Q

How much information is contained in a heads-or-tails (fair coin flip) experiment?

A

1 bit (50% chance of heads, 50% chance of tails: H = -(0.5 log2 0.5) - (0.5 log2 0.5) = 0.5 + 0.5 = 1)
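As a sketch of where that 1 bit comes from: the card does not state the formula, but this is the standard Shannon entropy, H(X) = -∑ p(x) log2 p(x); for a fair coin each outcome contributes 0.5 bits.

```python
import math

# Shannon entropy in bits: H = -sum(p * log2(p)) over the outcomes.
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin -> 1.0 bit
print(entropy([0.9, 0.1]))  # biased coin -> ~0.47 bits (less uncertain)
```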

9
Q

Define what “bits” refers to in information theory

A

The average number of yes/no questions required to ascertain the value of a variable.

10
Q

Cite advantages of information theory (5)

A

- Model-free (no need to hypothesize a specific structure for the interactions between variables in a data set).
- Can use any mixture of data types.
- Detects linear and non-linear interactions.
- Multivariate (possesses several metrics designed to quantify the behavior of systems with any number of variables).
- Results in bits (facilitates straightforward comparison).

11
Q

What can information theory tell you?

A

It can quantify the uncertainty of one or more variables, as well as the influence of one or more variables on one or more other variables.

12
Q

What is the main limitation of information theory analysis?

A

It can't produce a model that describes how a system works; it can only be used to restrict the space of possible models.

13
Q

Define a probability distribution

A

A distribution that describes the likelihood of certain outcomes of a random variable or group of variables.

14
Q

How is a probability distribution denoted?

A

p(A)

15
Q

Is p(A) discrete or continuous?

A

It can be either.

16
Q

What must the sum over possible states (for a discrete probability distribution) or the integral over possible values (for a continuous one) equal?

A

1

17
Q

With what type of distribution can we describe a system of more than one variable?

A

A joint probability distribution.

For two independent variables (e.g., two independent coins), it factorizes as:
p(c1, c2) = p(c1)p(c2)
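A minimal sketch of this product rule (the coin values are just an illustration):

```python
# For two independent coins, the joint distribution is the product of the marginals.
p_c1 = {"H": 0.5, "T": 0.5}
p_c2 = {"H": 0.5, "T": 0.5}

joint = {(a, b): p_c1[a] * p_c2[b] for a in p_c1 for b in p_c2}
print(joint)  # all four joint states ('H','H'), ('H','T'), ... have probability 0.25
```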

18
Q

What is a joint probability distribution?

A

A distribution that describes a system with more than one variable. For independent variables it factorizes:

p(c1, c2) = p(c1)p(c2)

19
Q

What is a marginal probability distribution?

A

Represents the likelihood for the outcomes of a subset of variables in the joint distribution.

It describes the probabilities of one or more variables while ignoring (or marginalizing over) the others.

20
Q

How is the marginal probability distribution calculated?

A

By summing across certain variables in a joint distribution:

p(c1) = ∑_{c2} p(c1, c2)
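A minimal sketch of that sum, reusing the two-independent-coins joint distribution as an example:

```python
# Marginalize p(c1, c2) over c2: p(c1) = sum over c2 of p(c1, c2).
joint = {("H", "H"): 0.25, ("H", "T"): 0.25, ("T", "H"): 0.25, ("T", "T"): 0.25}

p_c1 = {}
for (c1, c2), p in joint.items():
    p_c1[c1] = p_c1.get(c1, 0.0) + p

print(p_c1)  # {'H': 0.5, 'T': 0.5}
```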

21
Q

If we calculate the probability distribution of a system with two magically linked (dependent) variables, will the distribution be uniform?

A

No

22
Q

What is a conditional probability distribution?

A

It is another way to represent probabilities in systems with multiple variables. It describes the likelihood of obtaining outcomes of certain variables assuming that the other variables are known.

23
Q

How is the conditional probability denoted?

A

It is noted as:
p(A|B) (“the probability of A given B”).
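The card gives only the notation; the defining relation p(A|B) = p(A, B) / p(B) is standard, though not stated here. A sketch with made-up weather probabilities:

```python
# Conditional probability from a joint distribution: p(A|B) = p(A, B) / p(B).
joint = {("rain", "clouds"): 0.30, ("dry", "clouds"): 0.20,
         ("rain", "clear"): 0.05, ("dry", "clear"): 0.45}

p_clouds = sum(p for (a, b), p in joint.items() if b == "clouds")  # p(B) = 0.5
print(joint[("rain", "clouds")] / p_clouds)  # p(rain | clouds) = 0.6
```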

24
Q

What is data binning?

A

Grouping multiple observations of some variables (across time or trials). It is a preprocessing technique used to reduce the effect of minor observation errors.

E.g., when measuring a voltage we could bin the data into three categories:
> 1, < -1, or between -1 and 1.

25
Q

How is the probability of a state estimated?

A

The total number of observations of that state divided by the total number of observations for all states.

s: state
N(s): number of experimental observations of state s
N_obs: total number of experimental observations

p(s) = N(s) / N_obs
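A minimal sketch of this estimator on a made-up list of observed states:

```python
from collections import Counter

# Estimate p(s) = N(s) / N_obs by counting observations of each state.
observations = ["up", "up", "down", "up", "down", "up", "up", "down"]
counts = Counter(observations)
n_obs = len(observations)

p = {state: n / n_obs for state, n in counts.items()}
print(p)  # {'up': 0.625, 'down': 0.375}
```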

26
Q

What guideline can we follow to know how many observations to perform to adequately sample the space of possible joint states?

A
More than 10 observations per possible state is ideal.
27
Q

What is meant by the assumption of stationarity?

A

The assumption that every observation is sampled from the same underlying probability distribution.

28
Q

What is discretization?

A

Converting continuous data into discrete data.

29
Q

What are the two main binning procedures?

A

Uniform width and uniform count.

30
Q

What is the principle of uniform width binning?

A

Dividing the total range of the data into N_bins equal-WIDTH bins.

31
Q

What is the principle of uniform count binning?

A

Dividing the data into N_bins equal-COUNT bins: the probability of falling into any one bin is the same as for the others.
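A sketch contrasting the two procedures from cards 30 and 31 (the data and bin count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=1000)  # continuous values to discretize
n_bins = 4

# Uniform width: edges equally spaced across the data range.
width_edges = np.linspace(data.min(), data.max(), n_bins + 1)

# Uniform count: edges at quantiles, so each bin holds ~the same number of points.
count_edges = np.quantile(data, np.linspace(0, 1, n_bins + 1))

print(np.histogram(data, width_edges)[0])  # uneven counts per bin
print(np.histogram(data, count_edges)[0])  # ~250 points per bin
```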

32
Q

What is parameter fishing?

A

Testing parameters until a statistically significant difference appears. It leads to false positives and misleading conclusions. It is a form of p-hacking: manipulating the analysis to get desirable findings.

33
Q

What is a null model?

A

The default model used in scientific experiments to represent what you would expect to happen if nothing special is going on. Used as a comparison.

34
Q

What are kernel-based or binless strategies?

A

Alternative methods for handling continuous values without discretizing them.

35
Q

How is entropy denoted?

A

H(X)

36
Q

What is H(X)?

A

Entropy

37
Q

Does the uniform count binning procedure minimize or maximize entropy?

A

It maximizes the entropy.

38
Q

How many bits is a byte?

A

8

39
Q

What is joint entropy? How is it denoted?

A

H(X, Y): the entropy of a system with more than one variable.

40
Q

What is H(X, Y)?

A

Joint entropy

41
Q

What is conditional entropy? How is it denoted?

A

It quantifies the average uncertainty in a variable given the state of another variable.
H(X | Y)

42
Q

Express joint entropy as a function of entropy and conditional entropy.

A

H(X, Y) = H(X) + H(Y | X)
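A numerical sanity check of this chain rule on a small made-up joint distribution, computing H(Y|X) directly as ∑_x p(x) H(Y|X=x):

```python
import math

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(dist):
    # Shannon entropy in bits of a dict of probabilities.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal p(x).
p_x = {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p

# H(Y|X) = sum over x of p(x) * H(Y | X = x).
h_y_given_x = sum(
    px * H({y: p / px for (x, y), p in joint.items() if x == x0})
    for x0, px in p_x.items()
)

print(math.isclose(H(joint), H(p_x) + h_y_given_x))  # True
```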

43
Q

Explain this equation:
H(X) = H(X|Y) + I(X;Y)

A

The entropy of variable X is the sum of the conditional entropy (the uncertainty in X given knowledge of Y) and the information provided by Y about X (= I(X;Y), the mutual information).
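A sketch checking this identity numerically, reusing the same made-up joint distribution and the standard relations I(X;Y) = H(X) + H(Y) - H(X,Y) and H(X|Y) = H(X,Y) - H(Y):

```python
import math

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(dist):
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, axis):
    out = {}
    for state, p in joint.items():
        out[state[axis]] = out.get(state[axis], 0.0) + p
    return out

h_x, h_y, h_xy = H(marginal(joint, 0)), H(marginal(joint, 1)), H(joint)
mi = h_x + h_y - h_xy        # I(X;Y), the mutual information
h_x_given_y = h_xy - h_y     # H(X|Y) via the chain rule

print(math.isclose(h_x, h_x_given_y + mi))  # True: H(X) = H(X|Y) + I(X;Y)
```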

44
Q

In what unit is I(X;Y) measured, and what is it?

A

Bits.
It is the information provided by Y about X (the mutual information).

45
Q

What is p(x, y) if the two variables are independent?

A

p(x, y) = p(x)p(y)