Advanced Statistical Physics Flashcards

1
Q

What are the axioms of probability theory?

A

The probability of an event happening is real and non-negative.

The probability of the entire sample space (something happening) is 1.

The probability that one of several mutually exclusive events happens is equal to the sum of their individual probabilities.

2
Q

What is P(A∪B)?

A

P(A) + P(B) - P(A∩B)

3
Q

What is the conditional probability P(A|B)?

A

P(A∩B) / P(B)

4
Q

What is Bayes' theorem for P(A|B)?

A

P(B|A) P(A) / P(B)

5
Q

What properties do the probabilities of independent events A and B have?

A

P(A∩B) = P(A)P(B)
P(A|B) = P(A)

6
Q

What is the law of total probability?

A

P(A) = sum_i [ P(A|B_i) P(B_i) ]
( mutually exclusive events B_i )
( the union of all B_i is the complete sample space )

7
Q

What is the definition of the cumulative distribution F(x)?
How is the p.d.f. related to the cumulative distribution?

A

The integral of the p.d.f. from -inf to x

The p.d.f. is the derivative w.r.t. x of the cumulative distribution.

8
Q

What is the expression for variance in terms of moments?

A

var(x) = <x^2> - <x>^2
(2nd moment - square of the 1st moment)

9
Q

What is the characteristic function of a distribution?
How can we recover the p.d.f.?

A

phi(k) = <e^(ikx)> (the expectation value of e^(ikx))

p.d.f. = 1/2pi integral[ phi(k) e^(-ikx) dk ]

10
Q

How can all moments be generated from the characteristic function?

A

Taylor expand e^(ikx) to express the characteristic function in terms of moments.
Recover moments by differentiating w.r.t. k:
<x^n> = (-i)^n d^n(phi)/dk^n, evaluated at k=0

11
Q

What is the binomial theorem?

A

(a + b)^n = sum_{k=0}^{n} (n choose k) a^k b^(n-k)

WHERE CAN WE ALSO APPLY THIS? e.g. expanding d(W^n) when handling Ito stochastic integrals.

12
Q

What is the formula for a geometric series?

A

sum_{n=0}^{N-1} r^n = (1 - r^N) / (1 - r)

Infinite case (|r| < 1): sum_{n=0}^{inf} r^n = 1 / (1 - r)

13
Q

How can we calculate the marginal of a multivariate p.d.f.?

A

Integrate w.r.t. the variables we are not interested in.

14
Q

What property do multivariate p.d.f.s have for independent variables?

A

The p.d.f. is factorisable.

15
Q

How is the covariance of two variables defined?

What about correlation?

A

cov(X, Y) = <XY> - <X><Y>

cor = cov / (sig_X sig_Y)

16
Q

What is the range of values possible for correlation?

A

-1 <= cor <= 1

17
Q

What is the convolution theorem for a variable z that is the sum of variables x_i?

A

The p.d.f. for z is the convolution of the p.d.f.s for the variables x_i.

The characteristic function for z is the product of characteristic functions for x_i.

18
Q

What is the law of large numbers?

How can we prove it?

A

For x_i i.i.d. random variables,
Z = 1/N sum[x_i] -> mean of the x_i distribution (as N -> inf)

Proof sketch:
-Expand the characteristic function for x_i
-Use the convolution theorem
-Use the limit form of e
-Inverse FT
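The law of large numbers is easy to see numerically. A minimal sketch (assuming uniform draws on [0, 1], whose mean is 0.5 — any i.i.d. distribution works):

```python
import random

# Law of large numbers: the sample mean of N i.i.d. draws converges to the
# distribution mean as N grows.  Uniform[0, 1] draws have mean 0.5.
random.seed(0)

def sample_mean(n):
    return sum(random.random() for _ in range(n)) / n

# The deviation from the true mean shrinks as N grows.
print(abs(sample_mean(100) - 0.5))
print(abs(sample_mean(100_000) - 0.5))
```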

19
Q

What is the central limit theorem?

A

For x_i i.i.d. random variables,
Z = 1/N sum[x_i],
N -> very large:

the p.d.f. for Z approaches a gaussian with mean mu (the mean of x) and variance var(x)/N.
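A numeric sketch of the variance scaling (assuming uniform[0, 1] draws, for which var(x) = 1/12 — an illustrative choice):

```python
import random
import statistics

# Central limit theorem: means of N i.i.d. draws cluster into a narrow
# distribution with mean mu and variance var(x)/N.
random.seed(1)
N = 50
means = [sum(random.random() for _ in range(N)) / N for _ in range(20_000)]

print(statistics.mean(means))          # close to mu = 0.5
print(statistics.variance(means) * N)  # close to var(x) = 1/12 ~ 0.0833
```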

20
Q

What is the form of the gaussian distribution?

A

1/sqrt(2pi var) * e^ - [(x - mu)^2 / (2 var)]

21
Q

What are the properties of a stochastic matrix?

A

Elements are transition probabilities q_fi (non-negative)
Columns sum to 1 (preserves normalisation)
Has a unit eigenvalue corresponding to the stationary state (all other eigenvalues have magnitude less than 1)

22
Q

What is the condition for detailed balance?
What is an equivalent property?

A

q_mn P_n = q_nm P_m
Equivalent to reversibility of a Markov process

23
Q

Give an example of how Markov chain Monte Carlo works.

A

e.g. the Ising model: the average energy is a sum over all states of the Hamiltonian weighted by the Boltzmann distribution.

-Construct a Markov chain over states whose stationary state is the Boltzmann distribution.
-Apply the Markov chain and estimate the average energy as the mean of the Hamiltonian over the resulting states.
-Far fewer terms are required in the sum.

24
Q

In the example of the Metropolis-Hastings algorithm, how is a Markov chain constructed with a Boltzmann stationary state?

A

Using the detailed balance condition with Boltzmann state probabilities.
-This condition can be ensured by choosing the acceptance probability
min[1, P_x' / P_x]

The state x' is state x with a random spin flipped.
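A minimal Metropolis sketch for a small Ising chain (10 spins, periodic boundaries, J = 1, no field — all illustrative choices). With Boltzmann weights, min[1, P_x'/P_x] reduces to min[1, e^(-beta*dE)]:

```python
import math
import random

random.seed(2)

def energy(spins):
    # Ising chain Hamiltonian with periodic boundaries, J = 1
    return -sum(s * spins[(i + 1) % len(spins)] for i, s in enumerate(spins))

def metropolis_step(spins, beta):
    i = random.randrange(len(spins))      # propose: flip one random spin
    trial = spins[:]
    trial[i] = -trial[i]
    dE = energy(trial) - energy(spins)
    if dE <= 0 or random.random() < math.exp(-beta * dE):
        return trial                      # accept with prob min[1, e^(-beta*dE)]
    return spins                          # reject: keep the old state

spins = [random.choice([-1, 1]) for _ in range(10)]
samples = []
for step in range(5000):
    spins = metropolis_step(spins, beta=0.5)
    if step >= 1000:                      # discard burn-in, then accumulate
        samples.append(energy(spins))
print(sum(samples) / len(samples))        # Monte Carlo estimate of <E>
```

Recomputing the full energy per step is wasteful but keeps the sketch short; in practice only the local dE around the flipped spin is needed.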

25
Q

What is the Boltzmann distribution?

A

P = 1/Z e^(-beta * energy)

Z is the sum of the exponential term over all states.

For the Ising model, the energy is the Hamiltonian applied to the state.

26
Q

What is the master equation?

A

Rate of change of P_n =
rate of probability influx to state n
- rate of probability outflow from state n

27
Q

What is the Poisson distribution?

A

P(n) = lambda^n e^(-lambda) / n!

28
Q

What property shows diffusion?

A

Variance proportional to time

29
Q

What is the Gillespie Algorithm?
(continuous time Monte-Carlo)

A

Increment time by a random, exponentially distributed interval.
Choose the next state using a random number (partition according to the cumulative sum of transition rates).

*check notes
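The two steps above can be sketched for a simple birth-death process (constant birth rate and per-individual death rate — hypothetical parameters for illustration; the stationary mean is birth/death):

```python
import random

random.seed(3)

def gillespie(n, t_end, birth=10.0, death=1.0):
    t, history = 0.0, []
    while t < t_end:
        rates = [birth, death * n]        # transition rates out of state n
        total = sum(rates)
        t += random.expovariate(total)    # exponentially distributed waiting time
        r = random.uniform(0, total)      # pick next reaction by cumulative rate
        if r < rates[0]:
            n += 1                        # birth
        else:
            n -= 1                        # death
        history.append((t, n))
    return history

history = gillespie(n=0, t_end=200.0)
late = [n for t, n in history if t > 50.0]   # discard the transient
print(sum(late) / len(late))                 # roughly birth/death = 10
```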

30
Q

What is the general master equation for a non-linear birth-death process?

A

rate of change of <n> =
<W+> - <W->

(W+ and W- are the birth and death transition rates)

31
Q

What is the deterministic approximation?

A

Instead of considering <transition rates(n)>,
consider transition rates(<n>).

-valid only for small fluctuations around <n>

32
Q

What is the generating function?

A

F = sum_n [ z^n P_n]

-can Taylor expand in z an expression for F obtained via a DE
-compare to the sum expression to extract P_n

*usually the first and second derivatives of F w.r.t. z are enough.
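As a numeric illustration of reading P_n off from F, assume a Poisson distribution, for which F(z) = e^(lam(z-1)) (an illustrative choice, not necessarily the system in the notes). Coefficients of z^n can be extracted by averaging F over the unit circle:

```python
import cmath
import math

lam = 2.0

def F(z):
    return cmath.exp(lam * (z - 1))   # Poisson generating function (illustrative)

def p_n(n, M=512):
    # z^n Taylor coefficient of F via a discrete unit-circle contour sum
    s = sum(F(cmath.exp(2j * math.pi * k / M)) * cmath.exp(-2j * math.pi * k * n / M)
            for k in range(M))
    return (s / M).real

for n in range(5):
    direct = lam ** n * math.exp(-lam) / math.factorial(n)
    print(n, round(p_n(n), 6), round(direct, 6))   # the two columns agree
```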

33
Q

How can we express a conditional probability for a continuous time+space process?

A

P(point | previous path) =
P(whole path) / P(previous path)

34
Q

How can we express the Markovian property for a continuous time+space process?

A

P(point | previous path) =
P(point | previous point only)

35
Q

What is the Lindeberg condition for continuity?

A

For all eps > 0:
lim_{dt -> 0} (1/dt) P( |x(t + dt) - x(t)| > eps ) = 0

(the probability of a jump larger than eps vanishes faster than dt: no jumps)

36
Q

What does the differential Chapman-Kolmogorov equation describe and what terms are involved?
How do we get to the Fokker-Planck equation?

A

Describes time evolution of prob(point | initial point)
-drift term
-diffusion term
-jump term

*assuming no jumps (satisfy Lindeberg condition) gives the Fokker-Planck equation.

37
Q

What is the Fokker-Planck equation?

A

partial_t f = - partial_x [ A(x, t) f ] + 1/2 partial_x^2 [ B(x, t) f ]

(A: drift, B: diffusion)

38
Q

What is the condition on F-P equation solutions given infinite bounds?

A

f(x, t) -> 0 at infinite bounds.

39
Q

What is the form of probability flux for the F-P equation?

A

Consider the continuity equation:
partial_t (f) + partial_x (J) = 0

so J = A f - 1/2 partial_x (B f)

40
Q

What are absorbing b.c’s for the F-P equation?

A

f = 0 at both bounds

41
Q

What are reflecting b.c’s for the F-P equation?

A

J(a, t) = J(b, t) = 0

42
Q

What are periodic b.c’s for the F-P equation?

A

f same at both bounds
J same at both bounds

43
Q

What is the Wiener process?

A

A Markovian process described by the F-P equation with zero drift and constant diffusion.

44
Q

How can we solve for the Wiener process?

What do we get?

A

Use the characteristic function.

Gives a gaussian with mean equal to the initial point and diffusive variance.
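This can be checked numerically. A minimal sketch (standard convention var(W(t)) = t, zero drift; the start point, time, and step counts are illustrative):

```python
import random
import statistics

# Wiener process: build paths from independent gaussian increments with
# variance dt; across many paths the endpoint is gaussian with mean x0
# and variance t.
random.seed(4)
x0, t_end, steps, paths = 1.0, 2.0, 50, 10_000
dt = t_end / steps

endpoints = []
for _ in range(paths):
    w = x0
    for _ in range(steps):
        w += random.gauss(0.0, dt ** 0.5)   # independent increment ~ N(0, dt)
    endpoints.append(w)

print(statistics.mean(endpoints))      # close to x0 = 1.0
print(statistics.variance(endpoints))  # close to t_end = 2.0
```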

45
Q

What does the non-differentiability of sample paths refer to for the Wiener process?

A

The Wiener process is continuous but does not have a derivative.
This can be shown by considering the fact that:
<(change in W)^2> = delta t
so (change in W)/(delta t) ~ delta t^(-1/2), which diverges as delta t -> 0.

46
Q

What are the general concepts of the Langevin Approach to stochastic processes?

A

Consider path realisations instead of conditional probabilities.
-write a DE including a random force term
->this term has no memory and is considered white noise (the differential of the Wiener process)

47
Q

What is the issue with the differential Langevin approach?

A

The Wiener process derivative does not exist: use an integral equation instead.

-> we now need to define stochastic integration, as we have an integral w.r.t. the Wiener process.

48
Q

How can we convert between a Langevin-Ito stochastic DE and the FP equation?

A

The Ito SDE dx = A(x, t) dt + sqrt(B(x, t)) dW corresponds to the F-P equation with drift A and diffusion B.

49
Q

What is the formula for the Gaussian integral with infinite limits:

integral[ e^(-ax^2 + bx) dx ]

A

integral[-inf, inf] e^(-ax^2 + bx) dx = sqrt(pi/a) e^(b^2 / 4a)

50
Q

How can we approach a polynomial ito stochastic integral?

A

Consider d(W^n) as (W + dW)^n - W^n (binomial expansion).

Keep the (dW)^2 = dt and dW terms.

Rearrange to get the integral w.r.t. dW in terms of integrals w.r.t. d(W^n) and dt.
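For n = 2 the procedure gives integral[W dW] = (W_T^2 - T)/2. A sketch checking this: the discrete Ito sum sum_i[W_i dW_i] satisfies the exact algebraic identity (W_T^2 - W_0^2 - sum_i[dW_i^2])/2 for any path (telescoping of W^2), and the quadratic variation sum[dW^2] -> T as dt -> 0:

```python
import random

random.seed(5)
steps, T = 100_000, 1.0
dt = T / steps

w, ito_sum, quad_var = 0.0, 0.0, 0.0
for _ in range(steps):
    dw = random.gauss(0.0, dt ** 0.5)
    ito_sum += w * dw      # Ito convention: integrand at the left endpoint
    quad_var += dw * dw    # quadratic variation, -> T as dt -> 0
    w += dw

print(ito_sum - (w * w - quad_var) / 2)   # zero up to float rounding
print(quad_var)                           # close to T = 1.0
```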

51
Q

How can we solve growing network problems?

A

Consider the probability of a new node being connected to a node of degree k.

Then consider the increase/decrease in the number of nodes of degree k at each time step (a separate expression is needed for the k = 1 case).

In the stationary limit the probabilities do not change between timesteps; use this to solve for the stationary probabilities.
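The steps above can be sketched for the standard Barabasi-Albert case (each new node attaches one link preferentially by degree — an assumed example). The stationary recursion p_1 = 2/3, p_k = p_(k-1)*(k-1)/(k+2) reproduces the known result p_k = 4/(k(k+1)(k+2)):

```python
def stationary_degrees(kmax):
    p = {1: 2.0 / 3.0}                       # separate expression for k = 1
    for k in range(2, kmax + 1):
        p[k] = p[k - 1] * (k - 1) / (k + 2)  # stationary balance at degree k
    return p

p = stationary_degrees(20)
for k in (1, 2, 5, 10):
    closed_form = 4.0 / (k * (k + 1) * (k + 2))
    print(k, p[k], closed_form)              # the two columns agree
```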

52
Q

What are the key steps in solving fixation/extinction problems?

A

Consider phi_i, the probability that fixation is reached given a state with i individuals.
Construct an equation relating these probabilities for neighbouring i.

Solve this equation using y_i, the difference between neighbouring probabilities.
Also define gamma_i as the ratio of transition rates.
Consider the complete (telescoping) sum of the y_i's.
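The steps above lead to the standard closed form phi_i = S(i-1)/S(N-1), where S(m) = 1 + sum of the running products of the gamma_j. A sketch (the gamma values are hypothetical; for neutral dynamics, all gamma_j = 1, it reduces to phi_i = i/N):

```python
def fixation_prob(i, N, gamma):
    # gamma[j] = ratio of transition rates T-_j / T+_j for j = 1 .. N-1
    def S(m):
        # 1 + sum_{k=1}^{m} prod_{j<=k} gamma_j (telescoping sum of the y's)
        total, prod = 1.0, 1.0
        for k in range(1, m + 1):
            prod *= gamma[k]
            total += prod
        return total
    return S(i - 1) / S(N - 1)

N = 10
neutral = {j: 1.0 for j in range(1, N)}   # hypothetical: all rate ratios 1
for i in (1, 3, 7):
    print(i, fixation_prob(i, N, neutral))   # neutral case gives i / N
```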