Week 2 Flashcards
Hastings-Metropolis Algorithm
- Pick a starting state k∈S
- Let n=0 and X(0)=k
- Generate a random X such that P(X=j) = q(X(n), j), and a U~U(0,1)
- If U < alpha(X(n), X), set X(n+1) = X; otherwise set X(n+1) = X(n)
- Let n = n+1 and repeat from step 3
Equation for Acceptance Probabilities (Alpha) in Hastings-Metropolis Algorithm
alpha(i,j) = min{ [fx(j)q(j,i)] / [fx(i)q(i,j)], 1 }
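The two cards above can be sketched in Python. The standard-normal target and Gaussian random-walk proposal below are illustrative assumptions, not part of the card:

```python
import random
import math

def metropolis_hastings(f, q_sample, q_density, x0, n_steps):
    """Sample from an (unnormalized) density f with Metropolis-Hastings.
    q_sample(i) draws a proposal j from q(i, .); q_density(i, j) = q(i, j)."""
    x = x0
    chain = []
    for _ in range(n_steps):
        y = q_sample(x)                  # generate X with P(X=j) = q(X(n), j)
        # alpha(i,j) = min{ f(j)q(j,i) / [f(i)q(i,j)], 1 }
        alpha = min(f(y) * q_density(y, x) / (f(x) * q_density(x, y)), 1.0)
        if random.random() < alpha:      # U ~ U(0,1): accept if U < alpha
            x = y                        # X(n+1) = X
        chain.append(x)                  # else X(n+1) = X(n)
    return chain

# Assumed example: target is a standard normal; the random-walk proposal
# N(x, 1) is symmetric, so the q-terms in alpha actually cancel here.
random.seed(0)
f = lambda x: math.exp(-x * x / 2)                  # unnormalized N(0,1)
q_sample = lambda x: random.gauss(x, 1.0)
q_density = lambda i, j: math.exp(-(j - i) ** 2 / 2)
chain = metropolis_hastings(f, q_sample, q_density, 0.0, 50_000)
mean = sum(chain) / len(chain)          # should be near 0
```

Note that f only needs to be known up to a constant, since any normalization cancels in the ratio inside alpha.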
General Monte Carlo Algorithm
- Draw a random number
- Process the random number, e.g. through an equation
- Repeat (1) and (2) many times
- Analyze the cumulative result to estimate the non-random quantity of interest
Note: It is stochastic (driven by random sampling) and static (no time evolution is simulated)
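A minimal sketch of this loop, with an assumed example target (the non-random value P(two dice sum to 7) = 1/6):

```python
import random

random.seed(0)
k = 200_000
hits = 0
for _ in range(k):
    roll = random.randint(1, 6) + random.randint(1, 6)  # (1) draw random numbers
    hits += (roll == 7)                                 # (2) process them
# (3)-(4): the cumulative average estimates the non-random probability 1/6
p_hat = hits / k
```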
One-Dimensional Monte Carlo Algorithm
to estimate the integral of g between 0 and 1
θ = E[g(U)], U~U(0,1); estimate with θ^(k) = (1/k) * Σ g(U(i)), i=1..k
- unbiased: E[θ^(k)] = θ
- consistent: θ^(k) -> θ as k->∞
Follows from the Strong Law of Large Numbers
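A short sketch of this estimator; the integrand g(x) = e^x (so θ = e - 1) is an assumed example:

```python
import random
import math

random.seed(1)
g = math.exp                     # assumed example: ∫ e^x dx over (0,1) = e - 1
k = 100_000
# θ^(k) = (1/k) * Σ g(U(i)) with U(i) ~ U(0,1)
theta_hat = sum(g(random.random()) for _ in range(k)) / k
```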
One-Dimensional Monte Carlo Algorithm Extension
to estimate the integral of g between a and b
Still take an expectation, but now θ = (b-a) * E[g(X)]
with X~U(a, b)
Transform: Y~U(0,1) -> X = (b-a)Y + a
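A sketch of the transformed estimator; the concrete integrand and interval are assumed for illustration (∫ x² dx from 2 to 5 = 39):

```python
import random

random.seed(2)
a, b = 2.0, 5.0
g = lambda x: x * x              # assumed example integrand
k = 100_000
total = 0.0
for _ in range(k):
    y = random.random()          # Y ~ U(0,1)
    x = (b - a) * y + a          # transform to X ~ U(a, b)
    total += g(x)
theta_hat = (b - a) * total / k  # θ = (b-a) * E[g(X)]
```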
Monte Carlo Approximation of π
Inscribe a circle of radius 1 in the square [-1,1] x [-1,1]
π = 4*(# darts hitting circle / # darts hitting square)
- Generate (X,Y), X~U(-1,1) and Y~U(-1,1)
- Check if point in circle: I = 1 if X^2 + Y^2 ≤ 1, I = 0 otherwise
π = 4E(I)
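The dart-board estimator above as a short sketch:

```python
import random

random.seed(3)
k = 200_000
inside = 0
for _ in range(k):
    x = random.uniform(-1, 1)
    y = random.uniform(-1, 1)
    inside += (x * x + y * y <= 1)   # I = 1 if the dart lands in the circle
pi_hat = 4 * inside / k              # π = 4 * E(I)
```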
Multi-Dimensional Monte Carlo Algorithm
one integral of a multidimensional function over (0,1)^n
θ = E[g(U(1),…,U(n))], U(i) ~ iid U(0,1)
Note: the joint density f(u(1),…,u(n)) = fu1(u1)…fun(un) = 1 on (0,1)^n
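A two-dimensional sketch; the integrand is an assumed example, θ = ∫∫ (u1 + u2)² du1 du2 over (0,1)² = 7/6:

```python
import random

random.seed(4)
g = lambda u1, u2: (u1 + u2) ** 2    # assumed example integrand
k = 200_000
# θ^(k) = (1/k) * Σ g(U1(i), U2(i)) with independent U(0,1) coordinates
theta_hat = sum(g(random.random(), random.random()) for _ in range(k)) / k
```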
Markov Chain Monte Carlo Idea
Want to sample from or find the value of probability distribution
We perform a random walk: from a starting point, randomly pick a nearby point and evaluate its probability; if it is higher than at the current point, move there, otherwise move with probability equal to the ratio of the two.
If we repeat enough times, we visit every point with frequency proportional to its probability
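The random-walk idea on a tiny discrete state space; the five states and target weights are an assumed example (symmetric ±1 proposal with wraparound):

```python
import random

random.seed(5)
w = [1, 2, 3, 2, 1]                  # unnormalized target probabilities
x = 0
n_steps = 200_000
counts = [0] * len(w)
for _ in range(n_steps):
    y = (x + random.choice([-1, 1])) % len(w)   # pick a nearby point
    if random.random() < min(w[y] / w[x], 1):   # move if more probable,
        x = y                                   # else with prob w(y)/w(x)
    counts[x] += 1
freqs = [c / n_steps for c in counts]   # visit frequencies ≈ w / sum(w)
```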
When is the Hasting Sampler considered a Metropolis Sampler?
When the transition matrix Q is symmetric
Q = Q^T, i.e. q(i,j) = q(j,i) for all i, j
Equation for Transition Probability Matrix Q’
q’(i,j) = q(i,j)*alpha(i,j) for i≠j
Q’ is a transition matrix, so each row must sum to 1; fill in the diagonal: q’(i,i) = 1 − Σ_{j≠i} q’(i,j)
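Building Q’ from a proposal matrix Q and target f can be sketched directly; the 3-state target and proposal below are assumed for illustration:

```python
f = [0.2, 0.5, 0.3]                      # assumed target probabilities
Q = [[0.50, 0.25, 0.25],                 # assumed proposal transition matrix
     [0.25, 0.50, 0.25],
     [0.25, 0.25, 0.50]]
n = len(f)
Qp = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(n):
        if i != j:
            # alpha(i,j) = min{ f(j)q(j,i) / [f(i)q(i,j)], 1 }
            alpha = min(f[j] * Q[j][i] / (f[i] * Q[i][j]), 1)
            Qp[i][j] = Q[i][j] * alpha   # q'(i,j) = q(i,j)*alpha(i,j)
    # fill the diagonal so each row sums to 1
    Qp[i][i] = 1 - sum(Qp[i][j] for j in range(n) if j != i)
```

The resulting Q’ satisfies detailed balance with respect to f, i.e. f(i)q’(i,j) = f(j)q’(j,i), which is why f is its stationary distribution.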