Lecture 9 - Bayesian inference: approximation & sampling Flashcards

1
Q

What is Bayesian inference?

A

It involves updating the probability of a hypothesis as more evidence becomes available, using Bayes’ theorem.

2
Q

What is the Laplace approximation?

A

It approximates a complicated posterior distribution with a simpler multivariate Gaussian N(μ, Σ).

3
Q

What are the key parameters in the Laplace approximation?

A
  1. μ is the mode of the posterior (the maximum of g(w)).
  2. Σ = −H⁻¹, where H is the Hessian matrix of the log-posterior at ŵ.
4
Q

How is the mean πœ‡ of the Gaussian chosen in the Laplace approximation?

A

The mean πœ‡ is set to the maximum of the posterior distribution (𝑀^ ).

5
Q

How is Σ determined in the Laplace approximation?

A

Ξ£ =βˆ’π»βˆ’1, where 𝐻 is the Hessian matrix (second derivative of the log-posterior) at 𝑀^

6
Q

How does the Laplace approximation perform with the Gamma distribution?

A

It approximates well near the mode ŷ but diverges significantly further away (Page 4, visual example).
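This behaviour can be checked numerically: near the mode the Gamma density and its Laplace Gaussian nearly agree, while several standard deviations out they differ by orders of magnitude. The shape and rate values below are illustrative assumptions:

```python
import numpy as np
from scipy import stats

k, beta = 3.0, 1.0               # illustrative shape and rate
mode = (k - 1) / beta            # maximum of the Gamma log density
var = (k - 1) / beta**2          # -1/H at the mode
gauss = stats.norm(mode, np.sqrt(var))   # Laplace approximation
gamma = stats.gamma(k, scale=1 / beta)   # true density

near, far = mode, mode + 4 * np.sqrt(var)
# Near the mode the two densities agree closely; far away they diverge
print(gamma.pdf(near), gauss.pdf(near))
print(gamma.pdf(far), gauss.pdf(far))
```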

7
Q

What is Monte Carlo sampling?

A

A technique to estimate integrals or expectations by averaging values from random samples.
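A minimal sketch of the idea, using an expectation whose true value is known (the example distribution and function are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo: E[f(x)] ~ (1/S) * sum_s f(x_s), with x_s drawn from p(x).
# Example: E[x^2] under x ~ N(0, 1), which is exactly 1.
samples = rng.standard_normal(100_000)
estimate = np.mean(samples**2)
print(estimate)  # close to 1.0
```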

8
Q

How do you estimate the Bayesian predictive distribution?

A

By sampling w from N(μ, Σ) and averaging P(t_new | w).
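A sketch for a one-dimensional logistic-regression case; the posterior mean, variance, and test input below are hypothetical values, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical Laplace posterior N(mu, sigma2) over a single weight
mu, sigma2 = 1.0, 0.25
x_new = 2.0

# P(t_new = 1 | x_new) ~ (1/S) * sum_s sigmoid(w_s * x_new), w_s ~ N(mu, sigma2)
w_samples = rng.normal(mu, np.sqrt(sigma2), size=50_000)
p_pred = np.mean(sigmoid(w_samples * x_new))

# The Bayesian average is pulled toward 0.5 relative to the
# plug-in estimate sigmoid(mu * x_new), reflecting posterior uncertainty
print(p_pred, sigmoid(mu * x_new))
```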

9
Q

What is the Metropolis-Hastings algorithm?

A

It generates samples from the posterior by proposing and accepting/rejecting steps based on an acceptance ratio.

Mnemonic: β€œPropose, compare, accept (or stay).”

10
Q

How is the acceptance ratio calculated in Metropolis-Hastings?

A

It combines the posterior ratio and the proposal ratio:

r = [ p(w̃_s | data) / p(w_{s−1} | data) ] × [ q(w_{s−1} | w̃_s) / q(w̃_s | w_{s−1}) ]

For a symmetric proposal density the q terms cancel, leaving just the posterior ratio.

11
Q

What does the Metropolis-Hastings algorithm achieve?

A

It generates samples from a posterior distribution even when it cannot be computed analytically.

12
Q

What are common challenges in sampling?

A

High-dimensional spaces require many burn-in samples.

Risk of exploring only local maxima.

High rejection rates without careful proposal densities.

13
Q

What is the β€œburn-in” period in sampling?

A

The initial samples discarded because they may not represent the posterior accurately.

14
Q

What are the main steps of the Metropolis-Hastings algorithm?

A
  1. Propose a new sample w̃_s based on the previous sample w_{s−1}.
  2. Compute the acceptance ratio r.
  3. Accept w̃_s with probability min(r, 1); otherwise, keep w_{s−1}.
15
Q

What are some challenges with the Metropolis-Hastings algorithm?

A

Requires discarding β€œburn-in” samples.

Risk of exploring only local maxima.

May reject most proposals if the proposal distribution is poorly chosen.
