Bayesian Flashcards

1
Q

Bayesian

A

treats parameters as random variables with distributions that represent uncertainty in estimates

2
Q

Prior

A

initial belief before observing data

3
Q

Posterior

A

updated belief after observing data - combines prior and likelihood

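The prior/likelihood/posterior relationship on these cards is Bayes' theorem:

```latex
p(\theta \mid y) \;=\; \frac{p(y \mid \theta)\, p(\theta)}{p(y)}
\;\propto\; \underbrace{p(y \mid \theta)}_{\text{likelihood}} \; \underbrace{p(\theta)}_{\text{prior}}
```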
4
Q

Likelihood

A

probability of observed data given parameters

5
Q

Conjugate Prior

A
  • prior that, when combined with the likelihood, yields a posterior in the same family as the prior
  • simplifies computation and interpretation
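As a concrete instance (illustrative numbers, not from the deck): the Beta prior is conjugate to the binomial likelihood, so the posterior update is just adding counts — a minimal sketch:

```python
# Conjugate update sketch: Beta(a, b) prior on a binomial success
# probability; the posterior stays in the Beta family.
def beta_binomial_update(a, b, successes, failures):
    # Beta prior + binomial likelihood -> Beta posterior (same family)
    return a + successes, b + failures

# Beta(2, 2) prior, then observe 7 successes and 3 failures
a_post, b_post = beta_binomial_update(2, 2, 7, 3)
print(a_post, b_post)  # Beta(9, 5) posterior
```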
6
Q

MCMC

A
  • Markov Chain Monte Carlo
  • generates samples from complex distributions that are difficult to sample from directly
7
Q

Markov Chain

A

the value of the random variable at the next step depends only on the current state, not on the earlier history

8
Q

Monte Carlo

A

random sampling techniques used to approximate the target distribution

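A minimal Monte Carlo sketch (the integrand and sample size are illustrative): approximate an expectation by averaging over random draws.

```python
import random

# Monte Carlo sketch: approximate E[X^2] for X ~ Uniform(0, 1) by
# averaging over random draws (the true value is 1/3).
random.seed(0)
n = 100_000
estimate = sum(random.random() ** 2 for _ in range(n)) / n
# estimate is close to 1/3
```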
9
Q

Model Convergence

A

after enough iterations, the chain stabilises and its samples approximate the true posterior

10
Q

Metropolis-Hastings

A
  • MCMC method
  • proposes new samples which are accepted or rejected based on acceptance probability
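A minimal random-walk Metropolis-Hastings sketch (the standard-normal target and tuning values are illustrative, not from the deck):

```python
import math
import random

# Log density of the target (standard normal, up to a constant)
def log_target(x):
    return -0.5 * x * x

random.seed(1)
x, samples = 0.0, []
for _ in range(20_000):
    proposal = x + random.gauss(0.0, 1.0)            # propose a move
    log_alpha = log_target(proposal) - log_target(x)
    # accept with probability min(1, exp(log_alpha)); else keep x
    if log_alpha >= 0 or random.random() < math.exp(log_alpha):
        x = proposal
    samples.append(x)

burned = samples[5_000:]                             # discard burn-in
mean = sum(burned) / len(burned)                     # close to 0
```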
11
Q

Gibbs

A
  • special case of MH
  • each parameter sampled conditionally on others
  • updates one parameter at a time whilst the others stay fixed
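A small Gibbs-sampling sketch (the bivariate-normal target and correlation value are illustrative): each coordinate is drawn from its full conditional while the other stays fixed.

```python
import random

# Gibbs sketch for a standard bivariate normal with correlation rho:
# x | y ~ N(rho * y, 1 - rho^2), and symmetrically for y | x.
random.seed(2)
rho = 0.8
sd = (1 - rho ** 2) ** 0.5
x, y = 0.0, 0.0
xs, ys = [], []
for _ in range(20_000):
    x = random.gauss(rho * y, sd)   # update x given current y
    y = random.gauss(rho * x, sd)   # update y given new x
    xs.append(x)
    ys.append(y)
# the sample correlation of (xs, ys) is close to rho
```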
12
Q

how to assess model convergence

A
  • Traceplots
  • Gelman PSRF
13
Q

Traceplots

A
  • plot of the sampled values against iteration number
  • want a well-mixed, stationary chain - one that has explored the whole space and is not stuck in one region
14
Q

PSRF

A
  • compute the between-chain variance and compare it to the within-chain variance across multiple chains
  • a value close to 1 indicates convergence (the variances agree)
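The computation on this card can be sketched as follows (split-chain refinements of the modern R-hat are omitted; chains are plain lists of draws):

```python
# Gelman-Rubin potential scale reduction factor (PSRF) sketch.
def psrf(chains):
    m = len(chains)                  # number of chains
    n = len(chains[0])               # draws per chain
    means = [sum(c) / n for c in chains]
    grand = sum(means) / m
    # between-chain variance B and mean within-chain variance W
    B = n * sum((mu - grand) ** 2 for mu in means) / (m - 1)
    W = sum(sum((x - mu) ** 2 for x in c) / (n - 1)
            for c, mu in zip(chains, means)) / m
    var_hat = (n - 1) / n * W + B / n
    return (var_hat / W) ** 0.5      # close to 1 when chains agree
```

Chains exploring the same region give a PSRF near 1; chains stuck in different regions inflate the between-chain variance and push the PSRF well above 1.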
15
Q

how to do model checking

A

posterior predictive model checking

16
Q

Posterior Predictive Checking

A
  • for each MCMC iteration, simulate a new dataset from that iteration's parameter draws - replicated data
  • compare to original data either through values or summary statistics
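A sketch of the procedure with a stand-in posterior (a real analysis would use the MCMC draws; data and numbers here are illustrative):

```python
import random

# Posterior predictive check sketch: for each posterior draw of mu,
# simulate a replicate dataset and compare a summary statistic
# (the mean) against the observed value.
random.seed(3)
observed = [random.gauss(1.0, 1.0) for _ in range(50)]
obs_mean = sum(observed) / len(observed)

# stand-in posterior draws for mu (illustrative, not real MCMC output)
posterior_mu = [random.gauss(obs_mean, 0.15) for _ in range(1_000)]

rep_means = []
for mu in posterior_mu:
    replicate = [random.gauss(mu, 1.0) for _ in range(len(observed))]
    rep_means.append(sum(replicate) / len(replicate))

# Bayesian p-value: fraction of replicated means above the observed mean;
# values near 0 or 1 flag a poorly fitting model
p_value = sum(m > obs_mean for m in rep_means) / len(rep_means)
```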
17
Q

Moving Window Approach

A
  • assesses whether the temporal structure of the original series is captured by the replicated time series
  • define a window length and move the window through the replicates - generating overlapping windows
  • compute summary statistics, e.g. the sd, for each window
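The windowing step can be sketched as (window length and series are illustrative):

```python
import statistics

# Moving-window sketch: slide a fixed-length window along a series and
# compute the standard deviation within each overlapping window.
def moving_window_sd(series, window):
    return [statistics.stdev(series[i:i + window])
            for i in range(len(series) - window + 1)]

sds = moving_window_sd([1, 2, 4, 7, 11, 16], 3)  # 4 overlapping windows
```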
18
Q

Hierarchical Model

A
  • multi-level model
  • parameters depend on parameters at a higher level
  • useful for grouped data - handles variability across groups
19
Q

Hierarchical Mixed-effects Model

A
  • combines both fixed and random effects
  • allows both population-level and group-specific variations
20
Q

Fixed Effect

A
  • assume the effect is the same (fixed) across all groups
21
Q

Random Effect

A
  • assume individual differences are drawn from a probability distribution
22
Q

Nested Effects

A
  • lower-level parameters are contained within higher-level parameters
  • account for dependencies, simplifying the model and improving MCMC mixing
23
Q

Bayesian Coverage

A
  • % of times a posterior predictive interval at a given uncertainty level contains the original value
  • e.g. with 95% uncertainty intervals, 95% of them should include the original data value
  • higher than nominal - intervals too wide (too uncertain)
  • lower than nominal - intervals too narrow (too confident)
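A sketch of the coverage calculation (the intervals below are made-up stand-ins for posterior predictive intervals):

```python
# Coverage sketch: fraction of observations whose posterior predictive
# interval (given as a (low, high) pair) contains the original value;
# compare the result to the nominal level, e.g. 0.95.
def coverage(observed, intervals):
    hits = sum(low <= y <= high
               for y, (low, high) in zip(observed, intervals))
    return hits / len(observed)

obs = [1.2, 0.4, 2.9, 1.8]
ints = [(0.0, 2.0), (0.5, 1.5), (2.0, 3.5), (1.0, 2.5)]
print(coverage(obs, ints))  # 3 of 4 intervals contain their value -> 0.75
```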