Bayesian Flashcards
Bayesian
treats parameters as random variables with distributions that represent uncertainty in estimates
Prior
initial belief before observing data
Posterior
updated belief after observing data - combines prior and likelihood
Likelihood
probability of observed data given parameters
Conjugate Prior
- prior that when combined with the likelihood results in a posterior of the same family as the prior
- simplifies computation and interpretation
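Not part of the cards, but a minimal sketch of conjugacy: a Beta prior with a binomial likelihood gives a Beta posterior in closed form (no MCMC needed).

```python
# Beta-Binomial conjugacy: Beta(a, b) prior + binomial likelihood
# -> Beta(a + successes, b + failures) posterior, same family as the prior.

def beta_binomial_update(a, b, successes, failures):
    """Return the posterior Beta parameters after observing the data."""
    return a + successes, b + failures

# Flat Beta(1, 1) prior, observe 7 heads in 10 coin flips.
a_post, b_post = beta_binomial_update(1, 1, successes=7, failures=3)
posterior_mean = a_post / (a_post + b_post)  # 8 / 12
```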
MCMC
- Markov Chain Monte Carlo
- generates samples from complex distributions that are difficult to sample from directly
Markov Chain
value of the random variable at the next step depends only on the current state, not on earlier history
Monte Carlo
random sampling techniques used to approximate the target distribution
Model Convergence
after enough iterations, the chain stabilises and its samples approximate the true posterior
Metropolis Hastings
- MCMC method
- proposes new samples which are accepted or rejected based on acceptance probability
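A minimal random-walk Metropolis sketch (a symmetric-proposal special case of MH, so the proposal ratio cancels); the target here is a standard normal, given as a log density:

```python
import random, math

def metropolis_hastings(log_target, n_samples, x0=0.0, step=1.0, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, step), accept with
    probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        log_alpha = log_target(proposal) - log_target(x)  # symmetric proposal
        if math.log(rng.random()) < log_alpha:
            x = proposal          # accept the proposed value
        samples.append(x)         # on rejection the current value repeats
    return samples

# Target: standard normal, log density up to an additive constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, n_samples=20000)
sample_mean = sum(samples) / len(samples)
```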
Gibbs
- special case of MH
- each parameter sampled conditionally on others
- updates one parameter at a time whilst the others stay fixed
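A sketch of Gibbs sampling for a bivariate standard normal with correlation rho, where each full conditional is itself normal (an assumption chosen here precisely so the conditionals are easy to sample):

```python
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampling for a bivariate standard normal with correlation rho:
    the full conditionals are x | y ~ N(rho * y, 1 - rho^2) and vice versa."""
    rng = random.Random(seed)
    sd = (1 - rho * rho) ** 0.5
    x = y = 0.0
    draws = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)  # update x while y stays fixed
        y = rng.gauss(rho * x, sd)  # update y using the new x
        draws.append((x, y))
    return draws

draws = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
```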
how to assess model convergence
- Traceplots
- Gelman PSRF
Traceplots
- plot of sampled values against iteration number
- want a well-mixed, stationary chain - one that explores the whole space and is not stuck in one region
PSRF
- compute the between-chain variance and compare it to the within-chain variance across multiple chains
- values close to 1 indicate convergence (the variances agree)
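A sketch of the PSRF calculation from the card (the classic Gelman-Rubin form; modern samplers use refinements such as split chains), run on four independent stationary chains:

```python
import random

def psrf(chains):
    """Potential scale reduction factor (Gelman-Rubin R-hat) for a list of
    equal-length chains: compares between- to within-chain variance."""
    m = len(chains)            # number of chains
    n = len(chains[0])         # samples per chain
    means = [sum(c) / n for c in chains]
    grand = sum(means) / m
    B = n / (m - 1) * sum((mu - grand) ** 2 for mu in means)   # between
    W = sum(sum((x - mu) ** 2 for x in c) / (n - 1)
            for c, mu in zip(chains, means)) / m               # within
    var_hat = (n - 1) / n * W + B / n   # pooled variance estimate
    return (var_hat / W) ** 0.5

# Four chains drawn from the same target - should give PSRF close to 1.
rng = random.Random(1)
chains = [[rng.gauss(0, 1) for _ in range(5000)] for _ in range(4)]
r_hat = psrf(chains)
```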
how to do model checking
posterior predictive model checking
Posterior Predictive Checking
- for each MCMC iteration, simulate new data from the sampled parameter set - replicated data
- compare to original data either through values or summary statistics
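A sketch of the check for a normal-mean model (the data, the stand-in posterior draws, and the choice of the mean as the summary statistic are all illustrative):

```python
import random, statistics

def posterior_predictive_check(mu_draws, data, sigma=1.0, seed=0):
    """For each posterior draw of mu, simulate a replicated data set of the
    same size and record a summary statistic (here the mean); the Bayesian
    p-value is the fraction of replicated statistics >= the observed one."""
    rng = random.Random(seed)
    observed_stat = statistics.mean(data)
    rep_stats = [statistics.mean([rng.gauss(mu, sigma) for _ in data])
                 for mu in mu_draws]
    return sum(s >= observed_stat for s in rep_stats) / len(rep_stats)

# Hypothetical posterior draws for the mean of a small data set.
data = [0.2, -0.1, 0.4, 0.0, 0.3]
rng = random.Random(7)
mu_draws = [rng.gauss(statistics.mean(data), 0.2) for _ in range(2000)]
p_value = posterior_predictive_check(mu_draws, data)
# Values near 0.5 suggest the replicates resemble the observed data;
# values near 0 or 1 flag a mismatch.
```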
Moving Window Approach
- assess whether the temporal structure of the replicated time series is captured
- define a window length and move window through the replicates - generating overlapping windows
- compute summary statistics, e.g. the sd, for each window
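The windowing step above can be sketched as (function name illustrative):

```python
import statistics

def moving_window_stats(series, window, step=1):
    """Slide a fixed-length window along the series (overlapping windows)
    and compute a summary statistic, here the sample sd, per window."""
    return [statistics.stdev(series[i:i + window])
            for i in range(0, len(series) - window + 1, step)]

series = [1, 2, 1, 3, 8, 9, 8, 10]
window_sds = moving_window_stats(series, window=4)
# One sd per overlapping window; compare these against the same
# statistics computed on the original series.
```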
Hierarchical Model
- multi-level model
- parameters depend on parameters at a higher level
- useful for grouped data - handles variability across groups
Hierarchical Mixed-effects Model
- combines both fixed and random effects
- allows both population-level and group-specific variations
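A simulation sketch of the simplest mixed-effects structure, a random-intercept model (all names and parameter values here are illustrative):

```python
import random

def simulate_random_intercepts(n_groups, n_per_group, beta0=2.0,
                               sigma_group=1.0, sigma_obs=0.5, seed=0):
    """Simulate y_ij = beta0 + u_j + noise, where beta0 is the fixed
    (population-level) effect and u_j ~ N(0, sigma_group) is the random
    (group-level) effect, shared by all observations in group j."""
    rng = random.Random(seed)
    data = []
    for j in range(n_groups):
        u_j = rng.gauss(0.0, sigma_group)       # group-specific deviation
        for _ in range(n_per_group):
            data.append((j, beta0 + u_j + rng.gauss(0.0, sigma_obs)))
    return data

data = simulate_random_intercepts(n_groups=50, n_per_group=20)
overall_mean = sum(y for _, y in data) / len(data)
```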
Fixed Effect
- assume effects are constant across groups - a single population-level value
Random Effect
- assume individual differences are drawn from a probability distribution
Nested Effects
- lower-level parameters are contained within higher-level parameters
- account for dependencies, simplifying the model and improving MCMC mixing
Bayesian Coverage
- % of times a posterior predictive interval at a given uncertainty level contains the original value
- e.g. for 95% uncertainty intervals, 95% should include the original data value
- higher than nominal - intervals too wide (too uncertain)
- lower than nominal - intervals too narrow (too confident)
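The coverage calculation can be sketched as (intervals and data are made-up illustrations, not real model output):

```python
def coverage(intervals, observed):
    """Fraction of observations whose posterior predictive interval
    (lower, upper) contains the original data value."""
    hits = sum(lo <= y <= hi for (lo, hi), y in zip(intervals, observed))
    return hits / len(observed)

# Hypothetical 95% predictive intervals for four observations;
# three of the four contain the observed value.
intervals = [(0.0, 2.0), (1.0, 3.0), (-1.0, 1.0), (2.0, 4.0)]
observed = [1.5, 2.5, 1.2, 3.0]
cov = coverage(intervals, observed)  # compare against the nominal 0.95
```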