Gaussian linear mixed models (GLMMs) Flashcards

1
Q

Two-stage sampling scheme

A

Stage 1: sample m groups.
Stage 2: within each group i, sample n_i units, i = 1,…,m.

2
Q

Two-level hierarchical units and regressors

A

Level 1: n_i observations within group i, ∀ i = 1,…,m, with unit-level regressors that vary within the group.
Level 2: m groups, with group-level regressors that are common to all observations in the same group.

3
Q

IntraClass Correlation (ICC) for intercept-only GLMM

A

Cor(Y_ij, Y_ih) = τ_0² / (τ_0² + σ²) for two distinct observations j ≠ h in the same group i; across groups the correlation is 0.
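As a sanity check, the ICC can be verified by simulation (a Python/NumPy sketch; τ_0² = 4 and σ² = 1 are illustrative values, giving ICC = 0.8):

```python
import numpy as np

rng = np.random.default_rng(0)
tau0_sq, sigma_sq = 4.0, 1.0     # illustrative variance components -> ICC = 0.8
m, pairs = 2000, 2               # many groups, two units per group

gamma = rng.normal(0.0, np.sqrt(tau0_sq), size=m)   # random intercepts, one per group
y = gamma[:, None] + rng.normal(0.0, np.sqrt(sigma_sq), size=(m, pairs))

icc_theory = tau0_sq / (tau0_sq + sigma_sq)
icc_empirical = np.corrcoef(y[:, 0], y[:, 1])[0, 1]  # within-group correlation
print(icc_theory, round(icc_empirical, 2))           # both close to 0.8
```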

4
Q

Random intercept and slope GLMM model

A

Y_ij = β_0 + γ_0i + β_1 x_ij + γ_1i x_ij + ε_ij = β_0i + β_1i x_ij + ε_ij
∀ j = 1,…,n_i, i = 1,…,m, where β_0i = β_0 + γ_0i and β_1i = β_1 + γ_1i
- E[Y_ij] = β_0 + β_1 x_ij (marginal)
- E[Y_ij | γ_0i, γ_1i] = β_0i + β_1i x_ij (conditional)
- Var(Y_ij) = σ² + τ_0² + τ_1² x_ij² + 2 τ_01 x_ij
- Cov(Y_ij, Y_lh) = τ_0² + τ_1² x_ij x_lh + τ_01 (x_ij + x_lh) if i = l, 0 otherwise
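The marginal variance and covariance formulas can be checked by Monte Carlo (a Python/NumPy sketch; Q, σ² and the two covariate values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1, sigma_sq = 1.0, 2.0, 1.0
# Q = Cov(gamma_0i, gamma_1i); entries tau_0^2, tau_01, tau_1^2 are illustrative
Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])
x_ij, x_ih = 3.0, 1.0          # two covariate values within the same group
n_sim = 200_000                # simulate many independent groups

g = rng.multivariate_normal([0.0, 0.0], Q, size=n_sim)   # (gamma_0i, gamma_1i)
y_ij = beta0 + g[:, 0] + (beta1 + g[:, 1]) * x_ij + rng.normal(0, np.sqrt(sigma_sq), n_sim)
y_ih = beta0 + g[:, 0] + (beta1 + g[:, 1]) * x_ih + rng.normal(0, np.sqrt(sigma_sq), n_sim)

var_theory = sigma_sq + Q[0, 0] + Q[1, 1] * x_ij**2 + 2 * Q[0, 1] * x_ij  # = 15
cov_theory = Q[0, 0] + Q[1, 1] * x_ij * x_ih + Q[0, 1] * (x_ij + x_ih)   # = 7
print(var_theory, round(y_ij.var(), 1))
print(cov_theory, round(np.cov(y_ij, y_ih)[0, 1], 1))
```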

5
Q

GLMM general definition

A

Y = Xβ + Uγ + ε
- X: n×p fixed-effects design matrix; β: p×1 with p = k+1
- ε: n×1, ε ∼ N_n(0, σ² I_n)
- U: n×m(q+1), U = blockdiag(U_1,…,U_m)
- γ: m(q+1)×1, γ ∼ N_m(q+1)(0, G = blockdiag(Q,…,Q)), independent of ε
- E[Y] = Xβ
- E[Y | γ] = Xβ + Uγ
- Cov(Y) = V = U G U′ + σ² I_n
- Cov(Y | γ) = σ² I_n
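A minimal numeric sketch of these matrices (Python with NumPy/SciPy; the group sizes and Q are illustrative, with q = 1, i.e. a random intercept and slope per group):

```python
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(2)
m, sigma_sq = 3, 1.0            # m groups; q = 1, so q+1 = 2 random effects each
n_i = [4, 2, 3]                 # illustrative group sizes, n = 9
n = sum(n_i)

# U_i has rows [1, x_ij]; U = blockdiag(U_1, ..., U_m) is n x m(q+1)
U_blocks = [np.column_stack([np.ones(k), rng.normal(size=k)]) for k in n_i]
U = block_diag(*U_blocks)

Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])      # illustrative covariance of each gamma_i
G = np.kron(np.eye(m), Q)       # blockdiag(Q, ..., Q), m(q+1) x m(q+1)
V = U @ G @ U.T + sigma_sq * np.eye(n)

print(U.shape, G.shape, V.shape)        # (9, 6) (6, 6) (9, 9)
# V is block-diagonal: observations in different groups are uncorrelated
print(np.allclose(V[:4, 4:], 0.0))      # True
```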

6
Q

Marginal and conditional formulations of GLMMs

A

Marginal: Y ∼ N_n(Xβ, V = U G U′ + R)
Conditional: Y | γ ∼ N_n(Xβ + Uγ, R), with γ ∼ N_m(q+1)(0, G)

7
Q

GLMMs seen as GLMs

A

ε* = Uγ + ε ∼ N_n(0, V = U G U′ + σ² I_n)
Y = Xβ + ε* ∼ N_n(Xβ, V)

8
Q

Extension of GLMMs to heteroscedasticity and error dependence

A

Y = Xβ + Uγ + ε
- ε ∼ N_n(0, R)
- R: n×n, R = blockdiag(R_1,…,R_m)
Heteroscedasticity: R_i = σ_i² I_{n_i}, a stratification of the variances
Error correlation: R_i with an autocorrelation structure (e.g. ARMA)
To avoid overparametrization, put the emphasis on either R or U_i Q U_i′: make one rich and keep the other simple, since now V = U G U′ + R.
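A sketch of such an R matrix (Python/NumPy; `ar1_block` is my own helper, and the variances and ρ are illustrative):

```python
import numpy as np
from scipy.linalg import block_diag

def ar1_block(n_i, sigma_sq, rho):
    """R_i for AR(1) errors: Cov(eps_ij, eps_ih) = sigma_i^2 * rho^|j-h|."""
    idx = np.arange(n_i)
    return sigma_sq * rho ** np.abs(idx[:, None] - idx[None, :])

# two groups, stratified variances (1.0 vs 2.0) and a common AR(1) dependence
R = block_diag(ar1_block(3, sigma_sq=1.0, rho=0.6),
               ar1_block(2, sigma_sq=2.0, rho=0.6))

print(R.shape)            # (5, 5)
print(R[0, 1], R[0, 2])   # ~0.6 and ~0.36: geometric decay within the first group
print(R[0, 3])            # 0.0: independence across groups
```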

9
Q

Fitting GLMMs in R

A

library(nlme)  # lme() is in the nlme package
lme(fixed = y ~ lvl1 + lvl2, random = ~ lvl1 | group, method = "REML")
- random = pdDiag(~ lvl1) defines a diagonal Q
- weights = varIdent(form = ~ 1 | strata) defines heteroscedasticity between strata

10
Q

GLMM coefficients in R

A

fixef(fit)  # beta (fixed effects)
ranef(fit)  # gamma_i (predicted random effects, one row per group)
coef(fit)   # conditional formulation: group-level coefficients beta + gamma_i

11
Q

Hypothesis testing on Betas in R

A
  • H0: β_i = 0
    summary(fit)  # check the t-statistic
    anova(fit, type = "marginal")  # F equals t² from summary
  • H0: Cβ = 0
    anova(fit, L = C)  # valid only for regressors of a single level at a time, not mixed!
  • Nested models:
    anova(fit_min, fit_max)  # valid only if method = "ML" for both fits!
12
Q

Hypothesis testing on Gammas in R

A
  • H0: the gammas are uncorrelated
    anova(fit_pdDiag, fit_default)
  • H0: γ_i can be excluded
    L <- anova(fit_excluded, fit_not)$L.Ratio[2]
    p <- 0.5*(1 - pchisq(L, r-1)) + 0.5*(1 - pchisq(L, r))  # 50:50 chi-square mixture
  • H0: strata heteroscedasticity fits as well as group heteroscedasticity
    anova(fit_strata, fit_group)
  • H0: homoscedasticity fits as well as heteroscedasticity
    anova(fit_homo, fit_het)
13
Q

Number of parameters of matrix Q

A
  • Proportional to the identity matrix: 1
  • Compound symmetry: 2
  • Diagonal: q
  • Identical covariance, different variances: q+1
  • Unstructured: q(q+1)/2
    Total number of variance parameters: parameters of Q + 1 (for σ²)
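These counts can be encoded in a small lookup table (a Python sketch; the structure names are my own labels for the cases listed above):

```python
# parameter counts for a q x q matrix Q under common covariance structures
def q_params(structure, q):
    return {
        "prop_identity": 1,                # tau^2 * I
        "compound_symmetry": 2,            # one variance + one common covariance
        "diagonal": q,                     # one variance per random effect
        "equal_cov_diff_var": q + 1,       # q variances + one shared covariance
        "unstructured": q * (q + 1) // 2,  # all variances and covariances
    }[structure]

for s in ("prop_identity", "compound_symmetry", "diagonal",
          "equal_cov_diff_var", "unstructured"):
    print(s, q_params(s, q=4))

# total variance parameters in the model = q_params(structure, q) + 1 (for sigma^2)
```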