Chapter 1 Concepts plus regression review Flashcards

1
Q

Three principles of design:

A
  • replication
  • randomization
  • local control of error (blocking)
2
Q

Categories of experimental problems

A
  • Treatment comparisons
  • Variable screening
  • Response surface exploration
  • System optimization
  • System robustness
3
Q

Steps to experiment planning

A
  1. State objective
  2. Choose response
  3. Choose factor and levels
  4. Choose experimental plan
  5. Perform experiment
  6. Analyze the data
  7. Draw conclusions and make recommendations
4
Q

Experimental units vs observational units

A
  • An experimental unit is the item to which a treatment is applied and on which the response is measured
  • An observational unit is a repeated measurement on the same experimental unit; it can be thought of as a technical replicate
5
Q

Replication does the following:

A
  • Replicating experimental units:
    • Allows estimation of the experimental error
    • Improves the precision of the estimates
  • Replicating observational units:
    • minimizes the impact of measurement error
    • does NOT estimate experiment error
6
Q

Randomization does the following

A
  • Distributes the impact of any systematic bias
    • ensures fair comparisons
    • if bias is present, inflates the estimate of error
  • Eliminates presumption bias
7
Q

Local control of error does the following

A
  • Also called blocking
  • reduces the random error among the experimental units
  • controls for anything which might affect the response other than the factors

Two important forms of local control of error:

  • blocking
  • covariates
8
Q

Block

A

Groups of homogeneous units

9
Q

Blocking

A
  1. arranged so that within-block variation is smaller than between-block variation
  2. should be applied to remove the block-to-block variation
  3. randomization is applied to assignments of treatments within the blocks
10
Q

Derive the LSE estimators without matrix notation

A
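
The answer field is blank here; one standard sketch, assuming the SLR model y_i = β0 + β1 x_i + ε_i with E(ε_i) = 0, is:

```latex
% Minimize S(\beta_0,\beta_1) = \sum_i (y_i - \beta_0 - \beta_1 x_i)^2.
\[
\frac{\partial S}{\partial \beta_0} = -2\sum_i (y_i - \beta_0 - \beta_1 x_i) = 0,
\qquad
\frac{\partial S}{\partial \beta_1} = -2\sum_i x_i\,(y_i - \beta_0 - \beta_1 x_i) = 0.
\]
% Solving these normal equations gives
\[
\hat\beta_1 = \frac{S_{xy}}{S_{xx}}
            = \frac{\sum_i (x_i - \bar x)(y_i - \bar y)}{\sum_i (x_i - \bar x)^2},
\qquad
\hat\beta_0 = \bar y - \hat\beta_1 \bar x .
\]
```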
11
Q

Derive the LSE estimators using matrix notation

A

beta_hat = (X'X)^(-1) X'y
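As a sanity check, a small NumPy sketch on made-up data (the data and seed are arbitrary, not from the chapter) that compares the explicit (X'X)^(-1) X'y formula with NumPy's least-squares solver:

```python
import numpy as np

# Illustrative check of beta_hat = (X'X)^{-1} X'y on simulated data
rng = np.random.default_rng(0)
n = 50
x = rng.uniform(0, 10, size=n)
y = 2.0 + 3.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])          # model matrix with intercept
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y   # (X'X)^{-1} X'y

# Same answer from a numerically stabler solver
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat, beta_ls)
```

In practice `lstsq` (or a QR decomposition) is preferred over forming the explicit inverse.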

12
Q

Derive the expectation, variance and covariance estimates of the LSE estimators without using matrix notation

A
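
The answer field is blank here; a standard sketch for SLR, assuming Var(ε_i) = σ² and independent errors, is:

```latex
% Write \hat\beta_1 as a linear combination of the y_i:
\[
\hat\beta_1 = \sum_i c_i y_i, \qquad c_i = \frac{x_i - \bar x}{S_{xx}},
\qquad \sum_i c_i = 0,\ \ \sum_i c_i x_i = 1,\ \ \sum_i c_i^2 = \frac{1}{S_{xx}}.
\]
\[
E(\hat\beta_1) = \sum_i c_i (\beta_0 + \beta_1 x_i) = \beta_1,
\qquad
\mathrm{Var}(\hat\beta_1) = \sigma^2 \sum_i c_i^2 = \frac{\sigma^2}{S_{xx}}.
\]
% Since \hat\beta_0 = \bar y - \hat\beta_1 \bar x:
\[
E(\hat\beta_0) = \beta_0,
\qquad
\mathrm{Var}(\hat\beta_0) = \sigma^2\!\left(\frac{1}{N} + \frac{\bar x^2}{S_{xx}}\right),
\qquad
\mathrm{Cov}(\hat\beta_0, \hat\beta_1) = -\frac{\bar x\,\sigma^2}{S_{xx}}.
\]
```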
13
Q

Derive the expectation, variance, and covariance of the LSE estimators using matrix notation

A
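
The answer field is blank here; a standard sketch, assuming y = Xβ + ε with E(ε) = 0 and Var(ε) = σ²I, is:

```latex
\[
\hat\beta = (X^\top X)^{-1} X^\top y
          = (X^\top X)^{-1} X^\top (X\beta + \varepsilon)
          = \beta + (X^\top X)^{-1} X^\top \varepsilon .
\]
\[
E(\hat\beta) = \beta,
\qquad
\mathrm{Var}(\hat\beta)
  = (X^\top X)^{-1} X^\top (\sigma^2 I)\, X (X^\top X)^{-1}
  = \sigma^2 (X^\top X)^{-1}.
\]
```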
14
Q

How do you estimate σ2 in LSE

A

σ_hat2 = MSE = RSS/(N-k-1)

= SUM (yi - yi_hat)^2 / (N-k-1)

= y^T (I-H) y / (N-k-1)

For SLR, N-k-1 = N-2
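A NumPy sketch on simulated data (the data, seed, and coefficients are arbitrary) showing that the residual-sum and quadratic-form expressions for MSE agree:

```python
import numpy as np

# Check that SUM (yi - yi_hat)^2 equals y'(I - H)y on made-up data
rng = np.random.default_rng(1)
N, k = 40, 2
X = np.column_stack([np.ones(N), rng.normal(size=(N, k))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=N)

H = X @ np.linalg.inv(X.T @ X) @ X.T   # hat matrix
y_hat = H @ y
ss_res = np.sum((y - y_hat) ** 2)      # SUM (yi - yi_hat)^2
quad_form = y @ (np.eye(N) - H) @ y    # y'(I - H)y
mse = ss_res / (N - k - 1)
print(mse)
```

The two expressions match because H is symmetric and idempotent, so (y - Hy)'(y - Hy) = y'(I - H)y.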

15
Q

degrees of freedom in multiple linear regression

A

df overall = N-1

df model = k

df error = N-k-1

Note: this assumes k regressors + intercept

such that the model matrix is N x (k+1)

16
Q

standard error beta1_hat

A

=sqrt(MSE/Sxx)

17
Q

SST =

A

SST = SSReg + SSRes

18
Q

R2 = (in SS components)

A

R2 = SSreg/SST = 1 - SSres/SST

19
Q

R2 = (Pearson format)

A

R2 = Sxy^2 / (Sxx * Syy)

20
Q

ANOVA-style F stat for MLR

A

F = MSreg / MSres = (SSreg/df1) / (SSres/df2)

where df1 = difference in full and reduced model parameters

df2 = total df - number of parameters in full model

21
Q

Confidence interval for beta1_hat in SLR

A

beta1_hat +/- t_(n-2, alpha/2) * se(beta1_hat)

22
Q

Confidence/prediction interval for y_hat( at x0)

A

CI: y_hat(x0) +/- t_(N-2, alpha/2) * sqrt(MSres) * sqrt(1/N + (x0 - x_bar)^2/Sxx)

PI: y_hat(x0) +/- t_(N-2, alpha/2) * sqrt(MSres) * sqrt(1 + 1/N + (x0 - x_bar)^2/Sxx)
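A sketch of both intervals on simulated SLR data (the data, seed, x0, and the 95% level are all arbitrary choices, not from the chapter):

```python
import numpy as np
from scipy import stats

# 95% CI and PI at x0 for a simple linear regression fit
rng = np.random.default_rng(2)
N = 30
x = rng.uniform(0, 5, size=N)
y = 1.0 + 0.8 * x + rng.normal(scale=0.5, size=N)

xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)
b1 = np.sum((x - xbar) * (y - y.mean())) / Sxx
b0 = y.mean() - b1 * xbar
ms_res = np.sum((y - b0 - b1 * x) ** 2) / (N - 2)

x0 = 2.5
y0_hat = b0 + b1 * x0
t_crit = stats.t.ppf(1 - 0.05 / 2, df=N - 2)

half_ci = t_crit * np.sqrt(ms_res) * np.sqrt(1 / N + (x0 - xbar) ** 2 / Sxx)
half_pi = t_crit * np.sqrt(ms_res) * np.sqrt(1 + 1 / N + (x0 - xbar) ** 2 / Sxx)
print((y0_hat - half_ci, y0_hat + half_ci), (y0_hat - half_pi, y0_hat + half_pi))
```

The PI is always wider than the CI because it adds the extra "1" term for the variance of a single new observation.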

25
Q

Extra sum of squares principle

A
  • Two models: Model I (reduced) and Model II (full)
  • Model I is a sub-model of Model II
  • F stat = [(SSred - SSfull) / (q - k)] / [SSfull / df_full]
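The extra-sum-of-squares (partial F) test can be sketched in NumPy on simulated data (the data, seed, and coefficients are arbitrary):

```python
import numpy as np

def ss_res(X, y):
    """Residual sum of squares from an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

rng = np.random.default_rng(3)
N = 60
x1, x2 = rng.normal(size=N), rng.normal(size=N)
y = 1 + 2 * x1 + 0.1 * x2 + rng.normal(size=N)

X_red = np.column_stack([np.ones(N), x1])        # Model I (reduced)
X_full = np.column_stack([np.ones(N), x1, x2])   # Model II (full)

q_minus_k = X_full.shape[1] - X_red.shape[1]     # extra parameters in full model
df_full = N - X_full.shape[1]
F = ((ss_res(X_red, y) - ss_res(X_full, y)) / q_minus_k) / (ss_res(X_full, y) / df_full)
print(F)
```

Since Model I is nested in Model II, SSred >= SSfull always, so F is nonnegative.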
26
Q

R2, Ra2, and Cp (SS’s)

A

R2 = 1 - SSres/SST

Ra2 = 1 - MSres / (SST/df_total)

Cp = SSres/MSE - (N-2p)

27
Q

AIC and BIC

A

AIC = N ln (SSres/N) + 2p

BIC = N ln (SSres/N) + p ln (N)
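A minimal sketch of both criteria in the SSres form (N, p, and SSres here are made-up illustrative values, not from the chapter):

```python
import numpy as np

# AIC and BIC from a residual sum of squares; made-up values
N, p = 50, 3
ss_res = 12.7   # pretend residual sum of squares from some fit

aic = N * np.log(ss_res / N) + 2 * p
bic = N * np.log(ss_res / N) + p * np.log(N)
print(aic, bic)
```

For the same fit, BIC differs from AIC only in the penalty: p ln N versus 2p, so BIC penalizes extra parameters more heavily once ln N > 2 (N > about 7).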

28
Q

Model selection methods

A

Forward selection

Backward elimination

Stepwise

Best subsets

29
Q

E(Cp) =

A

If model is true, E(RSS) = (N-p)σ2

E(Cp) = (N-p)σ2/σ2 - (N-2p)

= (N-p) - (N-2p) = p

30
Q

MLR CI for beta_hatj

A

beta_hatj +/- t_(N-k-1, alpha/2) * sqrt( MSE * [(X'X)^(-1)]_jj )

31
Q

MLR contrast

a^T beta ~ ( , ) and

test stat

A
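
The answer field is blank here; the standard result, assuming normal errors and testing H0: a^T beta = c, is:

```latex
\[
a^\top \hat\beta \sim N\!\left(a^\top \beta,\; \sigma^2\, a^\top (X^\top X)^{-1} a\right),
\]
% with the test statistic (estimating \sigma^2 by MSE):
\[
t = \frac{a^\top \hat\beta - c}{\sqrt{\mathrm{MSE}\; a^\top (X^\top X)^{-1} a}}
\;\sim\; t_{N-k-1} \quad \text{under } H_0 .
\]
```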