Final session 12a Flashcards

1
Q

Linear regression assumes that:

A

Linearity: the relationship between predictor and outcome is linear
Independence: values of Y are independent (we have independent observations)
Homoscedasticity: the variance of errors does not depend on the value of the predictor, X1
Normality: errors, or residuals in the population, are normal

2
Q

Explain linearity.

A

Relationship between predictor and outcome is linear in the population

3
Q

Explain: values of Y are independent.

A

Observations do not influence one another; random sampling and random assignment help ensure this.

4
Q

Explain: variance of errors does not depend on the value of the predictor, X1.

A

The variance of Y at every value of X1 is the same. This is known as homogeneity of variance, or homoscedasticity.

5
Q

Explain: errors, or residuals in the population, are normal.

A

Y is normally distributed at each value of X1 (normality)

6
Q

What is normality?

A

Y is normally distributed at each value of X (normality)

Errors, or residuals in the population, are normal

7
Q

Explain the homogeneity of variance assumption.

A

The variance of Y at every value of X is the same

8
Q

How do we go about checking the assumptions?

A

Graphical Analysis of Residuals - “Residual Plot”

9
Q

Plotting residuals (e) vs. fitted values (Ŷi) lets us do what?

A

Examine functional form (linear vs. nonlinear); plotting one predictor at a time (e vs. Xj values) is also possible

Evaluate homoscedasticity
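A minimal sketch of this check, using numpy and simulated data (the variables and coefficients here are illustrative assumptions, not from the cards): fit a regression, then compute the residuals that a residual plot would display against the fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated (hypothetical) data: Y depends linearly on X1 with
# constant-variance errors, so the assumptions hold by construction
x = rng.uniform(0, 10, 100)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 100)

# Fit by least squares: design matrix with a column of ones for the intercept
X = np.column_stack([np.ones_like(x), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b          # fitted values (Yhat_i)
e = y - y_hat          # residuals (e_i)

# A residual plot shows e (vertical) vs. y_hat (horizontal); under the
# assumptions the points scatter randomly around zero with roughly
# constant spread across the range of y_hat
```

With an intercept in the model, least-squares residuals always average to zero, so the diagnostic information is in the *pattern* (curvature suggests nonlinearity; a funnel shape suggests heteroscedasticity), not the mean.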

10
Q

What does multiple regression mean?

A

We have more than one predictor variable, or IV

11
Q

In multiple regression, we describe how the DV (Y) changes as what changes?

A

Multiple IVs (Xj)

12
Q

Questions leading to use of multiple regression:

A

Can we improve prediction of Y with more than one IV?
Does our experiment have more than one IV?
Does our theory say that more than one IV affects the DV?

13
Q

Equation for regression line:

A

Ŷ = b0 + b1X1 + b2X2 + · · · + bJXJ
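The equation above can be estimated by least squares; here is a small sketch with numpy on simulated data (the two predictors and the true coefficients 1.0, 2.0, −3.0 are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: outcome built from two IVs with known coefficients
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(0, 0.1, n)

# Design matrix: leading column of ones corresponds to the intercept b0
X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

b0, b1, b2 = b  # estimates of the intercept and the two slopes
```

With little noise, the estimated b0, b1, b2 recover the generating values closely, matching the form Ŷ = b0 + b1X1 + b2X2.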

14
Q

Explain the intercept (b0).

A

Expected (average) value of Y when all predictors are equal to zero

15
Q

Explain the slopes (e.g., bj).

A

Expected change in Y when Xj increases by 1 unit, holding all other predictors constant

Effect of Xj on Y, controlling for other predictors: if we equate observations on the other predictors, what effect does Xj have?

Unique predictive effect of Xj on Y above and beyond the other predictors; similar to a partial correlation

16
Q

Testing individual regression coefficients: what are the hypotheses?

A

H0: βj = 0 (Xj has no unique linear relationship with Y)
H1: βj ≠ 0
A t-test can again be used

17
Q

The proportion of total variance in Y is accounted for by what?

A

The regression model

Can be estimated by:
R2 = SSM / SST
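A short sketch of R2 = SSM / SST with numpy (the data here are simulated for illustration): fit a regression, form the model and total sums of squares, and take their ratio.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data with a real linear relationship plus noise
x = rng.normal(size=100)
y = 3.0 * x + rng.normal(size=100)

X = np.column_stack([np.ones(100), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
ssm = np.sum((y_hat - y.mean()) ** 2)  # model (regression) sum of squares
ssr = np.sum((y - y_hat) ** 2)         # residual sum of squares

r2 = ssm / sst
# SST = SSM + SSR, so r2 also equals 1 - SSR/SST
```

The decomposition SST = SSM + SSR is what guarantees R2 stays between 0 and 1.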

18
Q

What are the properties of the squared multiple correlation (R2)?

A

Ranges from 0 to 1

Larger R2 means more variance of the DV is explained
0: no explanation; 1: perfect explanation

19
Q

If a new IV is added to the model, how does R2 change?

A

SSM will increase, and SSR will decrease
R2 will always increase
Even if the new IV is not related to the DV in the population
This increase may not be significantly different from zero
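This behavior is easy to demonstrate; in the sketch below (simulated data, with a pure-noise predictor that is unrelated to Y in the population by construction), R2 still does not decrease when the noise IV is added:

```python
import numpy as np

rng = np.random.default_rng(3)

n = 100
x1 = rng.normal(size=n)
noise_iv = rng.normal(size=n)        # unrelated to y in the population
y = 2.0 * x1 + rng.normal(size=n)

def r_squared(X, y):
    """R2 = SSM / SST for a least-squares fit of y on the columns of X."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ b
    return np.sum((y_hat - y.mean()) ** 2) / np.sum((y - y.mean()) ** 2)

r2_one = r_squared(np.column_stack([np.ones(n), x1]), y)
r2_two = r_squared(np.column_stack([np.ones(n), x1, noise_iv]), y)

# r2_two >= r2_one: adding any predictor can only reuse or improve the fit
```

Least squares can always set the new coefficient to zero and keep the old fit, so the minimized SSR cannot grow; hence R2 never falls when a predictor is added.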

20
Q

What is adjusted R2?

A

Provides a less biased estimate of the proportion of variance explained in the population

If a new IV is added that doesn’t help predict the outcome, Adjusted R2 may decrease
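A minimal sketch of the standard adjusted-R2 formula, 1 − (1 − R2)(n − 1)/(n − k − 1) for n observations and k predictors (the example numbers are illustrative assumptions):

```python
def adjusted_r2(r2, n, k):
    """Adjusted R2 for n observations and k predictors (excluding intercept)."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Hypothetical example: R2 = 0.50 from n = 50 observations and k = 3
# predictors shrinks to 1 - 0.5 * 49 / 46, roughly 0.467
adj = adjusted_r2(0.5, 50, 3)
```

Because the penalty grows with k, adding an IV that barely improves R2 can lower adjusted R2, which is exactly the behavior the card describes.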