Chapter 16 Flashcards

1
Q

What is Simple Linear Regression, and how do we use it?

A

MASTERING THE CONCEPT Page 423

16.1: Simple linear regression allows us to determine an equation for a straight line that predicts a person’s score on a dependent variable from his or her score on the independent variable.

We can only use it when the data are approximately linearly related.

2
Q

What is the formula for predicting a z score using the Pearson Correlation Coefficient?

A

MASTERING THE FORMULA Page 424

16-1: The standardized regression equation predicts the z score of a dependent variable, Y, from the z score of an independent variable, X. We simply multiply the independent variable’s z score by the Pearson correlation coefficient to get the predicted z score on the dependent variable:

zŶ = (rXY)(zX)
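
A minimal Python sketch of formula 16-1; the correlation and the z score below are hypothetical values chosen only for illustration:

# Standardized regression: predicted z score on Y = (Pearson r) * (z score on X)
r_xy = 0.50   # hypothetical Pearson correlation between X and Y
z_x = 1.50    # hypothetical z score on the independent variable
z_y_pred = r_xy * z_x
print(z_y_pred)   # 0.75 -- the predicted z score on the dependent variable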

3
Q

What is regression to the mean?

A

Page 425

Regression to the mean is the tendency of scores that are particularly high or low to drift toward the mean over time.

4
Q

What is the intercept in a regression?

A

Page 426

The intercept is the predicted value for Y when X is equal to 0, which is the point at which the line crosses, or intercepts, the y-axis.

5
Q

Explain the slope of a regression?

A

Page 426

The slope is the amount that Y is predicted to increase for an increase of 1 in X.

6
Q

What is the simple regression formula?

A

MASTERING THE FORMULA Page 426

16-2: The simple linear regression equation uses the formula:

Ŷ = a + b(X)

In this formula, X is the raw score on the independent variable and Ŷ is the predicted raw score on the dependent variable; a is the intercept of the line, and b is its slope.
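
A minimal Python sketch of formula 16-2; the intercept, slope, and raw score below are hypothetical values chosen only for illustration:

# Raw-score regression: predicted Y = intercept + slope * X
a = 12.0   # hypothetical intercept (predicted Y when X = 0)
b = 0.75   # hypothetical slope (predicted change in Y per 1-unit increase in X)
x = 20.0   # hypothetical raw score on the independent variable
y_pred = a + b * x
print(y_pred)   # 27.0 -- the predicted raw score on the dependent variable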

7
Q

What is the formula for the standardized regression coefficient, β?

A

MASTERING THE FORMULA Page 430

16-3: The standardized regression coefficient, β, is calculated by multiplying the slope of the regression equation by the quotient of the square root of the sum of squares for the independent variable by the square root of the sum of squares for the dependent variable:

β = (b)(√SSX / √SSY)
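
A minimal Python sketch of formula 16-3 on a small made-up data set; the scores are illustrative, and the slope is computed with the standard least-squares formula (not given on this card). It also checks the point from Mastering the Concept 16.2 that, in simple linear regression, β matches the Pearson correlation coefficient:

import numpy as np

# Hypothetical scores on the independent (X) and dependent (Y) variables
x = np.array([2.0, 4.0, 5.0, 7.0, 9.0])
y = np.array([3.0, 5.0, 4.0, 8.0, 9.0])

ss_x = np.sum((x - x.mean()) ** 2)                    # sum of squares for X
ss_y = np.sum((y - y.mean()) ** 2)                    # sum of squares for Y
b = np.sum((x - x.mean()) * (y - y.mean())) / ss_x    # least-squares slope

beta = b * (np.sqrt(ss_x) / np.sqrt(ss_y))            # formula 16-3
r = np.corrcoef(x, y)[0, 1]                           # Pearson correlation coefficient
print(round(beta, 3), round(r, 3))                    # the two values agree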

8
Q

What is β, or the beta weight?

A

Page 430

The standardized regression coefficient, a standardized version of the slope in a regression equation, is the predicted change in the dependent variable in terms of standard deviations for an increase of 1 standard deviation in the independent variable; symbolized by β and often called the beta weight.

9
Q

Explain the standardized regression coefficient?

A

MASTERING THE CONCEPT Page 431

16.2: A standardized regression coefficient is the standardized version of a slope, much like a z statistic is a standardized version of a raw score.

For simple linear regression, the standardized regression coefficient is identical to the correlation coefficient.

This means that when we conduct hypothesis testing and conclude that a correlation coefficient is statistically significantly different from 0, we can draw the same conclusion about the standardized regression coefficient.

10
Q

How does regression work with correlation?

A

>Regression builds on correlation, enabling us not only to quantify the relation between two variables but also to predict a score on a dependent variable from a score on an independent variable. Page 431

11
Q

How does Standardized Regression Work?

A

>With the standardized regression equation, we simply multiply a person’s z score on an independent variable by the Pearson correlation coefficient to predict that person’s z score on a dependent variable. Page 431

12
Q

Explain Raw-Score Regression?

A

>The raw-score regression equation is easier to use in that the equation itself does the transformations from raw score to z score and back. Page 431

13
Q

What is the standardized regression equation used for?

A

>We use the standardized regression equation to build the regression equation that can predict a raw score on a dependent variable from a raw score on an independent variable. Page 431

14
Q

How do we graph the regression line?

A

>We can graph the regression line,

Ŷ = a + b(X)

based on values for the y-intercept, a (the predicted value on Y when X is 0), and the slope, b (the change in Y expected for a 1-unit increase in X).

Page 431
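
A minimal Python sketch of drawing the line; matplotlib is an assumed plotting library here, and the intercept, slope, and X range are hypothetical values chosen only for illustration:

import numpy as np
import matplotlib.pyplot as plt

a, b = 12.0, 0.75                  # hypothetical y-intercept and slope
x = np.linspace(0, 40, 100)        # range of X values to draw the line over
y_hat = a + b * x                  # predicted Y at each X, from Y-hat = a + b(X)

plt.plot(x, y_hat)                 # the regression line
plt.xlabel("X (independent variable)")
plt.ylabel("Predicted Y")
plt.show()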

15
Q

What does the standardized regression coefficient tell us?

A

>The slope, which captures the nature of the relation between the variables, can be standardized by calculating the standardized regression coefficient.

The standardized regression coefficient tells us the predicted change in the dependent variable in terms of standard deviations for every increase of 1 standard deviation in the independent variable.

Page 431

16
Q

What is the relationship between the standardized regression coefficient and the Pearson correlation coefficient in simple linear regression?

A

>With simple linear regression, the standardized regression coefficient is identical to the Pearson correlation coefficient. Page 431

17
Q

Define the Standard Error of the Estimate.

A

Page 433

The standard error of the estimate is a statistic indicating the typical distance between a regression line and the actual data points.

18
Q

Why does regression to the mean occur?

A

MASTERING THE CONCEPT Page 434

16.3: Regression to the mean occurs because extreme scores tend to become less extreme—that is, they tend to regress toward the mean.

Very tall parents do tend to have tall children, but usually not as tall as they are, whereas very short parents do tend to have short children, but usually not as short as they are.

19
Q

What is the Coefficient of Determination?

A

Page 435

The proportionate reduction in error is a statistic that quantifies how much more accurate predictions are when we use the regression line instead of the mean as a prediction tool; also called the coefficient of determination.

20
Q

What is the proportionate reduction in error and how is it calculated?

A

MASTERING THE FORMULA Page 438

16-4: The proportionate reduction in error is calculated by subtracting the error generated using the regression equation as a prediction tool from the total error that would occur if we used the mean as everyone’s predicted score. We then divide this difference by the total error:

r² = (SStotal − SSerror) / SStotal

We can interpret the proportionate reduction in error as we did the effect-size estimate for ANOVA. It represents the same statistic.
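
A minimal Python sketch of formula 16-4 on a small made-up data set; the scores are illustrative, and the regression line itself is fit with ordinary least squares:

import numpy as np

# Hypothetical scores
x = np.array([2.0, 4.0, 5.0, 7.0, 9.0])
y = np.array([3.0, 5.0, 4.0, 8.0, 9.0])

# Fit the simple regression line, then compute each person's predicted score
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()
y_hat = a + b * x

ss_total = np.sum((y - y.mean()) ** 2)    # error if everyone were predicted to be at the mean
ss_error = np.sum((y - y_hat) ** 2)       # error left over when using the regression line
r_squared = (ss_total - ss_error) / ss_total
print(round(r_squared, 3))                # proportionate reduction in error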

21
Q

What is Proportionate Reduction In Error?

A

MASTERING THE CONCEPT Page 439

16.4: Proportionate reduction in error is the effect size used with regression. It is the same number we calculated as the effect size estimate for ANOVA.

It tells us the proportion of error that is eliminated when we predict scores on the dependent variable using the regression equation versus simply predicting that everyone is at the mean on the dependent variable.

22
Q

What is Multiple Regression?

A

MASTERING THE CONCEPT Page 440

16.5: Multiple regression predicts scores on a single dependent variable from scores on more than one independent variable.

Because behavior tends to be influenced by many factors, multiple regression allows us to better predict a given outcome.

23
Q

What is an Orthogonal Variable?

A

Page 440

An orthogonal variable is an independent variable that makes a separate and distinct contribution in the prediction of a dependent variable, as compared with the contributions of another variable.

24
Q

How does Multiple Regression Work?

A

Page 440

Multiple regression is a statistical technique that includes two or more predictor variables in a prediction equation.

25
Q

What is a Stepwise multiple regression?

A

Page 442

Stepwise multiple regression is a type of multiple regression in which a computer program determines the order in which independent variables are included in the equation.

26
Q

What is the difference between multiple regression and stepwise multiple regression?

A

MASTERING THE CONCEPT Page 443

16.6: In multiple regression, we determine whether each added independent variable increases the amount of variance in the dependent variable that we can explain.

In stepwise multiple regression, the computer program determines the order in which independent variables are added, whereas in hierarchical multiple regression, the researcher chooses the order.

In both cases, however, we report the increase in R² with the inclusion of each new independent variable or variables.
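
A minimal Python sketch of reporting the increase in R² as independent variables are added; the scores and the order of entry are made up, which mimics the hierarchical approach in which the researcher fixes the order:

import numpy as np

def r_squared(predictors, y):
    """R² from an ordinary least-squares fit of y on the given predictor columns."""
    X = np.column_stack([np.ones(len(y))] + predictors)   # include an intercept column
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ coefs
    ss_total = np.sum((y - y.mean()) ** 2)
    ss_error = np.sum((y - y_hat) ** 2)
    return (ss_total - ss_error) / ss_total

# Hypothetical scores: two independent variables and one dependent variable
x1 = np.array([2.0, 4.0, 5.0, 7.0, 9.0, 3.0])
x2 = np.array([1.0, 3.0, 2.0, 6.0, 5.0, 4.0])
y = np.array([3.0, 5.0, 4.0, 8.0, 9.0, 5.0])

r2_step1 = r_squared([x1], y)           # first independent variable entered
r2_step2 = r_squared([x1, x2], y)       # second independent variable added
print(round(r2_step2 - r2_step1, 3))    # the increase in R² we would report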

27
Q

What is a Hierarchical multiple regression?

A

Page 443

Hierarchical multiple regression is a type of multiple regression in which the researcher adds independent variables into the equation in an order determined by theory.

28
Q

What is Structural Equation Modeling?

A

Page 444

Structural equation modeling (SEM) is a statistical technique that quantifies how well sample data “fit” a theoretical model that hypothesizes a set of relations among multiple variables.

29
Q

What is a statistical (or theoretical) model?

A

Page 444

A statistical (or theoretical) model is a hypothesized network of relations, often portrayed graphically, among multiple variables.

30
Q

Explain the term Path?

A

Page 445

Path is the term that statisticians use to describe the connection between two variables in a statistical model.

31
Q

What is Path Analysis?

A

Page 445

Path analysis is a statistical method that examines a hypothesized model, usually by conducting a series of regression analyses that quantify the paths between variables at each succeeding step in the model.

32
Q

What are manifest variables?

A

Page 445

Manifest variables are the variables in a study that we can observe and that are measured.

33
Q

What are Latent Variables?

A

Page 445

Latent variables are the ideas that we want to research but cannot directly measure.

34
Q

How do we use Multiple Regression?

A

> Multiple regression is used to predict a dependent variable from more than one independent variable.

Ideally, these variables are distinct from one another in such a way that they contribute uniquely to the predictions. Page 447

35
Q

Explain a Multiple Regression Equation?

A

> We can develop a multiple regression equation and input specific scores for each independent variable to determine the predicted score on the dependent variable. Page 447
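
A minimal Python sketch of plugging specific scores into a multiple regression equation; the intercept, slopes, and scores are hypothetical values chosen only for illustration:

# Hypothetical multiple regression equation: Y-hat = a + b1*X1 + b2*X2
a = 5.0     # intercept
b1 = 0.25   # slope for the first independent variable
b2 = 1.50   # slope for the second independent variable
x1 = 20.0   # a person's score on the first independent variable
x2 = 4.0    # a person's score on the second independent variable
y_pred = a + b1 * x1 + b2 * x2
print(y_pred)   # 16.0 -- the predicted score on the dependent variable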

36
Q

How is Multiple Regression Used in the Real World?

A

> Multiple regression is the backbone of many online tools that we can use for predicting everyday variables such as traffic or home prices. Page 447

37
Q

What is the difference between stepwise and hierarchical multiple regression?

A

> In stepwise multiple regression, a computer program determines the order in which independent variables are tested; in hierarchical multiple regression, the researcher determines the order. Page 447

38
Q

Explain How Structural Equation Modeling works?

A

> Structural equation modeling (SEM) allows us to examine the “fit” of a sample’s data to a hypothesized model of the relations among multiple variables, including the latent variables that we hypothesize to exist but cannot see. Page 447