14.1: The Simple Linear Regression Model and the Least Squares Point Estimates Flashcards
What is a Simple Linear Regression Model?
A simple linear regression model describes the relationship between a dependent variable y and a single independent variable x as a straight line plus an error term: y = β0 + β1x + ε. It assumes the mean value of y changes linearly with x.
Why are scatter plots used in linear regression analysis?
Scatter plots are used to visualize the relationship between two variables and to decide if a straight-line relationship is appropriate to describe their association.
What are the components of the simple linear regression equation?
The equation y = β0 + β1x + ε has four components: the dependent variable y, the independent variable x, the parameters β0 (y-intercept) and β1 (slope), and the error term ε.
What does the regression line represent in simple linear regression?
In simple linear regression, the regression line represents the line of best fit through the data points on a scatter plot.
It shows the mean value of y for a given x and is expressed as
y = β0 + β1x,
where β0 is the y-intercept and β1 is the slope of the line.
What is the least squares method in the context of linear regression?
The least squares method is a statistical technique used to determine the line of best fit by minimizing the sum of the squares of the residuals (the differences between observed and predicted values).
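To illustrate "minimizing the sum of squared residuals," the sketch below (in Python, with made-up data) computes the least squares line and checks that nearby candidate lines all have a larger sum of squared errors:

```python
# Toy data with a roughly linear trend (hypothetical values).
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Least squares point estimates (formulas covered in later cards).
ss_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
ss_xx = sum((x - x_bar) ** 2 for x in xs)
b1 = ss_xy / ss_xx
b0 = y_bar - b1 * x_bar

def sse(intercept, slope):
    """Sum of squared residuals for a candidate line y = intercept + slope*x."""
    return sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))

# The least squares line beats perturbed lines on SSE.
assert sse(b0, b1) < sse(b0 + 0.5, b1)
assert sse(b0, b1) < sse(b0, b1 + 0.1)
```

Perturbing either the intercept or the slope away from the least squares estimates always increases the sum of squared residuals.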
How do you interpret the slope and y-intercept in a simple linear regression model?
The slope (β1) represents the change in the mean of the dependent variable for each one-unit increase in the independent variable.
The y-intercept (β0) is the mean value of the dependent variable when the independent variable equals zero (a meaningful interpretation only if x = 0 lies within the range of the observed data).
How are the slope (β1) and y-intercept (β0) estimated in simple linear regression?
The slope (β1) is estimated as SSxy/SSxx, where SSxy is the sum of the products of the deviations of x and y from their means, and SSxx is the sum of squared deviations of x from its mean.
The y-intercept (β0) is estimated as the mean of y minus the product of the estimated slope and the mean of x.
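A minimal worked example of these two estimation formulas in Python, using a small hypothetical dataset that lies exactly on the line y = 3 + 2x, so the estimates come out exactly:

```python
# Hypothetical data generated from y = 3 + 2x exactly.
xs = [1, 2, 3, 4]
ys = [5, 7, 9, 11]

n = len(xs)
x_bar = sum(xs) / n   # mean of x = 2.5
y_bar = sum(ys) / n   # mean of y = 8.0

# SSxy: sum of products of deviations of x and y from their means.
# SSxx: sum of squared deviations of x from its mean.
ss_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))  # 10.0
ss_xx = sum((x - x_bar) ** 2 for x in xs)                       # 5.0

b1 = ss_xy / ss_xx        # estimated slope: 10.0 / 5.0 = 2.0
b0 = y_bar - b1 * x_bar   # estimated intercept: 8.0 - 2.0 * 2.5 = 3.0
```

Because the data fall exactly on a line, the estimates recover the slope 2 and intercept 3 used to generate them.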
What is a residual in linear regression?
A residual is the difference between an observed value of the dependent variable and the value predicted by the regression line. It represents the error in the prediction for that observation.
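A useful property worth remembering: for a least squares fit, the residuals sum to (numerically) zero. A short Python illustration with made-up data:

```python
# Made-up data with some scatter around a line.
xs = [1, 2, 3, 4]
ys = [2, 4, 5, 8]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
     sum((x - x_bar) ** 2 for x in xs)
b0 = y_bar - b1 * x_bar

# Residual = observed y minus the value predicted by the fitted line.
residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]

# Least squares residuals sum to zero (up to floating-point rounding).
assert abs(sum(residuals)) < 1e-9
```

Individual residuals are generally nonzero; it is only their sum (and their sum weighted by x) that the least squares fit drives to zero.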
Why can regression analysis show that x and y increase together without proving causation?
Regression can show that two variables move together and one variable can predict the other, but it cannot prove that changes in the independent variable cause changes in the dependent variable.
Other factors or third variables may be influencing both.
How do you calculate the slope (β1) in the least squares regression model?
The slope (β1) is calculated using the formula
β1 = SSxy / SSxx,
where SSxy = Σ(x − x̄)(y − ȳ) is the sum of the products of the deviations of x and y from their means, and SSxx = Σ(x − x̄)² is the sum of squared deviations of x from its mean.
How do you calculate the y-intercept (β0) in the least squares regression model?
The y-intercept (β0) is calculated using the formula
β0 = ȳ - β1x̄,
where ȳ is the mean of the dependent variable y,
x̄ is the mean of the independent variable x, and
β1 is the slope of the regression line.