Topic 2: Simple Linear Regression Flashcards

1
Q

In a SLR model, if xbar = 0, what does beta^_0 equal?

A

beta^_0 = ybar

2
Q

Rearrange the fitted SLR model to solve for beta^_0.

A

beta^_0 = ybar - beta^_1*xbar

3
Q

Write the residual for observation i, u^_i, in terms of y_i’s

A

u^_i = y_i - y^_i

4
Q

What is the residual for observation i, u^_i, equal to in terms of a SLR model?

A

u^_i = y_i - beta^_0 - beta^_1*x_i

5
Q

How do you derive beta^_1 from sums?

A

[the sum of (x_i - xbar)(y_i - ybar)] / [the sum of (x_i - xbar)^2], which is the sample covariance between x and y divided by the sample variance of x.

6
Q

How would you find the coefficients of a model if given a table of the data?

A

Find xbar and ybar, use the formula for beta^_1 (sample covariance of x and y divided by the sample variance of x), then plug all three into beta^_0 = ybar - beta^_1*xbar.
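
A minimal sketch of this calculation in Stata, assuming the data are already in memory with variables named x and y (hypothetical names; regress y x would report the same coefficients directly):

* sample means of x and y
quietly summarize x, meanonly
scalar xbar = r(mean)
quietly summarize y, meanonly
scalar ybar = r(mean)
* slope: sum of cross-deviations over sum of squared deviations of x
generate num = (x - xbar)*(y - ybar)
generate den = (x - xbar)^2
quietly summarize num
scalar b1 = r(sum)
quietly summarize den
scalar b1 = b1 / r(sum)
* intercept from beta^_0 = ybar - beta^_1*xbar
scalar b0 = ybar - b1*xbar
display "beta^_1 = " b1
display "beta^_0 = " b0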

7
Q

What is the formula for the fitted value, y^_i, of observation i in terms of a SLR model?

A

y^_i = beta^_0 + beta^_1*x_i

8
Q

What is the sum of squared residuals, SSR?

A

the sum of (u^_i)^2, or equivalently the sum of (y_i - y^_i)^2

9
Q

What is a simple regression model used for?

A

Used to study the relationship between two variables as a tool for empirical analysis.

10
Q

Names for y variable in regressions?

A

dependent, explained, response, predicted, regressand

11
Q

Names for x variable in regressions?

A

independent, explanatory, control, predictor, regressor

12
Q

Describe the linear relationship in a simple regression model.

A

If the error term u is held fixed (its change is zero), x has a linear effect on y: for every one-unit increase in x, y changes by beta_1.

13
Q

Besides E(u)=0, how can you ensure a ceteris paribus relationship in your SLR model?

A

The zero conditional mean assumption of E(u|x) = 0

14
Q

What is the conditional expectation function, i.e. the population regression function?

A

E(y|x) = (beta_0) + (beta_1)x, when the zero conditional mean assumption holds
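
A one-step sketch of where this comes from, taking expectations of the SLR model y = beta_0 + beta_1*x + u conditional on x:

E(y|x) = E(beta_0 + beta_1*x + u | x) = beta_0 + beta_1*x + E(u|x) = beta_0 + beta_1*x, since E(u|x) = 0 under the zero conditional mean assumption.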

15
Q

What is the minimizing function for the derivation of the OLS estimators in SLR?

A

the sum of [y_i - beta_0 - beta_1*x_i]^2. This is the sum of squared deviations that ordinary least squares minimizes by choosing the coefficients beta_0 and beta_1.

16
Q

What is the error term, u, of a SLR equal to in terms of the population parameters?

A

u = y - beta_0 - beta_1*x

17
Q

What are the first order conditions of the minimizing function for OLS estimators in SLR?

A

Take the partial derivatives of the minimizing function with respect to beta_0 and beta_1 to get two FOCs: -2 * the sum of [y_i - beta^_0 - beta^_1*x_i] = 0 and -2 * the sum of x_i*[y_i - beta^_0 - beta^_1*x_i] = 0.
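
A sketch of how these two conditions give the estimators (in the deck's notation): dividing the first FOC by n gives ybar = beta^_0 + beta^_1*xbar, so beta^_0 = ybar - beta^_1*xbar. Substituting this into the second FOC and rearranging gives beta^_1 = [the sum of (x_i - xbar)(y_i - ybar)] / [the sum of (x_i - xbar)^2], provided the sum of (x_i - xbar)^2 is greater than zero.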

18
Q

What is another way to write the formula (beta^_1) = [the sum of (x_i - xbar)(y_i - ybar)] / [the sum of (x_i - xbar)^2]?

A

(beta^_1) = [the sum of (y_i)(x_i - xbar)] / [the sum of (x_i - xbar)^2]
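
The two versions are equal because deviations from the mean sum to zero: the sum of (x_i - xbar)(y_i - ybar) = the sum of (x_i - xbar)*y_i - ybar*[the sum of (x_i - xbar)] = the sum of (x_i - xbar)*y_i, since the sum of (x_i - xbar) = 0.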

19
Q

What is homoskedasticity?

A

The errors in a regression have constant variance conditional on the explanatory variables, thus u has the same variance given any value of x.

20
Q

Under SLR1-4, do OLS estimates equal the true population parameters?

A

No, but under SLR1-SLR4 the expected value of each estimator equals the corresponding population parameter, making the OLS estimators unbiased.

21
Q

Would a violation of the zero conditional mean assumption make an estimator biased?

A

Yes.

22
Q

What is the expected value of residuals equal to?

A

0; by the first OLS first-order condition, both the sum and the sample mean of the residuals are 0.

23
Q

What is the bias in the estimator beta^_1 if E(u_i) does not equal 0?

A

The bias is the expected value of the sampling-error term [the sum of (x_i - xbar)*u_i] / [the sum of (x_i - xbar)^2], which is no longer zero.
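
The decomposition behind this, obtained by substituting y_i = beta_0 + beta_1*x_i + u_i into the formula for beta^_1: beta^_1 = beta_1 + [the sum of (x_i - xbar)*u_i] / [the sum of (x_i - xbar)^2]. So E(beta^_1) = beta_1 only when the second term has expectation zero, which is what the zero conditional mean assumption delivers.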

24
Q

What is the formula for the total sum of squares (SST)?

A

SST = the sum of (y_i - ybar)^2; it measures the total sample variation in the outcomes y_i (how spread out they are around ybar).

25
Q

What is the explained sum of squares (SSE)?

A

SSE = the sum of (y^_i - ybar)^2; it measures the sample variation in the fitted values y^_i.

26
Q

What is the residual sum of squares (SSR)?

A

SSR = the sum of (u^_i)^2; it measures the sample variation in the residuals u^_i.

27
Q

What is the formula for the total variation in y (explained and unexplained)?

A

SST = SSE + SSR
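
A quick Stata check of this decomposition (a sketch, assuming no missing data; after regress, e(mss) is the deck's SSE and e(rss) is the deck's SSR, so the two displayed numbers match):

regress y x
* explained (model) sum of squares plus residual sum of squares
display e(mss) + e(rss)
* total sum of squares computed directly from y
quietly summarize y
display r(Var)*(r(N) - 1)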

28
Q

Stata: how do you get a frequency table of a variable?

A

tab varname (short for tabulate varname)
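
A minimal usage example with Stata's built-in auto dataset (rep78 is just an illustrative variable):

sysuse auto, clear
* one-way frequency table of the repair record
tabulate rep78
* tab is the usual abbreviation
tab rep78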

29
Q

What is Var(beta^_1)?

A

= sigma^2 / SST_x, where SST_x = the sum of (x_i - xbar)^2 (this holds under SLR1-SLR5, conditional on the sample values of x).

30
Q

What is Var(y|x)?

A

= sigma^2, which is a constant and a measure of the variation in y given x.

31
Q

What is SLR 5?

A

Homoskedasticity, u has the same variance given any value of x, Var(u|x) = sigma^2 which is the error variance.

32
Q

What is Var(beta_1), the population parameter (no hat)?

A

= 0, because beta_1 is a fixed population parameter, not a random variable.

33
Q

State the change in a level-level model.

A

change in y = beta_1 * (change in x)

34
Q

State the change in a level-log model.

A

change in y = (beta_1/100) * (% change in x)

35
Q

State the change in a log-level model.

A

% change in y = (100*beta_1) * (change in x)

36
Q

State the change in a log-log model.

A

% change in y = beta_1 * (% change in x); beta_1 is the elasticity of y with respect to x.

37
Q

How can you remember the change of functional models?

A

A log on a variable makes its change a % change. For level-log, divide: beta_1/100. For log-level, multiply: 100*beta_1. For log-log, beta_1 is an elasticity.
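
A worked example with hypothetical numbers (the wage/educ variables are illustrative, not from this deck):

log-level: log(wage) = beta_0 + beta_1*educ + u with beta_1 = 0.08 means one more year of educ is associated with roughly a 100*0.08 = 8% higher wage.
log-log: log(y) = beta_0 + beta_1*log(x) + u with beta_1 = 0.08 means a 1% increase in x is associated with about a 0.08% increase in y (beta_1 is the elasticity).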