general 2 Flashcards

1
Q

regression is very sensitive to

A

rounding

2
Q

which line passes through the mean of the dependent variable?

A

the regression line (the least-squares line always passes through the point of the means).

3
Q

SSR =

A

SST - SSE
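
The identity above can be checked numerically. A minimal sketch with made-up data, fitting the least-squares line by hand:

```python
# Hypothetical data to illustrate SSR = SST - SSE for a least-squares fit.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

mean_x = sum(x) / n
mean_y = sum(y) / n

# Least-squares slope and intercept.
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / sum(
    (xi - mean_x) ** 2 for xi in x
)
b0 = mean_y - b1 * mean_x
y_hat = [b0 + b1 * xi for xi in x]

sst = sum((yi - mean_y) ** 2 for yi in y)              # total variation
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # residual (unexplained)
ssr = sum((yh - mean_y) ** 2 for yh in y_hat)          # explained by the line

assert abs(ssr - (sst - sse)) < 1e-9  # SSR = SST - SSE holds
```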

4
Q

when is SST equal to SSE?

A

when we don't have an independent variable, i.e., the model has only an intercept.

5
Q

this relates to the case where we estimate the line using just the average of the dependent variable, so

A

we have a horizontal line with zero slope.

6
Q

SSE is related to

A

the regression model; it is the difference between the observed values and the predicted values.

7
Q

SST is

A

the difference between the observed values and the mean of the dependent variable (when we have just a horizontal line).

8
Q

SSR is

A

the difference between the two lines: the horizontal line at the mean of the dependent variable (no independent variable) and the regression line (with the independent variable).

9
Q

SSR is

A

the explained sum of squares, not the residual; the total squared residual of the regression line is SSE.

10
Q

R-squared =

A

SSR / SST

11
Q

the degrees of freedom used up in simple regression is always

A

two, because we estimate just the slope and the intercept (leaving n - 2 degrees of freedom for the error).

12
Q

MSE=

A

SSE / degrees of freedom (n - 2 in simple regression)

13
Q

MSE, or mean squared error, is

A

the variance of the error; it shows how spread out the data points are around the regression line.

14
Q

why is SSE divided by n - 2 and not just by n?

A

because we are dealing with a sample, not the population; dividing by n gives the average, but two degrees of freedom are lost estimating the slope and intercept.

15
Q

the standard error is

A

the standard deviation of the error term: the average distance that observations fall from the regression line, in the units of the dependent variable.

16
Q

the standard error formula is

A

the square root of the MSE
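
The chain SSE, then MSE, then s can be sketched with hypothetical numbers:

```python
import math

# Illustrative values, not from any real dataset.
n = 5        # number of observations
sse = 0.092  # residual sum of squares from a fitted simple regression

df = n - 2          # two parameters estimated: slope and intercept
mse = sse / df      # mean squared error = variance of the error term
s = math.sqrt(mse)  # standard error of the estimate, in units of y
```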

17
Q

Priority

A

first: SSE; second: MSE; third: s, the standard error.

18
Q

when the estimated beta is significantly different from zero, then

A

the null hypothesis (that beta equals zero) will be rejected.

19
Q

adding more independent variables will lead to

A

overfitting and other problems, so it is not always a good procedure. Second part: multicollinearity is one of these problems; it means the independent variables are correlated with each other.

20
Q

when two independent variables are correlated with each other

A

we cannot be sure which of them explains the variation in the dependent variable.

21
Q

regression types

A

the regression model, the regression equation, and the estimated regression equation.

22
Q

in multiple regression, each coefficient interpreted as

A

an estimate of the change in y corresponding to a one-unit change in one independent variable while all other variables remain constant.
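
This "holding all other variables constant" interpretation can be checked numerically; a sketch with simulated data and NumPy least squares (the coefficients 2.0 and -1.5 are made up):

```python
import numpy as np

# Simulate data where the true coefficients are known.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
y = 5 + 2.0 * x1 - 1.5 * x2 + rng.normal(scale=0.1, size=100)

# Fit y on an intercept, x1, and x2.
X = np.column_stack([np.ones(100), x1, x2])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

# A one-unit increase in x1, holding x2 fixed, changes the prediction by b1.
pred_low = b0 + b1 * 1.0 + b2 * 0.5
pred_high = b0 + b1 * 2.0 + b2 * 0.5
assert abs((pred_high - pred_low) - b1) < 1e-9
```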

23
Q

the primary purpose of log in regression:

A

it is a way of rescaling skewed data so that it takes a form closer to the normal distribution.

24
Q

price = 1079ln(x), interpret:

A

a 1 percent increase in x will lead to a 1079/100 ≈ 10.79 unit increase in price.

25
Q

ln price = 0.197ln(x), interpret:

A

a 1 percent increase in x will lead to a 0.197 percent increase in price. When both sides have ln, no dividing or multiplying by 100 is needed!
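
Both interpretations (level-log from the previous card, log-log from this one) can be verified numerically with the given coefficients; x = 50 below is an arbitrary starting point:

```python
import math

x = 50.0  # arbitrary starting value

# Level-log model: price = 1079 * ln(x).
# A 1% increase in x raises price by roughly 1079 / 100.
p0 = 1079 * math.log(x)
p1 = 1079 * math.log(x * 1.01)
assert abs((p1 - p0) - 1079 / 100) < 0.1

# Log-log model: ln(price) = 0.197 * ln(x).
# A 1% increase in x raises price by roughly 0.197 percent.
q0 = math.exp(0.197 * math.log(x))
q1 = math.exp(0.197 * math.log(x * 1.01))
pct_change = (q1 - q0) / q0 * 100
assert abs(pct_change - 0.197) < 0.01
```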

26
Q

polynomial regression adds

A

extra independent variables that are powers of the original variable.

27
Q

quadratic model:

A

it adds the square of the independent variable, which allows the model to capture curvature seen in the original scatter plot.
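
A minimal sketch of such a quadratic fit, assuming NumPy; the data is generated from an exact quadratic so the fit recovers the coefficients:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = x ** 2 - 3 * x + 2  # exactly quadratic, for illustration

# Add x**2 as an extra regressor so the model can capture curvature.
X = np.column_stack([np.ones_like(x), x, x ** 2])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]
# b0, b1, b2 should recover (2, -3, 1) up to floating-point error.
```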

28
Q

nonlinear quadratic model:

A

it explains more variance: a tighter fit of the observations around the regression line, which reduces the model error.

29
Q

Thus, the change in y is simply

A

b1 multiplied by the change in x (Δy = b1·Δx). This means that b1 is the slope parameter in the relationship between y and x, holding the other factors in u fixed.

30
Q

How can we hope to learn in general about the ceteris paribus effect of x on y, holding other factors fixed, when we are ignoring all those other factors?

A

As long as the intercept b0 is included in the equation, nothing is lost by assuming that the average value of u in the population is zero.