Formulas Flashcards
y - ŷ (residual)
e = y - b1 - b2x
b1
Intercept (constant): b1 = ȳ - b2 x̄
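The least squares formulas on these cards can be sketched on toy data (the x and y values below are illustrative, not from the course):

```python
# OLS on toy data: b2 = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2),
# b1 = ybar - b2 * xbar, residual e = y - b1 - b2*x.
x = [1.0, 2.0, 3.0, 4.0, 5.0]   # illustrative sample
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n
b2 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
     / sum((xi - xbar) ** 2 for xi in x)
b1 = ybar - b2 * xbar                                # intercept card
residuals = [yi - b1 - b2 * xi for xi, yi in zip(x, y)]  # residual card
```

A property worth remembering: the OLS residuals sum to (numerically) zero whenever the model includes an intercept.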
MSE
E[(b - β)²] = var(b) + [bias(b)]²
var(b)
E[(b - E(b))²]
Cov(X, Y)
E[(X - E(X))(Y - E(Y))]
Var(b2)
σ² / Σ(x - x̄)²
Var(b1)
(Σx² / N) · Var(b2) = σ²Σx² / [N·Σ(x - x̄)²]
cov(b1, b2)
-x̄ · Var(b2) = -x̄σ² / Σ(x - x̄)²
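The three variance/covariance formulas above can be sketched together, assuming the error variance σ² is known (the value 2.5 below is illustrative):

```python
# Variance and covariance of the OLS estimators for a toy regressor sample.
x = [1.0, 2.0, 3.0, 4.0, 5.0]            # illustrative sample
n = len(x)
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)  # sum (x - xbar)^2
sigma2 = 2.5                              # assumed error variance

var_b2 = sigma2 / sxx                     # sigma^2 / sum(x - xbar)^2
var_b1 = (sum(xi ** 2 for xi in x) / n) * var_b2  # (sum x^2 / N) * var(b2)
cov_b1_b2 = -xbar * var_b2                # -xbar * var(b2)
```

Note the covariance is negative whenever x̄ > 0: overestimating the slope forces the fitted intercept down.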
Reject or fail to reject H0 when the p-value < significance level?
Reject H0
Reject or fail to reject H0 if the t-value exceeds the critical (table) value in a right-tail test (H1: β > c)?
Reject H0
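The two decision rules on these cards are equivalent, which a short sketch can show. Using the standard normal as the reference distribution for simplicity (for small samples the t table applies instead); the test statistic and α below are illustrative:

```python
# Critical-value rule vs p-value rule for a right-tail test.
from statistics import NormalDist

z_stat = 2.40      # computed test statistic (assumed)
alpha = 0.05       # significance level

# Rule 1: reject H0 if the statistic exceeds the critical (table) value
z_crit = NormalDist().inv_cdf(1 - alpha)      # ~1.645 for a 5% right tail
reject_by_critical = z_stat > z_crit

# Rule 2: reject H0 if the p-value is below the significance level
p_value = 1 - NormalDist().cdf(z_stat)
reject_by_p = p_value < alpha
```

Both rules always agree: the statistic is beyond the critical value exactly when its tail probability is below α.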
Properties of OLS estimators
Unbiasedness: E(b) = β (on average the estimator equals the true parameter)
Variance / standard error
Efficiency (smallest variance among linear unbiased estimators)
se
se(b1) = √Var(b1)
se(b2) = √Var(b2)
Gauss-Markov Theorem
Under assumptions SLR1-SLR5, the estimators b1 and b2 have the smallest variance among all linear unbiased estimators
Best Linear Unbiased Estimator (BLUE) of β1 and β2
Central limit theorem
If assumptions SLR1-SLR5 hold and N is sufficiently large, the least squares estimators have an approximate normal distribution, even when the errors are not normal
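The CLT claim can be checked by simulation: draw many samples with deliberately non-normal (skewed exponential) errors and look at the estimates of the slope. All numbers below (true β values, sample size, number of replications) are illustrative:

```python
# Simulation sketch: distribution of b2 across repeated samples when the
# errors are skewed, not normal. With large N it is approximately normal
# and centred on the true slope.
import random

random.seed(0)
beta1, beta2, n = 1.0, 0.5, 100   # assumed true parameters and sample size
estimates = []
for _ in range(1000):
    x = [random.uniform(0, 10) for _ in range(n)]
    # exponential errors shifted to mean zero -> clearly non-normal
    e = [random.expovariate(1.0) - 1.0 for _ in range(n)]
    y = [beta1 + beta2 * xi + ei for xi, ei in zip(x, e)]
    xbar = sum(x) / n
    ybar = sum(y) / n
    b2 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
         / sum((xi - xbar) ** 2 for xi in x)
    estimates.append(b2)

mean_b2 = sum(estimates) / len(estimates)  # close to the true slope 0.5
```

Plotting `estimates` as a histogram would show the roughly bell-shaped sampling distribution the theorem predicts.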
Normalised distribution (Z)
Z = (X - μ) / σ
Z = (b2 - β2) / √Var(b2)
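Both standardisations on this card have the same shape, subtract the mean, divide by the standard deviation. A sketch with illustrative numbers (all values below are assumed, not from the notes):

```python
# Standardising a raw value and a slope estimate the same way.
from math import sqrt
from statistics import NormalDist

# Z = (X - mu) / sd
x_val, mu, sd = 7.0, 5.0, 2.0
z = (x_val - mu) / sd                    # 1.0

# Z = (b2 - beta2) / sqrt(var(b2)), as used for inference on the slope
b2, beta2, var_b2 = 0.9, 0.5, 0.04       # assumed values
z_b2 = (b2 - beta2) / sqrt(var_b2)       # 2.0
p_right = 1 - NormalDist().cdf(z_b2)     # right-tail probability of z_b2
```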
Type I error
Rejecting the null hypothesis when it is true
Type II error
Failing to reject the null hypothesis when it is false