Econometrics 2: Bivariate Linear Regression 2.3 - Ordinary Least Squares Estimation Flashcards
Describe & explain the first-order conditions of OLS
OLS is the procedure that minimises the sum of squared residuals $S = \sum_{i=1}^{n} \hat{u}_i^2$, i.e. we need to

\min S = \min \sum_{i=1}^{n} \hat{u}_i^2 = \min \sum_{i=1}^{n} (Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i)^2
We know that to find the maximum or minimum of a function we need to differentiate and set to
0. This gives what we call the first-order conditions:
\partial S / \partial \hat{\beta}_0 = -2 \sum_{i=1}^{n} (Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i) = 0    (4)

\partial S / \partial \hat{\beta}_1 = -2 \sum_{i=1}^{n} X_i (Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i) = 0    (5)
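To spell out the step between these conditions and the solutions below: dividing (4) and (5) by $-2$ and rearranging gives the so-called normal equations (a standard intermediate step, written out here for clarity):

\sum_{i=1}^{n} Y_i = n \hat{\beta}_0 + \hat{\beta}_1 \sum_{i=1}^{n} X_i

\sum_{i=1}^{n} X_i Y_i = \hat{\beta}_0 \sum_{i=1}^{n} X_i + \hat{\beta}_1 \sum_{i=1}^{n} X_i^2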
On solving these equations, we find that
\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}    (6)

\hat{\beta}_1 = \frac{\sum (X_i - \bar{X})(Y_i - \bar{Y})}{\sum (X_i - \bar{X})^2} = \frac{\sum X_i Y_i - n \bar{X} \bar{Y}}{\sum X_i^2 - n \bar{X}^2}    (7)
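As a minimal numerical sketch (the data values and variable names here are made up for illustration, not taken from the text), the following Python code plugs a small dataset into estimator equations (6) and (7), and also checks that the two forms of (7) give the same slope:

import numpy as np

# Hypothetical dataset, purely for illustration
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(X)
X_bar, Y_bar = X.mean(), Y.mean()

# Equation (7), deviations-from-means form: slope estimator
beta1_hat = np.sum((X - X_bar) * (Y - Y_bar)) / np.sum((X - X_bar) ** 2)

# Equation (7), equivalent raw-sums form
beta1_hat_alt = (np.sum(X * Y) - n * X_bar * Y_bar) / (np.sum(X ** 2) - n * X_bar ** 2)

# Equation (6): intercept estimator
beta0_hat = Y_bar - beta1_hat * X_bar

print(beta0_hat, beta1_hat)                  # the two estimates
print(np.isclose(beta1_hat, beta1_hat_alt))  # both forms of (7) agree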
These are the OLS estimators of $\beta_0$ and $\beta_1$ and, as you can see, they are simply equations that involve our data, $X$ and $Y$. We can now see how we combine data with a statistical technique to get estimates of the unknown parameters in the regression model. We have our data on the variables $X$ and $Y$. We have the statistical technique of OLS, which gives us the formulae with which we can use the data to obtain our estimates: we input the values of our dataset, $X$ and $Y$, into our estimator equations (6) and (7), and out pop two numbers, one an estimate of $\beta_0$, the other an estimate of $\beta_1$.
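As a quick consistency check (again with hypothetical data, not values from the text), the closed-form estimates from (6) and (7) can be compared against numpy's built-in least-squares line fit, which should produce the same two numbers:

import numpy as np

# Hypothetical dataset, purely for illustration
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form OLS estimates from equations (7) and (6)
b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b0 = Y.mean() - b1 * X.mean()

# np.polyfit with degree 1 fits a straight line by least squares
# and returns the coefficients as (slope, intercept)
slope, intercept = np.polyfit(X, Y, 1)
print(np.isclose(b1, slope), np.isclose(b0, intercept))  # expect: True True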