Week 3 - Derivation of the OLS Estimator Flashcards
What does an OLS estimator do?
Minimises the sum of squared residuals (the squared vertical distances between the observed values and the fitted regression line).
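A minimal numerical sketch of this idea, using made-up data purely for illustration: the OLS slope and intercept computed from the standard formulas give a smaller sum of squared residuals than any perturbed estimates.

```python
# Made-up illustrative data (not from the course).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# OLS slope: sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2)
beta1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
        / sum((xi - x_bar) ** 2 for xi in x)
# OLS intercept: y_bar - beta1 * x_bar
beta0 = y_bar - beta1 * x_bar

# Sum of squared residuals at the OLS estimates:
ssr = sum((yi - beta0 - beta1 * xi) ** 2 for xi, yi in zip(x, y))

# Nudging the intercept away from its OLS value can only increase the SSR:
ssr_perturbed = sum((yi - (beta0 + 0.1) - beta1 * xi) ** 2
                    for xi, yi in zip(x, y))
assert ssr < ssr_perturbed
```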
Explain {xi: i = 1, 2, 3, .., n} and write it another way
denotes a sequence of n numbers; the sum of these numbers can then be written as:
∑_{i=1}^{n} xi ≡ x1 + x2 + x3 + … + xn
Summation Property 1
∑_{i=1}^{n} c = nc
Summation Property 2
∑_{i=1}^{n} cxi = c ∑_{i=1}^{n} xi
Summation Property 3
∑_{i=1}^{n} (axi + byi) = a ∑_{i=1}^{n} xi + b ∑_{i=1}^{n} yi
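A quick numerical check of summation Properties 1–3, using made-up numbers purely for illustration:

```python
# Made-up constants and sequences for illustration only.
n = 4
c, a, b = 3.0, 2.0, -1.0
x = [1.0, 2.0, 3.0, 4.0]
y = [5.0, 6.0, 7.0, 8.0]

# Property 1: summing a constant c over n terms gives n*c.
assert sum(c for _ in range(n)) == n * c

# Property 2: a constant factors out of the sum.
assert sum(c * xi for xi in x) == c * sum(x)

# Property 3: linearity of summation.
assert sum(a * xi + b * yi for xi, yi in zip(x, y)) == a * sum(x) + b * sum(y)
```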
Summation Property 4
If x¯ = (∑ xi) / n, then ∑ (xi − x¯) = 0
Summation Property 5
∑ (xi − x¯)^2 = ∑ xi^2 − n x¯^2 = ∑ xi (xi − x¯)
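A numerical check of Properties 4 and 5, again on made-up data:

```python
# Made-up sequence for illustration only.
x = [2.0, 4.0, 6.0, 8.0]
n = len(x)
x_bar = sum(x) / n

# Property 4: deviations from the sample mean sum to zero.
assert abs(sum(xi - x_bar for xi in x)) < 1e-12

# Property 5: three equal ways to write the sum of squared deviations.
lhs = sum((xi - x_bar) ** 2 for xi in x)
mid = sum(xi ** 2 for xi in x) - n * x_bar ** 2
rhs = sum(xi * (xi - x_bar) for xi in x)
assert abs(lhs - mid) < 1e-12
assert abs(lhs - rhs) < 1e-12
```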
Summation Property 6
∑ (xi − x¯)(yi − y¯) = ∑ xi (yi − y¯) = ∑ yi (xi − x¯)
∑ (xi − x¯)(yi − y¯) = ∑ xi yi − n x¯ y¯
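A numerical check of Property 6 on made-up data, confirming all four expressions for the sum of cross-deviations agree:

```python
# Made-up sequences for illustration only.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 1.0, 4.0, 3.0]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n

# Baseline: sum of products of deviations.
cross = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))

# Equal to demeaning only y, only x, or using the shortcut formula.
assert abs(cross - sum(xi * (yi - y_bar) for xi, yi in zip(x, y))) < 1e-12
assert abs(cross - sum(yi * (xi - x_bar) for xi, yi in zip(x, y))) < 1e-12
assert abs(cross - (sum(xi * yi for xi, yi in zip(x, y))
                    - n * x_bar * y_bar)) < 1e-12
```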
Explain Expected Values
For any discrete random variable X taking on a finite number of values
{xi : i = 1, 2, …, n}, the expected value of X is the following weighted
average:
E(X) = x1 f(x1) + x2 f(x2) + … + xn f(xn) ≡ ∑ xi f(xi)
where f(xi) is the probability density function of X (i.e. the probability that X takes the value xi).
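A minimal numerical example, with a made-up probability function purely for illustration:

```python
# Made-up discrete distribution: values and their probabilities f(xi).
values = [0, 1, 2]
probs = [0.2, 0.5, 0.3]   # must sum to 1
assert abs(sum(probs) - 1.0) < 1e-12

# E(X) as a probability-weighted average of the values.
e_x = sum(x * f for x, f in zip(values, probs))
# E(X) = 0*0.2 + 1*0.5 + 2*0.3 = 1.1
assert abs(e_x - 1.1) < 1e-12
```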
EV Property 1
E(c) = c
EV Property 2
For any constants a and b:
E(aX + b) = aE(X) + E(b) = aE(X) + b
This implies that:
E(aX) = aE(X)
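A numerical check of EV Property 2 on a made-up discrete distribution:

```python
# Made-up discrete pmf for illustration only.
values = [1, 2, 3]
probs = [0.5, 0.3, 0.2]

def expect(g):
    """E[g(X)] for the discrete distribution above."""
    return sum(g(x) * f for x, f in zip(values, probs))

a, b = 2.0, 5.0
e_x = expect(lambda x: x)

# EV Property 2: E(aX + b) = aE(X) + b
assert abs(expect(lambda x: a * x + b) - (a * e_x + b)) < 1e-12
# Special case: E(aX) = aE(X)
assert abs(expect(lambda x: a * x) - a * e_x) < 1e-12
```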
EV Property 3
If {ai: i = 1, 2, .., n} are constants and {Xi: i = 1, 2, .., n} are random
variables, then:
E(a1X1 + a2X2 + … + anXn) = a1E(X1) + a2E(X2) + … + anE(Xn)
which, using summations, can be re-written as:
E(∑ aiXi) = ∑ aiE(Xi)
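A numerical check of EV Property 3 with two made-up random variables. They are taken to be independent here only so the joint probabilities are easy to write down as a product; the property itself holds regardless of dependence.

```python
from itertools import product

# Made-up marginal distributions for X1 and X2.
x1_vals, x1_probs = [0, 1], [0.4, 0.6]
x2_vals, x2_probs = [1, 2, 3], [0.2, 0.5, 0.3]
a1, a2 = 2.0, -3.0

e_x1 = sum(v * p for v, p in zip(x1_vals, x1_probs))
e_x2 = sum(v * p for v, p in zip(x2_vals, x2_probs))

# E(a1 X1 + a2 X2) computed directly from the joint distribution
# (joint probability = product of marginals under independence):
lhs = sum((a1 * v1 + a2 * v2) * p1 * p2
          for (v1, p1), (v2, p2) in product(zip(x1_vals, x1_probs),
                                            zip(x2_vals, x2_probs)))
# EV Property 3: equals a1 E(X1) + a2 E(X2).
assert abs(lhs - (a1 * e_x1 + a2 * e_x2)) < 1e-12
```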
Explain Covariance
Given two random variables X and Y, let µX = E(X) and µY = E(Y);
the covariance is computed as
Cov(X, Y) = E[(X − µX)(Y − µY)] = E(XY) − µX µY
If E(X) = 0 or E(Y) = 0, then
Cov(X, Y) = E(XY)
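A numerical check that the two covariance formulas agree, using a made-up joint distribution over (X, Y) pairs:

```python
# Made-up joint distribution over (x, y) pairs; probabilities sum to 1.
support = [(0, 0), (0, 1), (1, 0), (1, 1)]
probs = [0.3, 0.2, 0.1, 0.4]

mu_x = sum(x * p for (x, _), p in zip(support, probs))
mu_y = sum(y * p for (_, y), p in zip(support, probs))

# Definition: E[(X - mu_x)(Y - mu_y)]
cov_def = sum((x - mu_x) * (y - mu_y) * p
              for (x, y), p in zip(support, probs))
# Shortcut: E(XY) - mu_x * mu_y
e_xy = sum(x * y * p for (x, y), p in zip(support, probs))
assert abs(cov_def - (e_xy - mu_x * mu_y)) < 1e-12
```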
EV Property 4
Given two random variables X and Y,
E(XY) = E(X)E(Y)
when X and Y are independent, i.e., the outcome of observing X does
not change the probabilities of the possible outcomes of Y, and
vice versa.
COV Property 1
If X and Y are independent, then
Cov(X, Y) = 0
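A numerical check of EV Property 4 and COV Property 1 together, with made-up independent X and Y (so the joint probability of each pair is the product of the marginals):

```python
from itertools import product

# Made-up marginal distributions; X and Y taken independent.
x_vals, x_probs = [0, 1, 2], [0.2, 0.5, 0.3]
y_vals, y_probs = [1, 4], [0.6, 0.4]

e_x = sum(v * p for v, p in zip(x_vals, x_probs))
e_y = sum(v * p for v, p in zip(y_vals, y_probs))

# E(XY) from the joint distribution (product of marginals):
e_xy = sum(vx * vy * px * py
           for (vx, px), (vy, py) in product(zip(x_vals, x_probs),
                                             zip(y_vals, y_probs)))

# EV Property 4: E(XY) = E(X)E(Y) under independence,
# so COV Property 1 follows: Cov(X, Y) = E(XY) - E(X)E(Y) = 0.
cov = e_xy - e_x * e_y
assert abs(cov) < 1e-12
```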