Week 3 - Derivation of the OLS Estimator Flashcards

1
Q

What does an OLS estimator do?

A

Minimises the sum of squared residuals: OLS chooses the intercept and slope estimates that make ∑_{i=1}^{n} û_i² as small as possible.
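As a quick illustration, the simple-OLS slope and intercept formulas can be computed directly; the data below are made up and chosen so the answers come out exact:

```python
# Sketch of the simple-OLS estimator (made-up data):
#   slope     = Σ(x_i − x̄)(y_i − ȳ) / Σ(x_i − x̄)²
#   intercept = ȳ − slope · x̄
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 6.0, 8.0]          # exactly y = 2x, so slope 2, intercept 0
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n

slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
        / sum((xi - x_bar) ** 2 for xi in x)
intercept = y_bar - slope * x_bar
print(slope, intercept)
```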

2
Q

Explain {x_i : i = 1, 2, 3, …, n} and write it another way

A

denotes a series of n numbers; the sum of these numbers can then be written as:

∑_{i=1}^{n} x_i ≡ x_1 + x_2 + x_3 + … + x_n
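The summation notation is just an accumulating loop; a minimal sketch with made-up values:

```python
# {x_i : i = 1, ..., n} as a Python list, with n = 4 (made-up values)
x = [2.0, 5.0, 1.0, 4.0]

total = 0.0
for xi in x:          # accumulates x_1 + x_2 + ... + x_n
    total += xi

print(total)          # same result as the built-in sum(x)
```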

3
Q

Summation Property 1

A

∑_{i=1}^{n} c = nc

4
Q

Summation Property 2

A

∑_{i=1}^{n} c·x_i = c ∑_{i=1}^{n} x_i

5
Q

Summation Property 3

A

∑_{i=1}^{n} (a·x_i + b·y_i) = a ∑_{i=1}^{n} x_i + b ∑_{i=1}^{n} y_i
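A quick numeric check of Properties 1 to 3, using made-up data and constants:

```python
n = 5
c, a, b = 3.0, 2.0, -1.0                  # arbitrary constants
x = [1.0, 4.0, 2.0, 8.0, 5.0]             # made-up x_i
y = [3.0, 0.0, 7.0, 1.0, 6.0]             # made-up y_i

# Property 1: summing a constant n times gives n*c
assert sum(c for _ in range(n)) == n * c

# Property 2: a constant factors out of the sum
assert sum(c * xi for xi in x) == c * sum(x)

# Property 3: summation is linear
lhs = sum(a * xi + b * yi for xi, yi in zip(x, y))
rhs = a * sum(x) + b * sum(y)
assert lhs == rhs
```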

6
Q

Summation Property 4

A

If x̄ = (∑_{i=1}^{n} x_i) / n, then ∑_{i=1}^{n} (x_i − x̄) = 0
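The demeaning property is easy to verify on a made-up sample:

```python
# Deviations from the sample mean always sum to zero (made-up sample).
x = [2.0, 6.0, 3.0, 9.0]
x_bar = sum(x) / len(x)                      # x̄ = (Σ x_i) / n
deviations = [xi - x_bar for xi in x]
print(sum(deviations))                       # 0, up to floating-point error
```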

7
Q

Summation Property 5

A

∑_{i=1}^{n} (x_i − x̄)² = ∑_{i=1}^{n} x_i² − n·x̄² = ∑_{i=1}^{n} x_i(x_i − x̄)
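A numeric check of all three forms of Property 5, with made-up values:

```python
# Three equivalent ways to write the sum of squared deviations.
x = [1.0, 3.0, 5.0, 7.0]
n = len(x)
x_bar = sum(x) / n                                      # 4.0 here

lhs = sum((xi - x_bar) ** 2 for xi in x)                # Σ(x_i − x̄)²
mid = sum(xi ** 2 for xi in x) - n * x_bar ** 2         # Σx_i² − n·x̄²
rhs = sum(xi * (xi - x_bar) for xi in x)                # Σx_i(x_i − x̄)
print(lhs, mid, rhs)
```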

8
Q

Summation Property 6

A

∑_{i=1}^{n} (x_i − x̄)(y_i − ȳ) = ∑_{i=1}^{n} x_i(y_i − ȳ) = ∑_{i=1}^{n} (x_i − x̄)y_i
∑_{i=1}^{n} (x_i − x̄)(y_i − ȳ) = ∑_{i=1}^{n} x_i·y_i − n·x̄·ȳ
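The same kind of numeric check works for Property 6, again with made-up data:

```python
# Equivalent forms of the sum of cross-products of deviations.
x = [1.0, 2.0, 4.0, 5.0]
y = [2.0, 1.0, 3.0, 6.0]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n        # both 3.0 here

full       = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
one_side   = sum(xi * (yi - y_bar) for xi, yi in zip(x, y))
other_side = sum((xi - x_bar) * yi for xi, yi in zip(x, y))
shortcut   = sum(xi * yi for xi, yi in zip(x, y)) - n * x_bar * y_bar
print(full, one_side, other_side, shortcut)
```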

9
Q

Explain Expected Values

A

For any discrete random variable X taking on a finite number of values {x_i : i = 1, 2, …, n}, the expected value of X is the following weighted average:

E(X) = x_1·f(x_1) + x_2·f(x_2) + … + x_n·f(x_n) ≡ ∑_{i=1}^{n} x_i·f(x_i)

where f(x_i) is the probability density function of X (i.e. the frequency of each value).
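The weighted average is a one-line computation; the distribution below is made up:

```python
# Expected value of a made-up discrete distribution:
# E(X) = Σ x_i · f(x_i), where the f(x_i) must sum to 1.
values = [0.0, 1.0, 2.0]
probs  = [0.25, 0.5, 0.25]

e_x = sum(x * f for x, f in zip(values, probs))
print(e_x)    # 1.0
```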

10
Q

EV Property 1

A

E(c) = c

11
Q

EV Property 2

A

For any constants a and b:

E(aX + b) = aE(X) + E(b) = aE(X) + b

This implies that:

E(aX) = aE(X)

12
Q

EV Property 3

A

If {a_i : i = 1, 2, …, n} are constants and {X_i : i = 1, 2, …, n} are random
variables, then:

E(a_1X_1 + a_2X_2 + … + a_nX_n) = a_1E(X_1) + a_2E(X_2) + … + a_nE(X_n)

which, using summations, can be re-written as:

E(∑_{i=1}^{n} a_iX_i) = ∑_{i=1}^{n} a_iE(X_i)

13
Q

Explain Covariance

A

Given two random variables X and Y, let µ_X = E(X) and µ_Y = E(Y);
the covariance is computed as:

Cov(X, Y) = E[(X − µ_X)(Y − µ_Y)] = E(XY) − µ_Xµ_Y

If E(X) = 0 or E(Y) = 0, then:

Cov(X, Y) = E(XY)
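Both forms of the covariance can be checked numerically on a made-up joint distribution of two binary variables:

```python
# Cov(X, Y) = E[(X − µX)(Y − µY)] = E(XY) − µX·µY,
# checked on a hypothetical joint distribution.
outcomes = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
p        = [0.375, 0.125, 0.125, 0.375]   # made-up joint probabilities

mu_x = sum(pi * x for (x, y), pi in zip(outcomes, p))
mu_y = sum(pi * y for (x, y), pi in zip(outcomes, p))

cov_def   = sum(pi * (x - mu_x) * (y - mu_y) for (x, y), pi in zip(outcomes, p))
cov_short = sum(pi * x * y for (x, y), pi in zip(outcomes, p)) - mu_x * mu_y
print(cov_def, cov_short)
```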

14
Q

EV Property 4

A

Given two random variables X and Y, then:

E(XY) = E(X)E(Y)

when X and Y are independent, i.e., the outcome of observing X does
not change the probabilities of the possible outcomes of Y, and
vice versa.

15
Q

COV Property 1

A

If X and Y are independent, then:

Cov(X, Y) = 0

16
Q

COV Property 2

A

For any constants a_1, b_1, a_2 and b_2:

Cov(a_1X + b_1, a_2Y + b_2) = a_1a_2·Cov(X, Y)
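This scaling property can also be verified numerically; the joint distribution and constants below are made up:

```python
# Check Cov(a1·X + b1, a2·Y + b2) = a1·a2·Cov(X, Y)
# on a hypothetical joint distribution of two binary variables.
outcomes = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
p        = [0.375, 0.125, 0.125, 0.375]      # made-up joint probabilities
a1, b1, a2, b2 = 2.0, 5.0, -3.0, 1.0         # arbitrary constants

def cov(pairs, probs):
    """Covariance of a finite joint distribution via E[(X − µX)(Y − µY)]."""
    mx = sum(pi * x for (x, y), pi in zip(pairs, probs))
    my = sum(pi * y for (x, y), pi in zip(pairs, probs))
    return sum(pi * (x - mx) * (y - my) for (x, y), pi in zip(pairs, probs))

base        = cov(outcomes, p)               # Cov(X, Y)
transformed = cov([(a1 * x + b1, a2 * y + b2) for x, y in outcomes], p)
print(transformed, a1 * a2 * base)
```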