Section 3 Flashcards

1
Q

Key point about normally distributed variables?

A

A linear transform of a normally distributed variable is also normally distributed
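
Stated as a formula (a standard result, added here only for reference), for a fixed matrix A and fixed vector c:

x ~ N(μ, Σ)  =>  Ax + c ~ N(Aμ + c, AΣA’)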

2
Q

Show that the OLS estimator is normally distributed?

A

See page 1 of my notes
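
Since the notes are not included here, a standard sketch of the argument (assuming non-random X and ε ~ N(0, σ^2 I)):

b = (X’X)^-1 X’y = β + (X’X)^-1 X’ε

(X’X)^-1 X’ is a fixed linear transform of the normally distributed ε, so by the result in card 1:

b ~ N(β, σ^2 (X’X)^-1)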

3
Q

What does the normality of the error term directly imply?

A

Normality of the OLS estimator

4
Q

What are the 4 steps of obtaining an estimator of the error variance?

A

1) Define the RSS, e’e, as: e’e = ε’M’Mε = ε’Mε
2) Take the trace: tr(ε’Mε) = ε’Mε (it’s a scalar, therefore the trace of it equals the scalar itself)
3) Take the expectation of e’e, using the cyclic property tr(ε’Mε) = tr(Mεε’), to get E[e’e] = σ^2 tr(M)
4) Find tr(M): tr(M) = n - k, which gives the unbiased estimator s^2 = e’e/(n-k)

Do the proof (sketched below); it is in my notes, page 1.
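
A sketch of the proof referred to above (standard derivation, assuming non-random X and E[εε’] = σ^2 I):

E[e’e] = E[ε’Mε] = E[tr(ε’Mε)] = E[tr(Mεε’)] = tr(M E[εε’]) = σ^2 tr(M)

tr(M) = tr(I_n) - tr(X(X’X)^-1X’) = n - tr((X’X)^-1X’X) = n - tr(I_k) = n - k

so E[e’e] = σ^2 (n-k) and s^2 = e’e/(n-k) satisfies E[s^2] = σ^2.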

5
Q

Steps to find the distribution of the error variance? And proof?

A

1) Rewrite the RSS as a quadratic form in the errors: RSS = e’e = ε’Mε
2) Normalise the error distribution to get a standard normal, ε/σ ~ N(0,I), then apply DR 3.2 (sketched below)
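
A sketch of how the two steps combine (standard argument, with M = I - X(X’X)^-1X’ as usual):

RSS/σ^2 = (ε/σ)’M(ε/σ), where ε/σ ~ N(0,I)

M is symmetric and idempotent with rank(M) = tr(M) = n-k, so by DR 3.2:

RSS/σ^2 = (n-k)s^2/σ^2 ~ χ^2(n-k)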

6
Q

Show M is symmetric and idempotent?

A

See the proofs at the bottom of page 3 of the booklet
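
Since the booklet is not included here, a short sketch (standard, with M = I - X(X’X)^-1X’):

M’ = I’ - (X(X’X)^-1X’)’ = I - X(X’X)^-1X’ = M (symmetric)

MM = I - 2X(X’X)^-1X’ + X(X’X)^-1X’X(X’X)^-1X’ = I - X(X’X)^-1X’ = M (idempotent)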

7
Q

Find an estimator for the error variance AND show it is unbiased?

A

See page 1 sides 1 and 2 of notes
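
Since the notes are not included here, the result being derived (standard, following from card 4):

s^2 = e’e/(n-k), with E[s^2] = σ^2 because E[e’e] = σ^2(n-k)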

8
Q

Steps to find the distribution of the error variance? and proof?

A

See page 1 side 2 of notes

9
Q

What is distributional result 3.2?

A

For a standard normal vector x ~ N(0,I) and a symmetric, idempotent matrix A with rank r:

x’Ax ~ χ^2(r)

10
Q

For symmetric and idempotent matrices…

A

The rank is equal to the trace
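
The reason, stated briefly (standard linear algebra, added for reference): a symmetric idempotent matrix is diagonalisable with eigenvalues equal to 0 or 1 only, so

rank(A) = number of non-zero eigenvalues = sum of the eigenvalues = tr(A)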

11
Q

When does zero covariance imply independence?

A

When the variables are ALSO jointly normally distributed

12
Q

What is distributional result 3.3?

A

If x is a standard normal variable, y is χ^2 with r degrees of freedom, and x and y are independent of one another, then the variable t below has a t-distribution with r degrees of freedom:

t = x/(y/r)^0.5 ~ t(r)

13
Q

Show that t(j)=(b(j)-β(j))/s.e.(b(j)) ~t(n-k)?

A

See 3.2.1 in my notes
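
Since the notes are not included here, a sketch of the standard argument (a(jj) denotes the (j,j) element of (X’X)^-1):

(b(j) - β(j)) / (σ^2 a(jj))^0.5 ~ N(0,1)   and   (n-k)s^2/σ^2 ~ χ^2(n-k)

b and s^2 are independent (zero covariance plus normality, card 11), so by DR 3.3:

t(j) = (b(j) - β(j)) / (s^2 a(jj))^0.5 = (b(j) - β(j))/s.e.(b(j)) ~ t(n-k)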

14
Q

Explain how we can test for single linear combinations of regression parameters?

A

Using DR 3.1, we can show that Rb - r has the following normal distribution:
Rb - r ~ N(Rβ - r, σ^2 R(X’X)^-1 R’)
This therefore allows us to test whether Rβ - r = 0, i.e. whether Rβ = r (test statistic sketched below).

Here R is a 1xk row vector, r is a scalar, and β is a kx1 vector as usual.
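
The resulting test statistic (standard form, with s^2 = e’e/(n-k)):

t = (Rb - r) / (s^2 R(X’X)^-1 R’)^0.5 ~ t(n-k) under H0: Rβ = r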

15
Q

Key point to remember about normal and t distributions?

A

Samples drawn from a normally distributed population are tested using the t-distribution, because the unknown error variance σ^2 has to be replaced by its sample estimate s^2.

16
Q

Explain how we can test for multiple parameter restrictions?

A

Using, again: Rb - r ~ N(Rβ - r, σ^2 R(X’X)^-1 R’)

By making R a pxk matrix (p = number of linear combinations), Rb - r is now a px1 column vector, and the null hypothesis Rβ - r = 0 is a vector of zeros.

Then, using the F statistic (see equations; a sketch is given below), we can test multiple restrictions (try to prove the F statistic too!).

The statistic is distributed F(p, n-k) (degrees of freedom p and n-k).
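
A sketch of the F statistic referred to above (standard form): the numerator is a χ^2(p)/p, the denominator an independent χ^2(n-k)/(n-k), the σ^2 terms cancel, and under H0: Rβ = r

F = (Rb - r)’ [R(X’X)^-1 R’]^-1 (Rb - r) / (p s^2) ~ F(p, n-k)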

17
Q

What is p?

A

The number of linear combinations (restrictions) being tested; e.g. if β2=β3=0 then p=2 (the number of = signs)

18
Q

See

A

end of section 3 for examples

19
Q

Learn DR 3.4 and 3.5!

A

and proof of F!!!