L9 - The Regression Model in Matrix Form Flashcards
What does the value of β give you in multivariate regression?
- Each element of β is the partial derivative of y with respect to its regressor, holding the other regressors constant (the vertical bar in the partial-derivative notation means "holding that variable constant")
When is multivariate regression equivalent to bivariate regression?
- When the bivariate regressions are run on purged (residualized) data
- So instead of running one multivariate regression with 3 parameters, you can run a bivariate model 3 separate times on purged variables and get the same answers
- This is proved by the Frisch-Waugh-Lovell theorem
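The Frisch-Waugh-Lovell equivalence can be checked numerically. The sketch below (data and coefficient values are made up for illustration) regresses y on a constant, x1, and x2, then recovers the same coefficient on x1 from a bivariate regression of purged y on purged x1:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# Full multivariate OLS: beta_hat = (X'X)^{-1} X'y
X = np.column_stack([np.ones(n), x1, x2])
beta_full = np.linalg.solve(X.T @ X, X.T @ y)

# Purge y and x1 of the other regressors (constant and x2): keep residuals
Z = np.column_stack([np.ones(n), x2])
M = np.eye(n) - Z @ np.linalg.solve(Z.T @ Z, Z.T)  # residual-maker matrix
y_purged = M @ y
x1_purged = M @ x1

# Bivariate regression of purged y on purged x1
beta_fwl = (x1_purged @ y_purged) / (x1_purged @ x1_purged)

print(beta_full[1], beta_fwl)  # the two estimates coincide
```

The agreement is exact (up to floating-point error), not just approximate: it is an algebraic identity.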
What is the least-squares model of regression in matrix form?
y = Xβ + u
where y is the N×1 vector of observations on the dependent variable, X is the N×k matrix of regressors, β is the k×1 vector of coefficients, and u is the N×1 vector of errors (underbars in the lecture notation mark vectors and matrices)
The Gauss-Markov assumptions are:
- E(u) = 0
- E(uu^T) = σ_u^2 I_N
- X is fixed in repeated samples
The OLS estimator is:
β(hat) = (X^T X)^(-1) X^T y
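The estimator formula can be applied directly with NumPy. A minimal sketch on simulated data (all names and parameter values are illustrative), cross-checked against NumPy's built-in least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(42)
N, k = 100, 3
X = np.column_stack([np.ones(N), rng.normal(size=(N, k - 1))])
beta_true = np.array([1.0, 2.0, -3.0])
y = X @ beta_true + rng.normal(size=N)

# beta_hat = (X'X)^{-1} X'y, computed via a linear solve for stability
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against numpy's least-squares routine
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat, beta_lstsq)
```

Using `np.linalg.solve` rather than explicitly inverting X^T X is the standard numerical practice, but both implement the same formula.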
What is the proof that the OLS is unbiased under the Gauss-Markov assumptions?
- Substitute y = Xβ + u into the estimator: β(hat) = (X^T X)^(-1) X^T (Xβ + u) = β + (X^T X)^(-1) X^T u
- Take the expectation of both sides: E(β(hat)) = β + E[(X^T X)^(-1) X^T u]
- The reason β is not inside the expectation is that it is a constant, and the expected value of a constant is itself
- From the GM assumptions we know the X values are fixed, so we can take them out of the expectation: E(β(hat)) = β + (X^T X)^(-1) X^T E(u). Since E(u) = 0, we get E(β(hat)) = β, proving the OLS is unbiased
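Unbiasedness can also be seen by simulation. The sketch below (an illustrative setup, not part of the lecture) holds X fixed across repeated samples, redraws u with E(u) = 0 many times, and checks that the average of β(hat) across draws is close to the true β:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 50
X = np.column_stack([np.ones(N), rng.normal(size=N)])  # fixed design
beta_true = np.array([0.5, 1.5])
XtX_inv_Xt = np.linalg.solve(X.T @ X, X.T)  # (X'X)^{-1} X'

draws = np.empty((5000, 2))
for r in range(5000):
    u = rng.normal(size=N)        # E(u) = 0
    y = X @ beta_true + u
    draws[r] = XtX_inv_Xt @ y     # beta_hat = beta + (X'X)^{-1} X'u

print(draws.mean(axis=0))  # close to beta_true
```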
How do you calculate the variance of the OLS estimator in matrix form?
- Var(β(hat)) = E[(β(hat) − E(β(hat)))(β(hat) − E(β(hat)))^T], i.e. the expectation of the outer product of β(hat) − E(β(hat)) with its transpose
- Under the GM assumptions this simplifies to Var(β(hat)) = σ_u^2 (X^T X)^(-1)
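The standard derivation behind that formula, written out step by step using the substitutions from the unbiasedness proof:

```latex
\begin{aligned}
\hat{\beta} - \beta
  &= (X^{T}X)^{-1}X^{T}u \\
\operatorname{Var}(\hat{\beta})
  &= E\!\left[(\hat{\beta}-\beta)(\hat{\beta}-\beta)^{T}\right] \\
  &= (X^{T}X)^{-1}X^{T}\,E(uu^{T})\,X(X^{T}X)^{-1} \\
  &= (X^{T}X)^{-1}X^{T}\,\sigma_{u}^{2}I_{N}\,X(X^{T}X)^{-1} \\
  &= \sigma_{u}^{2}(X^{T}X)^{-1}
\end{aligned}
```

The third line uses the fixed-X assumption to pull X outside the expectation, and the fourth uses the GM assumption E(uu^T) = σ_u^2 I_N.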
What is a variance-covariance matrix?
- Variance is a measure of the variability or spread in a set of data. Mathematically, it is the average squared deviation from the mean score.
- Covariance is a measure of the extent to which corresponding elements from two sets of ordered data move in the same direction.
- Variance and covariance are often displayed together in a variance-covariance matrix (also called a covariance matrix): the variances appear along the diagonal and the covariances appear in the off-diagonal elements
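A small numerical sketch of this structure (the data are made up): NumPy's `np.cov` builds the variance-covariance matrix, with sample variances on the diagonal and symmetric covariances off it.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([2.0, 1.0, 4.0, 3.0])

V = np.cov(a, b)  # 2x2 variance-covariance matrix

print(V[0, 0], np.var(a, ddof=1))  # diagonal entry = sample variance of a
print(V[0, 1], V[1, 0])            # off-diagonal entries are symmetric
```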
What is the distribution of the OLS estimator in matrix form?
- Adding the normality assumption u ~ N(0, σ_u^2 I_N) to the GM assumptions, β(hat) ~ N(β, σ_u^2 (X^T X)^(-1))
How do you find the sample variance-covariance matrix?
- Just like in the bivariate model, as we don't know the actual value of the variance we can only use the estimator of the variance
- σ_u(hat)^2 is the residual sum of squares divided by its degrees of freedom: σ_u(hat)^2 = u(hat)^T u(hat) / (N − k)
- k is the number of parameters we estimate
- The estimated variance-covariance matrix is then σ_u(hat)^2 (X^T X)^(-1)
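Putting those pieces together in code (an illustrative sketch on simulated data): estimate σ_u^2 from the residuals, form the estimated variance-covariance matrix, and take the square roots of its diagonal as standard errors.

```python
import numpy as np

rng = np.random.default_rng(1)
N, k = 120, 3
X = np.column_stack([np.ones(N), rng.normal(size=(N, k - 1))])
y = X @ np.array([1.0, 0.5, -1.0]) + rng.normal(size=N)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat
sigma2_hat = (resid @ resid) / (N - k)       # RSS / degrees of freedom
V_hat = sigma2_hat * np.linalg.inv(X.T @ X)  # estimated v-cov matrix
se = np.sqrt(np.diag(V_hat))                 # standard errors
print(se)
```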
How would you set up a hypothesis test of a multivariate regression model?
- For a single coefficient, compute t = (β_j(hat) − β_j^0) / se(β_j(hat)), where se(β_j(hat)) is the square root of the j-th diagonal element of the estimated variance-covariance matrix, and compare it with the t distribution with N − k degrees of freedom
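A hedged sketch of such a test on simulated data (names and numbers are illustrative; 1.96 is the large-sample normal approximation to the t critical value at the 5% level):

```python
import numpy as np

rng = np.random.default_rng(3)
N, k = 200, 3
X = np.column_stack([np.ones(N), rng.normal(size=(N, k - 1))])
y = X @ np.array([1.0, 2.0, 0.0]) + rng.normal(size=N)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat
sigma2_hat = (resid @ resid) / (N - k)
se = np.sqrt(np.diag(sigma2_hat * np.linalg.inv(X.T @ X)))

t_stat = (beta_hat[1] - 0.0) / se[1]  # test H0: beta_1 = 0
print(t_stat, abs(t_stat) > 1.96)     # reject H0 at ~5% if True
```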
What does the Gauss-Markov Theorem tell you about the variance-covariance matrix of any linear unbiased estimator?
The Gauss-Markov Theorem shows that, under the GM assumptions, the variance-covariance matrix of any other linear unbiased estimator exceeds the OLS variance-covariance matrix by a positive semi-definite matrix.
Hence OLS is BLUE (the Best Linear Unbiased Estimator).