lecture 8 Flashcards
how are matrices used to express the multivariate OLS estimator
let X be a matrix whose columns contain the data for
the explanatory variables and let y be a vector containing the data
for the dependent variable. The multivariate OLS estimator can be
calculated as:
Bhat = (X^t * X)^(-1) * X^t * y
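A minimal NumPy sketch of computing this estimator, Bhat = (X^t X)^(-1) X^t y, on made-up data (the numbers are purely illustrative):

```python
import numpy as np

# Made-up data: N = 5 observations, k = 3 parameters
# (a column of ones for the intercept plus 2 regressors)
X = np.array([
    [1.0, 2.0, 1.0],
    [1.0, 3.0, 0.0],
    [1.0, 5.0, 2.0],
    [1.0, 7.0, 1.0],
    [1.0, 8.0, 3.0],
])
y = np.array([3.0, 4.0, 8.0, 10.0, 13.0])

# Multivariate OLS estimator: Bhat = (X^t X)^(-1) X^t y
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y
print(beta_hat)  # one coefficient per column of X
```

In practice `np.linalg.lstsq` is preferred over explicitly inverting X^t X, but the line above mirrors the formula on the card.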
what would happen if 2 explanatory variables are perfectly correlated
the least-squares method will not allow us
to estimate separate coefficients for the two right-hand-side
variables.
More generally we might have less than perfect collinearity:
if this is the case then it is possible to estimate separate effects
using least squares, but the estimates may be very inaccurate
(they will have large variances).
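A small simulation (made-up data; the names `x2_near` and `x2_far` are just for illustration) can show this inaccuracy: the variance of each coefficient is proportional to the corresponding diagonal entry of (X^t X)^(-1), and those entries blow up when the regressors are nearly collinear.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)

# Nearly collinear regressor: x2 is x1 plus a little noise
x2_near = x1 + 0.01 * rng.normal(size=n)
# Uncorrelated regressor for comparison
x2_far = rng.normal(size=n)

def coef_variances(x2):
    X = np.column_stack([np.ones(n), x1, x2])
    # Var(Bhat) is proportional to the diagonal of (X^t X)^(-1)
    return np.diag(np.linalg.inv(X.T @ X))

# The near-collinear design gives far larger variances on the slopes
print(coef_variances(x2_near)[1:])
print(coef_variances(x2_far)[1:])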
if an equation is loglinear what does the co-efficients measure
the elasticities
what does Rsquare measure
measure the the fraction of the variance of the endogenous variable –> explained by the model
in order to caculate the OLS estimator using the matrix form (multi varieate) , what are the 2 most important assumption
- k
what is perfect collinearity
Perfect multicollinearity occurs when two or more independent variables in a regression model exhibit a deterministic (perfectly predictable or containing no randomness) linear relationship
whats the difference between a bivariate and a multivariate regression
bivariate has 2 unknown, whilst multivariate has more than 3 unknowns
in terms of a matric regression what is the slope parameter
Bhat
in a multivariate regression what is the form of a 3 variable matrix regression line
Bhat = 𝛽+(𝑋^t * 𝑋)^(−1) * X^t*Y
how are able to have a matrix form of the multivariate regression (2)
only possible because (𝑋^t * 𝑋)^(−1) is invertiable
- number of columns is less than the number of rows
- columns of X must be linearly independant
what are the 2 most important assumptions of OLS in terms of a multivariate regression
- k less than N, where N is the number of observations
2. The X variables are linearly independent.