L6 Non-Linear Regression Models Flashcards
The assumptions of the general non-linear regression model are the same as those of the linear one, except that it requires linearity in the parameters rather than a linear relationship between X and Y
True
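A minimal illustration (a logarithmic specification assumed only for the example): Y = b0 + b1*ln(X) + u is non-linear in X but linear in the parameters b0 and b1, so it satisfies this requirement and can still be estimated by OLS.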
Hypothesis testing works the same way in polynomial regression models as in linear ones
True
Measures and coefficients can always be taken at face value in polynomial regression models
False, in polynomial models the effect of a change in X on Y depends on the level of X, so the individual coefficients cannot be interpreted at face value
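A quick illustration of why, using an assumed quadratic model: in Y = b0 + b1*X + b2*X² + u, the effect of a one-unit change in X is dY/dX = b1 + 2*b2*X, which depends on the level of X, so b1 on its own does not measure the effect of X on Y.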
In the log-log model a percentage change in X has an associated fixed b1 change in Y
False, a 1% change in X has an associated b1% change in Y
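Sketch of the standard log-log algebra behind this: from ln(Y) = b0 + b1*ln(X) + u, differentiating gives dY/Y = b1*(dX/X), so b1 is the elasticity of Y with respect to X, and a 1% change in X is associated with roughly a b1% change in Y.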
How do you compare the two cases defined by a binary dummy variable
You evaluate the regression function with the dummy variable set to 1, evaluate it again with the dummy set to 0, and subtract the two to get the difference.
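A worked sketch with a generic specification assumed only for illustration: for Y = b0 + b1*X + b2*D + u with binary D, the difference is (b0 + b1*X + b2) - (b0 + b1*X) = b2, so the two cases differ by the dummy's coefficient (plus any interaction terms, if the model includes them).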
Will a continuous variable that interacts with a dummy variable affect the slope
Yes, when the dummy variable is switched off (set to zero) the interaction term drops out of the slope, so if the interaction coefficient is positive the slope becomes less steep.
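A sketch with an assumed interaction specification: in Y = b0 + b1*X + b2*D + b3*(X*D) + u, the slope with respect to X is b1 + b3 when D = 1 and just b1 when D = 0, so deactivating the dummy removes b3 from the slope.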
What is the likely reason for failing to reject the individual t-tests for each variable in a multiple regression model when the F-test does reject.
High multicollinearity.
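A minimal simulation sketch of this pattern using statsmodels; the variable names and data-generating process here are made up purely for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100

# Two regressors that are nearly collinear with each other.
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)
y = 1.0 + x1 + x2 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2]))
res = sm.OLS(y, X).fit()

# The joint F-test typically rejects (tiny p-value) ...
print("F-test p-value:", res.f_pvalue)
# ... while the individual t-tests often fail to reject,
# because each coefficient has a very large standard error.
print("t-test p-values:", res.pvalues[1:])
```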
How can some non-linear regression models be expressed as linear regression models
By using functions of the independent variables, such as ln(x) and x1*x2, as regressors; however, this often introduces multicollinearity, so F-tests on joint hypotheses are needed.
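A minimal sketch of fitting such a linearised specification with statsmodels; the data and the choice of transformed regressors are assumptions made only for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200

x1 = rng.uniform(1.0, 10.0, size=n)   # kept strictly positive so ln(x1) is defined
x2 = rng.normal(size=n)
y = 2.0 + 1.5 * np.log(x1) + 0.5 * x1 * x2 + rng.normal(size=n)

# The model is non-linear in x1 and x2, but linear in the parameters
# once ln(x1) and x1*x2 are built as ordinary regressors.
X = sm.add_constant(np.column_stack([np.log(x1), x1 * x2]))
res = sm.OLS(y, X).fit()
print(res.params)  # intercept and the two slope coefficients
```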
Polynomial regression models are linear in the coefficients
True
When doing one-tailed hypothesis tests on a linear regression model can we use the t-test output given to us by statistical software
No, because most statistical software only performs two-tailed hypothesis tests; the reported p-value must be adjusted (typically halved, when the estimate has the hypothesised sign) to get the one-tailed result.
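A sketch of turning a reported t-statistic into a one-tailed p-value with scipy; the t-statistic and degrees of freedom below are made-up numbers used only for illustration:

```python
from scipy import stats

t_stat = 1.80   # t-statistic reported by the software (illustrative value)
df = 97         # residual degrees of freedom (illustrative value)

# Two-tailed p-value, as most software would report it.
p_two_sided = 2 * stats.t.sf(abs(t_stat), df)

# One-tailed p-value for H1: coefficient > 0,
# valid when the estimate has the hypothesised sign.
p_one_sided = stats.t.sf(t_stat, df)

print(p_two_sided, p_one_sided)  # the one-sided value is half the two-sided one
```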
How do you check if the F-statistic is large enough
You look at its p-value, P(F): the probability of observing an F-statistic at least this large if all slope coefficients were actually zero; a small p-value means the joint null is rejected.
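A sketch of the p-value calculation with scipy, using made-up numbers for the F-statistic and its degrees of freedom:

```python
from scipy import stats

f_stat = 4.7   # overall F-statistic (illustrative value)
dfn = 3        # numerator degrees of freedom: number of slopes tested
dfd = 96       # denominator degrees of freedom: n - k - 1

# Probability of an F-statistic at least this large if all tested
# slope coefficients were zero; a small value leads to rejection.
p_value = stats.f.sf(f_stat, dfn, dfd)
print(p_value)
```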
What does a wave-like pattern in a residual plot of time series data allude to
Serially correlated observations (autocorrelation), which make the OLS estimator's standard errors inappropriate, although the estimator itself remains unbiased.
How do you calculate R² of a model whose response variable is in logarithms.
You exponentiate the fitted logarithmic function, ln(y) = f(x) -> y = exp(f(x)). Then you use that function to calculate the estimated values at each data point. Lastly, you correlate those estimated values with your actual sample data, and you get R² by squaring that correlation.
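A minimal numpy/statsmodels sketch of this recipe; the data-generating process is made up for illustration, and the simple back-transform exp(f(x)) ignores any retransformation bias correction:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 150

x = rng.uniform(1.0, 5.0, size=n)
y = np.exp(0.5 + 0.8 * x + rng.normal(scale=0.3, size=n))

# 1) Fit the model on the log of the response: ln(y) = f(x).
X = sm.add_constant(x)
res = sm.OLS(np.log(y), X).fit()

# 2) Back-transform the fitted values: y_hat = exp(f(x)).
y_hat = np.exp(res.fittedvalues)

# 3) Correlate the fitted values with the actual y and square the correlation.
corr = np.corrcoef(y_hat, y)[0, 1]
r_squared = corr ** 2
print(r_squared)
```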