Linear and Logistic Regression Flashcards
For OLS linear regression on p random variables, what are the input, outcome, action and hypothesis spaces?
Input: ℝ^p
Outcome: ℝ
Action: ℝ
Hypothesis: {x ↦ w·x + b : w ∈ ℝ^p, b ∈ ℝ}
What is the OLS loss function?
Squared-error (SE) loss: ℓ(ŷ, y) = (ŷ − y)²; the empirical risk is the mean squared error over the training set.
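For reference, a compact statement of the loss and the empirical risk it induces (notation assumed: n training pairs (x_i, y_i), parameters w and b):

```latex
% Squared-error loss and the OLS objective (empirical risk / mean squared error)
\ell(\hat{y}, y) = (\hat{y} - y)^2,
\qquad
\hat{R}(w, b) = \frac{1}{n} \sum_{i=1}^{n} \big( w \cdot x_i + b - y_i \big)^2 .
```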
Explain the purpose of the gradient descent algorithm
Iterative algorithm for minimising the (empirical) risk R: initialise the weights and biases, compute the gradient of R with respect to them, update the weights and biases in the direction of the negative gradient (scaled by the step size), and repeat until a stopping criterion is met.
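A minimal NumPy sketch of this procedure for OLS; the function name, arguments (X, y, lr, n_steps, tol) and default values are illustrative, not prescribed by the card:

```python
import numpy as np

def ols_gradient_descent(X, y, lr=0.01, n_steps=1000, tol=1e-8):
    """Minimise the mean squared error (1/n) * sum((X @ w + b - y)**2) by gradient descent."""
    n, p = X.shape
    w = np.zeros(p)
    b = 0.0
    for _ in range(n_steps):
        residual = X @ w + b - y              # shape (n,)
        grad_w = (2 / n) * X.T @ residual     # dR/dw
        grad_b = (2 / n) * residual.sum()     # dR/db
        # step in the direction of the *negative* gradient
        w -= lr * grad_w
        b -= lr * grad_b
        # stopping criterion: gradient norm small enough
        if np.sqrt(grad_w @ grad_w + grad_b ** 2) < tol:
            break
    return w, b
```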
Define convexity
A function is convex if the chord joining any two points on its graph lies on or above the graph between those points.
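In symbols, the standard definition for a real-valued function f on a convex domain:

```latex
f\big(\lambda x + (1 - \lambda) y\big) \;\le\; \lambda f(x) + (1 - \lambda) f(y)
\quad \text{for all } x, y \text{ in the domain and all } \lambda \in [0, 1].
```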
For a convex function, what does gradient descent always result in?
Convergence to the global minimum (provided the step size is small enough).
What are the two parameters we must specify before applying gradient descent and what are the potential consequences of setting these too large or too small?
Step size and stopping criteria.
If the step size is too large, the iterates may overshoot and diverge; if too small, convergence may be very slow. If the stopping criterion is too loose, we stop far from the minimum; if too strict, we waste computation on negligible improvements.
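A toy illustration, not taken from the cards above, of how the step size controls convergence versus divergence, using f(x) = x² rather than a regression problem:

```python
# For f(x) = x**2 the update is x <- x - lr * 2x = (1 - 2*lr) * x,
# so the iterates shrink towards 0 when |1 - 2*lr| < 1 and blow up otherwise
# (for this particular function, divergence sets in once lr exceeds 1).
def descend(lr, steps=20, x=1.0):
    for _ in range(steps):
        x -= lr * 2 * x
    return x

print(descend(lr=0.1))   # converges towards 0
print(descend(lr=1.1))   # diverges: magnitude grows every step
```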
When can/can’t we use gradient descent?
We can use it when the loss, viewed as a function of the parameters, is differentiable and the hypothesis space is parameterised by a finite-dimensional vector; we can't apply it (directly) when the objective is not differentiable in the parameters.
Explain mini-batch and stochastic gradient descent and their pros/cons.
Mini-batch: each gradient step is computed on a random subset (batch) of the training data. Stochastic: mini-batch with batch size 1. Pros: each update is much cheaper, so we can make many more updates per pass over a large dataset. Cons: the gradient estimates are noisy, so the path to the minimum is less stable and may need a smaller step size or more iterations.
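A minimal sketch of mini-batch SGD for the same OLS objective as the batch sketch above; setting batch_size = 1 recovers stochastic gradient descent. Names and defaults are illustrative:

```python
import numpy as np

def minibatch_sgd(X, y, lr=0.01, batch_size=32, n_epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    w, b = np.zeros(p), 0.0
    for _ in range(n_epochs):
        order = rng.permutation(n)                 # shuffle the data each epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]  # indices of this mini-batch
            residual = X[idx] @ w + b - y[idx]
            grad_w = (2 / len(idx)) * X[idx].T @ residual
            grad_b = 2 * residual.mean()
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b
```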
Explain why feature scaling is important in gradient descent.
If features are on very different scales, a single step size must serve weights of very different magnitudes: it will be too large for some directions and too small for others, making gradient descent slow or unstable. Scaling the features (e.g. standardising to zero mean and unit variance) avoids this.
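A minimal sketch of one common remedy, standardising each feature using training-set statistics so a single step size behaves similarly across all weights; names are illustrative:

```python
import numpy as np

def standardise(X_train, X_test):
    """Scale each column to zero mean and unit variance using training-set statistics."""
    mu = X_train.mean(axis=0)
    sigma = X_train.std(axis=0)
    sigma[sigma == 0] = 1.0   # guard against constant features
    return (X_train - mu) / sigma, (X_test - mu) / sigma
```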
For logistic regression on p random variables, what are the input, outcome, action and hypothesis spaces?
Input: ℝ^p
Outcome: {0, 1}
Action: (0, 1)
Hypothesis: {x ↦ σ(w·x + b) : w ∈ ℝ^p, b ∈ ℝ}
What is the commonly used loss function for logistic regression?
Log loss (binary cross-entropy): ℓ(p, y) = −[y log p + (1 − y) log(1 − p)].
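A minimal sketch of logistic regression trained by gradient descent on the average log loss; NumPy only, and all names and defaults are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(p, y, eps=1e-12):
    p = np.clip(p, eps, 1 - eps)              # avoid log(0)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def fit_logistic(X, y, lr=0.1, n_steps=1000):
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_steps):
        p = sigmoid(X @ w + b)
        # gradient of the average log loss with respect to w and b
        grad_w = X.T @ (p - y) / n
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```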
What is the likelihood function?
The probability (or density) of observing our data given the parameters, viewed as a function of the parameters.
Explain how the maximisation of the likelihood function is equivalent to minimising the log loss function
Since log is strictly increasing, maximising the likelihood is equivalent to maximising the log-likelihood. Taking the log turns the product of Bernoulli probabilities into a sum, and negating (and averaging) that sum gives exactly the average log loss, so maximising the likelihood is the same as minimising the log loss (see the sketch below).
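A sketch of the derivation, assuming n independent Bernoulli outcomes y_i with predicted probabilities p_i = σ(w·x_i + b):

```latex
% Likelihood of the observed labels and its logarithm
\mathcal{L}(w, b) = \prod_{i=1}^{n} p_i^{y_i}\, (1 - p_i)^{1 - y_i},
\qquad
\log \mathcal{L}(w, b) = \sum_{i=1}^{n} \Big[\, y_i \log p_i + (1 - y_i) \log (1 - p_i) \,\Big].
% Maximising log L is the same as minimising its negative, which divided by n
% is exactly the average log loss.
```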