Bias and Variance Flashcards
What is variance?
This is a measure of how spread out the values of a data set are
What is the bias?
This is a measure of how far the predictions are, on average, from the target value
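The two definitions above can be sketched numerically. This is a minimal illustration with made-up numbers: variance measures the spread of a set of predictions around their own mean, while bias measures how far that mean sits from the true target.

```python
def variance(preds):
    # spread of the predictions around their own mean
    m = sum(preds) / len(preds)
    return sum((p - m) ** 2 for p in preds) / len(preds)

def bias(preds, target):
    # signed distance between the mean prediction and the target
    m = sum(preds) / len(preds)
    return m - target

# hypothetical predictions from models trained on different data sets
preds = [2.9, 3.1, 3.0, 2.8, 3.2]
target = 3.5  # assumed true value

print(variance(preds))      # small spread -> low variance
print(bias(preds, target))  # mean is 3.0, target is 3.5 -> bias of -0.5
```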
What happens with overfitting with regards to the bias and variance?
When we start overfitting we get a better representation of the function, lowering the bias, but we become more dependent on the particular data set, so the variance becomes higher.
What is the equation for bias-variance decomposition?
E_D[(y - f)^2] = E_D[(y - E_D[y])^2] + E_D[(E_D[y] - f)^2]
Which part of the decomposition is the variance?
Var = E_D[(y - E_D[y])^2] This is the difference between the model and the mean of the model
Which part of the decomposition is the bias^2?
Bias^2 = E_D[(E_D[y] - f)^2] This is the squared distance between the mean of the model and the actual value.
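The decomposition can be checked numerically. This is a sanity-check sketch under an assumed toy setup: the model's output y is the mean of a noisy sample drawn around the true value f, with deliberately biased noise so both terms are nonzero.

```python
import random

random.seed(0)
f = 2.0            # the true value being estimated
trials = 100_000   # number of simulated data sets D

ys = []
for _ in range(trials):
    # noise with mean 0.5 makes the estimator biased on purpose
    sample = [f + random.gauss(0.5, 1.0) for _ in range(5)]
    ys.append(sum(sample) / len(sample))  # the model's output y for this D

Ey = sum(ys) / trials                              # E_D[y]
mse = sum((y - f) ** 2 for y in ys) / trials       # E_D[(y - f)^2]
var = sum((y - Ey) ** 2 for y in ys) / trials      # E_D[(y - E_D[y])^2]
bias_sq = (Ey - f) ** 2                            # (E_D[y] - f)^2

print(mse, var + bias_sq)  # the two sides of the decomposition agree
```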
What happens during training (and overfitting) to the bias and variance?
We are shifting the error from the bias to the variance
What happens to the expected error of a model during training?
The expected error of the model does not change, but it is redistributed and shifts between bias and variance
How can the bias and variance be shown for this graph? [Picture 8]
> The distance between the average of all the possible models that can be learned and the original function is the bias
> If I take all possible models, the difference between each model and that average is the variance