Important Definitions Flashcards
Overfitting
Happens when the model matches the training data so closely that it fails to generalize to new data. It performs well on the training data but poorly on evaluation data (the validation and test sets).
Underfitting
When the model performs poorly even on the training data. It fails to capture the relationship between the input examples and the target values (basically, it fails to learn how the inputs map to the labels).
Training set
A subset of the dataset used to train the model (feeding it the data so it learns good values for all the weights). It should contain different examples from those in the other sets.
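A minimal sketch of how a dataset might be split so each set contains different examples (the data and the 80/10/10 split ratio are hypothetical, not from the flashcards):

```python
import random

# Hypothetical dataset of (input, label) examples.
examples = [(f"image_{i}", i % 2) for i in range(100)]

random.seed(0)
random.shuffle(examples)

# A common split: 80% training, 10% validation, 10% test.
# Each example appears in exactly one set, so evaluation
# data stays unseen during training.
train_set = examples[:80]
val_set = examples[80:90]
test_set = examples[90:]

print(len(train_set), len(val_set), len(test_set))  # 80 10 10
```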
Weight
A numerical value associated with the connections between neurons/nodes. Training = determining the ideal weights. Inference = using those weights to make predictions.
Neuron
A unit inside a hidden layer in a neural network. Neurons relay info to each other.
Feature
An input variable (which can take different values) to a model. This model has 224 × 224 × 3 (RGB) features.
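The feature count above is just arithmetic over the image shape, since each RGB pixel contributes three input values:

```python
# Each pixel of a 224x224 RGB image contributes three
# features (red, green, and blue channel values).
height, width, channels = 224, 224, 3
num_features = height * width * channels
print(num_features)  # 150528
```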
Label
The result/answer part of an example.
Example
Input (image) + label (class for the image)
Accuracy
The number of correct classification predictions divided by the total number of predictions.
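The definition above can be computed directly (the prediction and label values here are made up for illustration):

```python
# Accuracy = correct classification predictions / total predictions.
predictions = [1, 0, 1, 1, 0, 1]
labels      = [1, 0, 0, 1, 0, 0]

correct = sum(p == y for p, y in zip(predictions, labels))
accuracy = correct / len(labels)
print(accuracy)  # 4 correct out of 6 predictions
```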
Loss
During the training of a supervised model, a measure of how far a model’s prediction is from its label.
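One common way to measure that distance is mean squared error; the flashcard doesn't name a specific loss function, so this is just one illustrative choice, with hypothetical values:

```python
# Mean squared error: average squared distance between
# each prediction and its label.
predictions = [2.5, 0.0, 2.0]
labels      = [3.0, -0.5, 2.0]

mse = sum((p - y) ** 2 for p, y in zip(predictions, labels)) / len(labels)
print(mse)  # small value = predictions close to labels
```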
val-loss (Validation Loss)
A metric representing a model’s loss on the validation set during a particular iteration of training.
val-accuracy (validation accuracy)
The accuracy on the validation set. This accuracy is more informative during training because it evaluates the model on data it was not trained on (after each epoch).
Test loss
Loss on the test dataset. When building a model, you typically try to minimize test loss. That’s because a low test loss is a stronger quality signal than a low training loss or low validation loss.
Training loss
A metric representing a model’s loss during a particular training iteration.