C3W2 Flashcards
Train-dev set
A held-out portion of the training set that comes from the same distribution as the training data (not the dev set); comparing train-dev error with dev error separates variance from data mismatch
What to do when overfitting to the dev set?
Get a bigger dev set
What is data mismatch?
When your training set and dev/test sets come from different distributions, so the model performs worse on the dev/test set than the train-dev error would suggest
Transfer learning with small dataset
Retrain just the last (output) layer, keeping the earlier layers' weights frozen
Transfer learning with large dataset
Retrain the hidden layers as well as the output layer (or retrain the whole network)
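A minimal sketch of both cases in PyTorch, assuming torchvision ≥ 0.13, a pretrained ResNet-18, and a 10-class target task (the model choice and class count are illustrative, not from the cards):

```python
import torch.nn as nn
from torchvision import models

# Small target dataset: freeze the pretrained layers, retrain only a new output layer
model = models.resnet18(weights="IMAGENET1K_V1")
for param in model.parameters():
    param.requires_grad = False                      # pretrained weights stay fixed
model.fc = nn.Linear(model.fc.in_features, 10)       # new head; only this layer trains

# Large target dataset: use the pretrained weights only as initialization
# and leave requires_grad=True everywhere so every layer is retrained
model_large = models.resnet18(weights="IMAGENET1K_V1")
model_large.fc = nn.Linear(model_large.fc.in_features, 10)
```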
What is pretraining / fine-tuning?
Pre-training: train the network on a large dataset from a source task (task A) to get initial weights
Fine-tuning: continue training those weights on the target task's data (task B)
When does transfer learning make sense?
When you have a lot of data for the problem you are transferring from and relatively little data for the problem you are transferring to
When tasks A and B have the same input
When low-level features from A may help B
When to use multi-task learning?
Training on a set of tasks that could benefit from sharing low-level features
You have a similar amount of data for each task (often each task on its own has little data)
You can train a big enough neural network to do well on all tasks
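A hedged sketch of a multi-task network: a shared trunk producing low-level features, with one sigmoid output per task and a per-task binary loss (layer sizes and the four example tasks are assumptions):

```python
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    def __init__(self, n_inputs=1024, n_tasks=4):
        super().__init__()
        # shared layers: every task benefits from these low-level features
        self.shared = nn.Sequential(
            nn.Linear(n_inputs, 256), nn.ReLU(),
            nn.Linear(256, 64), nn.ReLU(),
        )
        self.heads = nn.Linear(64, n_tasks)   # one logit per task

    def forward(self, x):
        return self.heads(self.shared(x))

# each example carries one binary label per task (e.g. pedestrian, car, sign, light)
loss_fn = nn.BCEWithLogitsLoss()
```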
What is end-to-end learning?
Replace a pipeline of hand-designed intermediate steps with a single network learned directly from input to output
It needs a large dataset to work well
Types of layers in a ConvNet?
Convolutional
Pooling
Fully connected
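An illustrative stack with all three layer types, assuming a 1x28x28 grayscale input and 10 output classes (kernel sizes and channel counts are made up for the sketch):

```python
import torch.nn as nn

convnet = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),   # convolutional layer: 28x28x1 -> 28x28x8
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=2, stride=2),        # pooling layer: 28x28x8 -> 14x14x8
    nn.Flatten(),
    nn.Linear(8 * 14 * 14, 10),                   # fully connected layer -> 10 classes
)
```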
What is a pooling layer and when to use it?
Slide a window over the input and take the max of each region, shrinking the spatial dimensions while keeping the strongest activations
It has 2 hyperparameters (filter size and stride), but nothing for gradient descent to learn
What is average pooling?
Take the average of each region instead of the max
Hyperparameters for pooling (max/average)?
F - filter size
S - stride (step)
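A plain-NumPy sketch of 2-D pooling driven by just these two hyperparameters; the helper name pool2d is made up for illustration, and the mode flag switches between max and average pooling:

```python
import numpy as np

def pool2d(x, f=2, s=2, mode="max"):
    """Pool a 2-D array with filter size f and stride s; nothing here is learned."""
    h = (x.shape[0] - f) // s + 1
    w = (x.shape[1] - f) // s + 1
    out = np.zeros((h, w))
    reduce = np.max if mode == "max" else np.mean
    for i in range(h):
        for j in range(w):
            out[i, j] = reduce(x[i * s:i * s + f, j * s:j * s + f])
    return out

a = np.arange(16, dtype=float).reshape(4, 4)
print(pool2d(a, f=2, s=2, mode="max"))       # [[ 5.  7.] [13. 15.]]
print(pool2d(a, f=2, s=2, mode="average"))   # averages of the same 2x2 regions
```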
What is a fully connected layer?
A layer in which every neuron is connected to every activation in the previous layer (a standard dense layer), as in a plain feed-forward network
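A fully connected layer is just a matrix multiply plus a bias over the flattened previous activations; a minimal NumPy sketch with arbitrary layer sizes:

```python
import numpy as np

a_prev = np.random.randn(120)        # flattened activations from the previous layer
W = np.random.randn(84, 120)         # every output unit connects to every input unit
b = np.random.randn(84)
a = np.maximum(0, W @ a_prev + b)    # dense layer followed by ReLU
```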