Lecture 2 Flashcards
Deep Learning problem
Requires a large training set
Expensive to collect and label
Few shot learning
N-way k-shot
N classes, k examples per class
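A minimal sketch of sampling one N-way k-shot episode (the `dataset` dict structure and helper name are assumptions for illustration):

```python
import random

def sample_episode(dataset, n_way, k_shot, n_query):
    """Sample one N-way k-shot episode: N classes, k support examples each,
    plus query examples for evaluation. `dataset` maps class -> examples."""
    classes = random.sample(sorted(dataset), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        picks = random.sample(dataset[cls], k_shot + n_query)
        support += [(x, label) for x in picks[:k_shot]]
        query += [(x, label) for x in picks[k_shot:]]
    return support, query

# Toy dataset: 5 classes with 10 examples each
data = {c: list(range(c * 10, (c + 1) * 10)) for c in range(5)}
sup, qry = sample_episode(data, n_way=3, k_shot=2, n_query=1)
```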
Few shot learning (train/val/test)
Split at the class level: train/val/test sets contain disjoint classes (a new task offers only a few shots)
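A sketch of a class-level split (class count and split sizes are illustrative): the three sets share no classes, so test tasks are genuinely new.

```python
import random

classes = list(range(100))        # all class labels
random.seed(0)
random.shuffle(classes)
# Split on the class level, not the example level:
train_cls = set(classes[:64])     # meta-train classes
val_cls = set(classes[64:80])     # meta-validation classes
test_cls = set(classes[80:])      # meta-test classes (new tasks)
```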
Few shot learning (Total)
Learn a learner that can learn from few shots, not an algorithm that merely classifies one task well
Pre-training, Fine-tuning
Copy weights from a previous task (expensively learned)
Fine-tune them with fewer shots
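A toy sketch of the idea on a 1-D linear model (all numbers illustrative): copy the expensively learned backbone weight, then adapt only the head on the few shots.

```python
class Model:
    def __init__(self, w_backbone, w_head):
        self.w_backbone = w_backbone  # feature extractor, learned on the big task
        self.w_head = w_head          # task-specific output layer

def fine_tune(pretrained, few_shots, lr=0.02, steps=5):
    # Copy weights from the previous task; only the head is updated.
    model = Model(pretrained.w_backbone, pretrained.w_head)
    for _ in range(steps):
        for x, y in few_shots:
            pred = model.w_backbone * x * model.w_head
            # gradient of squared error w.r.t. w_head
            grad = 2 * (pred - y) * model.w_backbone * x
            model.w_head -= lr * grad
    return model

base = Model(w_backbone=2.0, w_head=0.5)             # pretrained model
adapted = fine_tune(base, [(1.0, 4.0), (2.0, 8.0)])  # few shots of the new task
```

The backbone is reused unchanged; only the cheap head is relearned from the few examples.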
Deep meta-learning
Learn good hyperparameters -> learn a good learner
Hyperparameters of NN
Initialization parameters
Optimization Algorithm (SGD)
Loss function (MSE, cross-entropy)
Architecture (number and kind of layers)
Matching network
Compute embeddings of support and query using G(x) and F(x)
Compare embeddings (using cosine similarity)
(not sure why meta learning yet)
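A simplified nearest-neighbour sketch of the idea (real Matching Networks use a softmax-weighted vote over similarities; the embeddings here are hand-made stand-ins for F(x)/G(x)):

```python
import math

def cosine(a, b):
    # cosine similarity between two embedding vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def classify(query_emb, support):
    # support: list of (embedding, label) pairs from the support set
    sims = [(cosine(query_emb, emb), label) for emb, label in support]
    return max(sims)[0:2][1]  # label of the most similar support embedding

support = [([1.0, 0.0], "cat"), ([0.0, 1.0], "dog")]
label = classify([0.9, 0.1], support)
```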
Prototypical network
Matching network but with class centroids
(uses Euclidean distance instead of cosine similarity)
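A minimal sketch with hand-made embeddings: average each class's support embeddings into a centroid (prototype), then classify a query by the nearest prototype in Euclidean distance.

```python
def prototypes(support):
    # support: list of (embedding, label) -> one centroid per class
    by_class = {}
    for emb, label in support:
        by_class.setdefault(label, []).append(emb)
    return {label: [sum(dim) / len(embs) for dim in zip(*embs)]
            for label, embs in by_class.items()}

def classify(query, protos):
    # nearest prototype by squared Euclidean distance
    return min(protos, key=lambda c: sum((q - p) ** 2
                                         for q, p in zip(query, protos[c])))

support = [([1.0, 0.0], "cat"), ([0.8, 0.2], "cat"), ([0.0, 1.0], "dog")]
protos = prototypes(support)           # cat prototype: [0.9, 0.1]
label = classify([0.7, 0.3], protos)
```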
Model-Agnostic Meta-Learning (MAML)
Learn an initialization from which a few gradient steps adapt well to a new task
(TODO: check how this works in detail)
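A first-order MAML sketch on a toy 1-D regression family y = a*x (a simplification: full MAML also differentiates through the inner update; all numbers are illustrative):

```python
def grad(theta, x, y):
    # gradient of squared error (theta * x - y)^2 w.r.t. theta
    return 2 * (theta * x - y) * x

def maml_step(theta, tasks, inner_lr=0.05, meta_lr=0.05):
    meta_grad = 0.0
    for (x_s, y_s), (x_q, y_q) in tasks:                   # (support, query) per task
        adapted = theta - inner_lr * grad(theta, x_s, y_s)  # inner loop: adapt
        meta_grad += grad(adapted, x_q, y_q)                # first-order outer gradient
    return theta - meta_lr * meta_grad / len(tasks)         # meta-update the init

theta = 0.0
tasks = [((1.0, 2.0), (2.0, 4.0)),   # task with a = 2
         ((1.0, 3.0), (2.0, 6.0))]   # task with a = 3
for _ in range(100):
    theta = maml_step(theta, tasks)
# theta ends near an initialization that adapts quickly to both tasks
```

Note the two loops: the inner loop learns a task from its few shots; the outer loop updates the shared initialization so that inner-loop learning works well.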
3 approaches
Metric-based
Optimization-based
Model-based