Lecture 2 Flashcards

1
Q

Deep Learning problem

A

Requires a large training set
A large training set is expensive to create

2
Q

Few-shot learning

A

N-way k-shot
N classes, k examples per class

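A minimal sketch of how an N-way k-shot episode could be sampled; the dataset here is assumed to be a dict mapping each class label to a list of examples (illustration only, not from the lecture):

import random

def sample_episode(dataset, n_way=5, k_shot=1, n_query=5):
    """Sample one N-way k-shot episode from {class_label: [examples]}."""
    classes = random.sample(list(dataset.keys()), n_way)       # pick N classes
    support, query = [], []
    for label in classes:
        examples = random.sample(dataset[label], k_shot + n_query)
        support += [(x, label) for x in examples[:k_shot]]     # k labeled shots per class
        query += [(x, label) for x in examples[k_shot:]]       # held-out query examples
    return support, query
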
3
Q

Few-shot learning (train/val/test split)

A

Split on the class level, not the example level
Train/val/test contain disjoint classes, so test tasks consist of new classes with only a few shots

4
Q

Few-shot learning (overall goal)

A

Learn a learner that can learn from a few shots, not an algorithm that merely classifies one fixed task well

5
Q

Pre-training, Fine-tuning

A

Copy weights learned (expensively) on a previous task
Fine-tune them on the new task with only a few shots

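A rough PyTorch sketch of the idea, assuming a toy MLP backbone and a hypothetical checkpoint file "pretrained.pt" (names and sizes are illustrative, not the lecture's setup):

import torch
import torch.nn as nn

# Hypothetical backbone that was pre-trained (expensively) on a large source task.
backbone = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 64), nn.ReLU())
backbone.load_state_dict(torch.load("pretrained.pt"))  # copy the previously learned weights

# New, randomly initialized head for the few-shot target task (e.g. 5 classes).
head = nn.Linear(64, 5)

# Freeze the backbone and train only the head on the few available shots.
for p in backbone.parameters():
    p.requires_grad = False
optimizer = torch.optim.SGD(head.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

def finetune_step(x, y):            # x: (batch, 784), y: (batch,) class indices
    loss = loss_fn(head(backbone(x)), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
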
6
Q

Deep meta-learning

A

Learn good hyperparameters -> learn a good learner ("learning to learn")

7
Q

Hyperparameters of NN

A

Initialization of the parameters
Optimization algorithm (e.g. SGD)
Loss function (e.g. MSE, cross-entropy)
Architecture (number and kind of layers)

8
Q

Matching network

A

Compute embeddings of the support set with G(x) and of the query with F(x)
Compare the embeddings using cosine similarity
(not sure yet why this counts as meta-learning)

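A small PyTorch sketch of the comparison step, assuming embedding networks f (query) and g (support) are provided from elsewhere; the function name and signature are my own:

import torch
import torch.nn.functional as F

def matching_predict(f, g, support_x, support_y, query_x, n_way):
    """Classify queries by cosine-similarity attention over the support set."""
    s = F.normalize(g(support_x), dim=1)           # (n_support, d) support embeddings
    q = F.normalize(f(query_x), dim=1)             # (n_query, d) query embeddings
    sims = q @ s.t()                               # cosine similarities (n_query, n_support)
    attn = F.softmax(sims, dim=1)                  # attention weights over support examples
    onehot = F.one_hot(support_y, n_way).float()   # support_y: LongTensor of class indices
    probs = attn @ onehot                          # attention-weighted vote per class
    return probs.argmax(dim=1)
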
9
Q

Prototypical network

A

Like a matching network, but the query is compared to class centroids (prototypes)
(uses Euclidean distance instead of cosine similarity)

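The analogous sketch for prototypes, again assuming an embedding network f supplied from elsewhere (names are illustrative):

import torch

def proto_predict(f, support_x, support_y, query_x, n_way):
    """Classify queries by Euclidean distance to per-class prototypes (centroids)."""
    emb_s = f(support_x)                                       # (n_support, d)
    emb_q = f(query_x)                                         # (n_query, d)
    prototypes = torch.stack([emb_s[support_y == c].mean(0)    # centroid of each class
                              for c in range(n_way)])          # (n_way, d)
    dists = torch.cdist(emb_q, prototypes)                     # (n_query, n_way)
    return (-dists).argmax(dim=1)                              # nearest prototype wins
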
10
Q

Model-Agnostic Meta-Learning (MAML)

A

Learn an initialization of the weights such that a few gradient steps on a new task's support set already give good performance
(Note to self: still have to look up how this works in detail.)

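A minimal sketch of the MAML inner/outer loop on a toy linear-regression model, written from the standard formulation rather than from the lecture slides (all names and learning rates are illustrative):

import torch

# Shared initialization (meta-parameters) of a tiny linear model y = x @ w + b.
w = torch.randn(1, 1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
meta_opt = torch.optim.SGD([w, b], lr=1e-2)
inner_lr = 0.1

def task_loss(w, b, x, y):          # x, y: (n, 1)
    return ((x @ w + b - y) ** 2).mean()

def maml_step(tasks):
    """One meta-update over tasks, each task = (x_support, y_support, x_query, y_query)."""
    meta_opt.zero_grad()
    for xs, ys, xq, yq in tasks:
        # Inner loop: one gradient step on the support set, keeping the graph so the
        # outer update can differentiate through the adaptation.
        gw, gb = torch.autograd.grad(task_loss(w, b, xs, ys), (w, b), create_graph=True)
        w_adapted, b_adapted = w - inner_lr * gw, b - inner_lr * gb
        # Outer loop: evaluate the adapted weights on the query set and accumulate
        # gradients with respect to the shared initialization.
        task_loss(w_adapted, b_adapted, xq, yq).backward()
    meta_opt.step()
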
11
Q

3 approaches to meta-learning

A

Metric-based
Optimization-based
Model-based
