Transfer Learning Flashcards

1
Q

Transfer learning

A

Leveraging pre-trained models by reusing their learned feature representations

2
Q

Why use transfer learning?

A

You don’t have enough computational resources to train a model from scratch.
To save time compared to training from scratch.
When you have a small training set.

3
Q

When may transfer learning NOT work well?

A

The pre-trained model's features can't differentiate the classes in your problem.
You remove too many layers from the pre-trained model.

4
Q

General steps for transfer learning

A

Obtain the pre-trained model.
Create a base model from the pre-trained model.
Freeze layers of the pre-trained model.
Add new trainable layers on top and train them on your dataset.
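The steps above can be sketched in Keras. MobileNetV2, the input size, and the two-class head are illustrative assumptions, not prescribed by the card:

```python
import tensorflow as tf

# 1. Obtain the pre-trained model (weights=None here to skip the
#    download; in practice you would pass weights="imagenet").
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights=None)

# 2-3. Use it as the base model and freeze all of its layers.
base_model.trainable = False

# 4. Stack new trainable layers on top and train only those.
model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```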

5
Q

What is fine-tuning in the context of transfer learning?

A

Unfreezing the pre-trained model and retraining at a low learning rate
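A minimal sketch of that recipe in Keras. The base architecture and the exact learning rate are illustrative assumptions; the point is unfreezing plus recompiling with a much smaller rate:

```python
import tensorflow as tf

# A frozen pre-trained base with a small head, as in the earlier cards
# (weights=None here only to avoid the ImageNet download).
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights=None)
base_model.trainable = False
model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2),
])

# Fine-tuning: unfreeze the base and recompile with a LOW learning
# rate, so the pre-trained weights are only nudged, not destroyed.
base_model.trainable = True
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
```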

6
Q

Where can you find pre-trained models for computer vision tasks?

A

Keras Applications
Stanford’s ImageNet pre-trained models

7
Q

Which is typically used as a pre-trained model for NLP tasks?

A

Word embeddings like GloVe or Word2Vec, and pre-trained models from Hugging Face

8
Q

What are three main ways to use a pre-trained model?

A

Prediction
Fine-tuning
Feature extraction
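The feature-extraction route can be sketched as running inputs through the frozen base and keeping its activations as fixed feature vectors. MobileNetV2, the input size, and the random batch are illustrative assumptions:

```python
import numpy as np
import tensorflow as tf

# Headless base with global average pooling: each image maps to one
# fixed-length feature vector (weights=None only to avoid the download).
base = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), include_top=False, pooling="avg", weights=None)

# Feature extraction: the base is used purely as a fixed encoder; the
# extracted vectors can then feed any downstream classifier.
images = np.random.rand(4, 96, 96, 3).astype("float32")
features = base.predict(images, verbose=0)
print(features.shape)  # one 1280-dim vector per image
```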

9
Q

What is the first step when using transfer learning with image data?

A

Get the pre-trained model

10
Q

Why is it important to freeze batch normalization layers when using a pre-trained model?

A

To prevent the layer mean and variance from being updated
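In Keras this takes two pieces: setting `trainable = False`, and also calling the base with `training=False` so BatchNormalization stays in inference mode. A sketch, with MobileNetV2 and the head as illustrative assumptions:

```python
import tensorflow as tf

base_model = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), include_top=False, weights=None)
base_model.trainable = False  # freezes all weights, including BN's

inputs = tf.keras.Input(shape=(96, 96, 3))
# training=False keeps BatchNormalization in inference mode: the stored
# moving mean/variance are used and never updated, even during fit().
x = base_model(inputs, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)
```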

11
Q

Which of these are valid data augmentation techniques? (Select all that apply)

A

a) Random rotation
b) Horizontal flipping
c) Zooming
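All three techniques exist as Keras preprocessing layers; the factor values below are illustrative choices:

```python
import numpy as np
import tensorflow as tf

# The three augmentations from the card, applied as a single pipeline.
augment = tf.keras.Sequential([
    tf.keras.layers.RandomRotation(0.1),       # a) random rotation
    tf.keras.layers.RandomFlip("horizontal"),  # b) horizontal flipping
    tf.keras.layers.RandomZoom(0.2),           # c) zooming
])

batch = np.random.rand(2, 64, 64, 3).astype("float32")
out = augment(batch, training=True)  # these layers only act in training mode
print(out.shape)  # same shape as the input batch
```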

12
Q

When fine-tuning a model, what should you do? (Select all that apply)

A

Unfreeze the pre-trained model
Retrain at a low learning rate

13
Q

Which of these is typically used as a pre-trained model for NLP tasks?

A

Word embeddings like GloVe or Word2Vec

14
Q

When creating an embedding layer from pre-trained word embeddings, why is it important to set trainable=False?

A

To prevent the embeddings from being updated during training
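A sketch of that setup; the toy random matrix stands in for vectors loaded from GloVe or Word2Vec files, and the vocabulary/dimension sizes are illustrative:

```python
import numpy as np
import tensorflow as tf

# Toy "pre-trained" embedding matrix: 1000-word vocab, 50-dim vectors.
vocab_size, embedding_dim = 1000, 50
embedding_matrix = np.random.rand(vocab_size, embedding_dim)

# trainable=False keeps the pre-trained vectors fixed during training.
embedding_layer = tf.keras.layers.Embedding(
    vocab_size,
    embedding_dim,
    embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
    trainable=False,
)

vectors = embedding_layer(np.array([[1, 2, 3]]))  # one 3-token sequence
print(vectors.shape)  # (1, 3, 50): one 50-dim vector per token
```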

15
Q

What type of layer is commonly used for sequential data like text?

A

Embedding

16
Q

What is a callback used for when training models? (Select all that apply)

A

a) Stop training when metrics stop improving
b) Monitor metrics like loss and accuracy
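Keras's `EarlyStopping` callback covers both uses from the card: it monitors a metric every epoch and stops training once it stops improving. The monitored metric and patience below are illustrative choices:

```python
import tensorflow as tf

early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",         # b) metric to watch each epoch
    patience=3,                 # a) epochs without improvement before stopping
    restore_best_weights=True,  # roll back to the best epoch's weights
)
# Passed to training as: model.fit(..., callbacks=[early_stop])
```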