General Flashcards

1
Q

What is transfer learning?

A

A technique in which a model pre-trained on a sufficiently large dataset is reused for a related task, typically when only limited data is available for the new task.

2
Q

What should you do when the target dataset is similar to the base model's dataset?

A

Feature Extraction

Keep the base model frozen, unfreeze only the fully connected layers, and train them on the new dataset with the required number of classes.
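A minimal Keras sketch of feature extraction, assuming TensorFlow is available. The base model, class count, and input shape are placeholders; `weights=None` is used here only to avoid a download, in practice you would pass `weights="imagenet"`.

```python
from tensorflow import keras

# Pre-trained base without its original classifier head (hypothetical choice of MobileNetV2).
base = keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights=None)
base.trainable = False  # freeze every base layer: feature extraction

num_classes = 5  # hypothetical number of target classes
model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    # New fully connected head, sized for the target classes; only this part trains.
    keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Only the new `Dense` head's weights are trainable, so training on the small target dataset cannot corrupt the pre-trained features.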

3
Q

What should you do when the target dataset is different from the base model's dataset?

A

Fine-tuning

Unfreeze the fully connected layers as well as some of the later layers of the base model, and train them on the new dataset with the required number of classes.
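A minimal Keras sketch of fine-tuning, under the same assumptions as before (hypothetical base model and class count; `weights=None` only to avoid a download). The last few base layers are unfrozen together with the new head, and the model is recompiled with a low learning rate so the pre-trained weights are not destroyed.

```python
from tensorflow import keras

# Pre-trained base plus a new classifier head, as in feature extraction.
base = keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights=None)
num_classes = 5  # hypothetical
model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(num_classes, activation="softmax"),
])

# Fine-tuning: unfreeze the last 20 base layers (an arbitrary choice) along with the head.
base.trainable = True
for layer in base.layers[:-20]:
    layer.trainable = False  # keep the earlier, more generic layers frozen

# Low learning rate: large updates would wipe out the pre-trained features.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-5),
              loss="sparse_categorical_crossentropy")
```

How many layers to unfreeze depends on how different the target dataset is; the more it differs, the deeper the fine-tuning usually goes.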

4
Q

Sequential model vs Functional model

A

Sequential model -
* Single input
* Single output
* Linear stack of layers

Functional model -
* Multi-input / multi-output models (e.g., predicting both the emotion and the age of a person)
* Non-linear topologies (branches, shared layers)
* Layers can be merged via concatenation or addition
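A small sketch of the emotion-and-age example with the Keras Functional API. The layer sizes and the 7 emotion classes are hypothetical.

```python
from tensorflow import keras

# One shared trunk, two output heads: classification (emotion) and regression (age).
inputs = keras.Input(shape=(64, 64, 3))
x = keras.layers.Conv2D(16, 3, activation="relu")(inputs)
x = keras.layers.GlobalAveragePooling2D()(x)

emotion = keras.layers.Dense(7, activation="softmax", name="emotion")(x)  # 7 classes (assumed)
age = keras.layers.Dense(1, name="age")(x)                                # regression head

model = keras.Model(inputs=inputs, outputs=[emotion, age])
model.compile(optimizer="adam",
              loss={"emotion": "sparse_categorical_crossentropy", "age": "mse"})
```

A Sequential model cannot express this: it has exactly one input and one output, while the Functional API lets the graph branch after the shared trunk.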

5
Q

Why was transfer learning not preferred for NLP tasks before 2018?

A

Because NLP models were built for task-specific jobs, their learned features did not transfer well. This changed with the ULMFiT (Universal Language Model Fine-tuning) paper, which pre-trained an NLP model on language modelling (next-word prediction) and then fine-tuned it for downstream tasks.

6
Q

Why is language modelling preferred as a pre-training task?

A
  1. Rich feature learning - predicting the next word forces the model to learn grammar, semantics, and context.
  2. Unsupervised task - the labels (next words) come from the text itself, so unlimited unlabelled text can be used.