Unit 5 - Representation Learning Flashcards

1
Q

What is transfer learning?

A

Think of it in a human context: people reuse knowledge from one task when learning a related one.

Traditional ML and DL models are trained in isolation: we have to train a model from scratch whenever the feature-space distribution changes.

Transfer learning is the idea of overcoming this isolated learning paradigm.

Definition - transfer learning enables a model to adapt or utilise the knowledge acquired from one set of tasks and/or domains for another related set of tasks and/or domains.
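The idea can be sketched in plain NumPy: a "pretrained" feature extractor (the weights and data below are made up for illustration) is frozen, and only a new task-specific head is trained on the target data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "pretrained" feature extractor: weights assumed to have been
# learned on a source task (here just random, for illustration).
W_pretrained = rng.standard_normal((4, 8))  # maps 4 raw inputs -> 8 features

def extract_features(x):
    # Frozen layer: reused as-is on the new (target) task.
    return np.maximum(x @ W_pretrained, 0.0)  # ReLU

# Only the new task-specific head is trained on the target data.
w_head = np.zeros(8)
X_target = rng.standard_normal((32, 4))
y_target = (X_target[:, 0] > 0).astype(float)

for _ in range(200):  # simple gradient descent on a logistic head
    feats = extract_features(X_target)
    p = 1.0 / (1.0 + np.exp(-(feats @ w_head)))
    grad = feats.T @ (p - y_target) / len(y_target)
    w_head -= 0.5 * grad  # note: W_pretrained is never updated

preds = 1.0 / (1.0 + np.exp(-(extract_features(X_target) @ w_head))) > 0.5
accuracy = (preds == y_target.astype(bool)).mean()
```

Freezing the extractor is one common fine-tuning strategy; in practice the pretrained layers may also be updated with a small learning rate.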

2
Q

What to transfer?

A

To answer this question, it is necessary to know which part of the knowledge can be transferred from the source to the target so that the performance of the target is improved

3
Q

Approaches to “What to transfer”:

A

Instance transfer: the source-domain data usually cannot be reused directly; instead, certain instances of the source domain are reused (typically re-weighted) together with the target data.

Feature-representation transfer: minimise domain divergence and reduce error rates by identifying good feature representations that can be transferred from the source to the target.

Parameter transfer: assumes related tasks share some parameters or a prior distribution over hyperparameters. Additional weight can be applied to the loss function of the target domain to improve performance.

Relational-knowledge transfer: handles non-IID data, where each data point has some relationship with other data points, for example social-network data.
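The instance-transfer idea above can be sketched in NumPy. This is a hedged illustration, not a full algorithm: the setup assumed here reuses source examples but down-weights them in the loss relative to the scarce target examples (all data and weights are invented).

```python
import numpy as np

rng = np.random.default_rng(1)

# Plenty of (hypothetical) source data, very little target data.
X_src, y_src = rng.standard_normal((100, 3)), rng.integers(0, 2, 100)
X_tgt, y_tgt = rng.standard_normal((10, 3)), rng.integers(0, 2, 10)

X = np.vstack([X_src, X_tgt])
y = np.concatenate([y_src, y_tgt]).astype(float)

# Per-instance weights: source instances still contribute to the loss,
# but each counts for less than a target instance (0.2 is an assumption).
sample_w = np.concatenate([np.full(100, 0.2), np.full(10, 1.0)])

# Weighted logistic regression by gradient descent.
w = np.zeros(3)
for _ in range(100):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    grad = X.T @ (sample_w * (p - y)) / sample_w.sum()
    w -= 0.1 * grad
```

More sophisticated schemes (e.g. boosting-style re-weighting) learn the instance weights rather than fixing them by hand.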

4
Q

When to transfer?

A

Applying transfer learning just for the sake of it may make matters worse (negative transfer).

We do not want to degrade target-task performance; the aim is to apply transfer only when it improves the target task's performance or results.

5
Q

How to transfer?

A

Once the "what" and "when" questions are answered, think about identifying ways of actually transferring knowledge across domains or tasks.

This may mean changes to the existing algorithms or the use of different techniques.

6
Q

Types of transfer learning:

A

Inductive transfer: the source and target domains are the same, but the tasks differ. Inductive biases from the source are used to improve the target task.

Unsupervised transfer: similar to inductive transfer, but focused on unsupervised tasks in the target domain; no labelled data is available in either domain.

Transductive transfer: there are similarities between the source and target tasks, but the corresponding domains are different. The source has plenty of labelled data while the target has none. Sub-categories are based on whether the feature spaces differ or the marginal probability distributions differ.

7
Q

What is distributed representation?

A

An important concept in the field of deep learning.

A distributed representation is simply a vector that represents some piece of data.

Example: examining different types of shapes, each shape can be encoded as a vector.

Distributed representations are able to capture meaningful semantic similarity between data points.
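As a toy illustration of the shapes example (the vectors below are invented for the sketch), semantically similar items end up close together in the vector space, which cosine similarity can measure:

```python
import numpy as np

# Toy distributed representations: each shape is a dense vector whose
# dimensions jointly encode its properties (values made up for illustration).
embeddings = {
    "circle":   np.array([0.9, 0.1, 0.0]),
    "ellipse":  np.array([0.8, 0.2, 0.1]),
    "triangle": np.array([0.1, 0.9, 0.3]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 means same direction, 0.0 means orthogonal.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_round = cosine(embeddings["circle"], embeddings["ellipse"])
sim_other = cosine(embeddings["circle"], embeddings["triangle"])
# The two round shapes are more similar to each other than to the triangle.
```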

8
Q

Variants of CNN: DenseNet

A

Consists of:
Dense block
DenseNet architecture
Advantages of DenseNet

9
Q

What is dense block?

A

Whereas ResNet uses element-by-element addition to pass state from one layer to the next, a dense block uses concatenation.

Each layer receives extra inputs from all the layers that came before it and also passes its own feature maps on to all the layers that come after it.

This produces a more compact, thinner network with fewer channels per layer.

Greater memory and processing efficiency.

10
Q

Advantages of DenseNet

A
  1. Strong gradient flow
  2. Parameter and computational efficiency
  3. More diversified features
  4. Maintains low-complexity features