Seq2Seq Flashcards

1
Q

What is an encoder-decoder?

A

An encoder-decoder is a sequence-to-sequence (Seq2Seq) model, typically built from RNNs, LSTMs, or GRUs.

The encoder processes the input sequence and compresses the contextual relationships between the words into a fixed-length context vector.

The decoder takes the encoder's final state as the context vector and generates the output sequence from it.

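A minimal sketch of the idea in PyTorch; the GRU cells, vocabulary sizes, and layer widths are illustrative assumptions, not any specific paper's configuration:

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    # Illustrative sizes; swap GRU for LSTM for the classic 2014 setup.
    def __init__(self, src_vocab=1000, tgt_vocab=1000, emb=64, hidden=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.GRU(emb, hidden, batch_first=True)
        self.decoder = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, tgt):
        # The encoder reads the whole input; its final hidden state
        # is the fixed-length context vector.
        _, context = self.encoder(self.src_emb(src))
        # The decoder starts from the context vector and unrolls over
        # the target sequence to produce output logits.
        dec_out, _ = self.decoder(self.tgt_emb(tgt), context)
        return self.out(dec_out)

model = Seq2Seq()
logits = model(torch.randint(0, 1000, (2, 7)),   # batch of 2 source sequences
               torch.randint(0, 1000, (2, 5)))   # batch of 2 target sequences
print(logits.shape)  # torch.Size([2, 5, 1000])
```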
2
Q

What was the problem with the Encoder-Decoder?

A

It was not able to retain information over longer sequences (>30 tokens), because the entire input had to be compressed into a single fixed-length context vector.

3
Q

Timeline of Seq2Seq models

A

2014 - Encoder-Decoder - "Sequence to Sequence Learning with Neural Networks"
2015 - Attention - "Neural Machine Translation by Jointly Learning to Align and Translate"
2017 - Transformer - "Attention Is All You Need"

4
Q

Teacher forcing

A

Teacher forcing is a strategy for training RNNs in which the decoder receives the ground-truth token from the previous time step as input, instead of the model's own output from the prior time step.

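A minimal PyTorch sketch of a decoding loop; the GRU cell, sizes, and the greedy fallback are illustrative assumptions:

```python
import torch
import torch.nn as nn

# Toy decoder pieces; sizes are illustrative.
emb = nn.Embedding(1000, 64)
cell = nn.GRUCell(64, 128)
out = nn.Linear(128, 1000)

def decode(tgt, context, teacher_forcing=True):
    # tgt: (batch, steps) ground-truth tokens; context: (batch, 128)
    h, inp, logits = context, tgt[:, 0], []
    for t in range(1, tgt.size(1)):
        h = cell(emb(inp), h)
        step_logits = out(h)
        logits.append(step_logits)
        # Teacher forcing: feed the ground-truth token as the next input.
        # Without it: feed the model's own greedy prediction instead.
        inp = tgt[:, t] if teacher_forcing else step_logits.argmax(dim=-1)
    return torch.stack(logits, dim=1)

context = torch.zeros(2, 128)
tgt = torch.randint(0, 1000, (2, 6))
print(decode(tgt, context).shape)  # torch.Size([2, 5, 1000])
```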
5
Q

Improvements in the Encoder-Decoder

A
  1. Using advanced embedding techniques
  2. Deep LSTMs
  3. Reversing the input sequence (see the sketch below)
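Reversing the source (while keeping the target order) shortens the path between early source words and early target words, which made optimisation easier. A minimal sketch, assuming batch-first token-id tensors:

```python
import torch

src = torch.tensor([[1, 2, 3, 4],     # two source sentences as token ids
                    [5, 6, 7, 8]])
reversed_src = src.flip(dims=[1])     # feed this to the encoder instead
print(reversed_src)                   # tensor([[4, 3, 2, 1], [8, 7, 6, 5]])
```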
6
Q

Summarize the Encoder-Decoder research paper (Sutskever et al., 2014).

A
  1. Task - machine translation, English to French (WMT'14)
  2. Dataset - 12M sentence pairs; 300+M words in each language
  3. Reversing the input sequence improved results
  4. Embeddings - 1,000-dimensional
  5. Deep LSTM - 4 layers
  6. Softmax output layer
  7. BLEU score - 34.81
7
Q

Time-distributed FCN

A

The same fully connected weights are applied to the input at each time step t, t+1, t+2, and so on.

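A minimal sketch in PyTorch: nn.Linear applied to a (batch, time, features) tensor reuses one weight matrix at every time step, which is the time-distributed behaviour (Keras exposes the same idea as the TimeDistributed wrapper):

```python
import torch
import torch.nn as nn

fc = nn.Linear(32, 8)                  # one weight matrix: 32 -> 8
x = torch.randn(4, 10, 32)             # (batch=4, time steps=10, features=32)
y = fc(x)                              # same weights applied at t, t+1, t+2, ...
print(y.shape)                         # torch.Size([4, 10, 8])
```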
8
Q

Bahdanau attention

A

Also known as additive attention: the alignment score is computed by a small feed-forward network over the previous decoder state and each encoder output.

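A minimal PyTorch sketch of additive scoring; the projection names (W, U, v) follow the usual presentation, but the dimensions are illustrative assumptions:

```python
import torch
import torch.nn as nn

dec_dim, enc_dim, att_dim = 128, 128, 64       # illustrative sizes
W = nn.Linear(dec_dim, att_dim, bias=False)    # projects decoder state s
U = nn.Linear(enc_dim, att_dim, bias=False)    # projects encoder outputs h_j
v = nn.Linear(att_dim, 1, bias=False)          # reduces to a scalar score

def additive_attention(s, H):
    # s: (batch, dec_dim) previous decoder state
    # H: (batch, src_len, enc_dim) encoder outputs
    scores = v(torch.tanh(W(s).unsqueeze(1) + U(H))).squeeze(-1)  # (batch, src_len)
    weights = torch.softmax(scores, dim=-1)
    context = torch.bmm(weights.unsqueeze(1), H).squeeze(1)       # (batch, enc_dim)
    return context, weights

context, weights = additive_attention(torch.randn(2, 128), torch.randn(2, 7, 128))
print(context.shape, weights.shape)  # torch.Size([2, 128]) torch.Size([2, 7])
```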
9
Q

Alignment model in attention mechanism

A

A small feed-forward network a(s_{i-1}, h_j), trained jointly with the rest of the model, that scores how well the source words around position j match the output at position i; its outputs are the alignment scores.
10
Q

Luong attention

A

Also known as multiplicative attention. Instead of an additive feed-forward scorer, it computes the alignment score as a dot product between the decoder state and each encoder output, either directly (dot) or through a learned matrix (general).
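A minimal PyTorch sketch of the "general" (multiplicative) variant; the tensor sizes and the untrained nn.Linear standing in for the learned matrix are illustrative assumptions:

```python
import torch
import torch.nn as nn

# Luong-style "general" scoring: score(s, h_j) = s^T W h_j
W = nn.Linear(128, 128, bias=False)   # learned matrix (illustrative size)
s = torch.randn(2, 128)               # current decoder state
H = torch.randn(2, 7, 128)            # encoder outputs for 7 source positions

scores = torch.bmm(W(H), s.unsqueeze(-1)).squeeze(-1)  # (batch, src_len)
weights = torch.softmax(scores, dim=-1)                # attention weights
print(weights.shape)  # torch.Size([2, 7])
```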
11
Q

What is self-attention?

A

Self-attention is a mechanism that takes static embeddings as input and generates contextual embeddings: each output vector is a weighted combination of all the input vectors.

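A minimal sketch of unparameterised self-attention in PyTorch; real transformer layers add learned query/key/value projections on top of this, and the sizes here are illustrative:

```python
import torch

x = torch.randn(5, 16)                        # 5 tokens, 16-dim static embeddings
scores = x @ x.T / x.size(-1) ** 0.5          # pairwise similarity, scaled
weights = torch.softmax(scores, dim=-1)       # each row sums to 1
contextual = weights @ x                      # contextual embeddings
print(contextual.shape)                       # torch.Size([5, 16])
```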
12
Q

Alignment score

A

The scalar e_ij produced by the alignment model, measuring how well input position j matches output position i. Softmaxing the scores over all source positions j gives the attention weights.
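As a worked set of formulas (Bahdanau-style notation: s_{i-1} is the previous decoder state, h_j an encoder output, T_x the source length):

```latex
e_{ij} = a(s_{i-1}, h_j), \qquad
\alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k=1}^{T_x} \exp(e_{ik})}, \qquad
c_i = \sum_{j=1}^{T_x} \alpha_{ij} h_j
```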
13
Q

What was the problem with the attention mechanism?

A

Although attention solved the problem of retaining long-range dependencies, each word was still processed sequentially by the recurrent encoder and decoder, so computation could not be parallelised across time steps and remained expensive.

14
Q

Why is language modelling preferred as a pre-training task?

A
  1. Rich feature learning - predicting the next word forces the model to learn syntax and semantics
  2. Unsupervised (self-supervised) task - raw text supplies its own labels, so no annotation is needed
15
Q

How is ChatGPT trained?

A

Reinforcement Learning from Human Feedback (RLHF): humans rank candidate responses, the rankings train a reward model, and the reward model guides fine-tuning with reinforcement learning.
