w8l1 sequence to sequence Flashcards

1
Q

what are sequence to sequence models

A

input is any sequence

output is any sequence
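
For intuition, a tiny made-up example: the input and the output are both token sequences, and they do not have to be the same length.

    # toy illustration: a sequence in, a (different-length) sequence out
    source = ["the", "cat", "sat", "on", "the", "mat"]               # input sequence (English tokens)
    target = ["el", "gato", "se", "sentó", "en", "la", "alfombra"]   # output sequence (Spanish tokens)
    print(len(source), len(target))   # 6 7 -- the lengths need not match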

2
Q

neural machine translation?

A

machine translation using neural networks

uses sequence to sequence models

end-to-end differentiable

3
Q

what would the formula look like if we want to find the best Spanish sentence y, given English sentence x

A

ŷ = argmax_y P(y | x, θ)
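
Written out in LaTeX, with θ the model parameters:

    \hat{y} = \arg\max_{y} P(y \mid x, \theta)

i.e. the best Spanish sentence ŷ is the y that maximizes the conditional probability of y given the English sentence x, under parameters θ.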

4
Q

what's a difference between FNNs and RNNs

A

RNNs loop back to the previous hidden state, so the output at each step takes into account what came before; FNNs have no such recurrent loop
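
A minimal numpy sketch of that difference (the shapes and weights here are made up): the feed-forward step sees only the current input, while the recurrent step also takes the previous hidden state.

    import numpy as np

    d_in, d_h = 4, 3
    W_x = np.random.randn(d_h, d_in)   # input-to-hidden weights
    W_h = np.random.randn(d_h, d_h)    # hidden-to-hidden (recurrent) weights

    def ffn_step(x):
        # feed-forward: the output depends only on the current input
        return np.tanh(W_x @ x)

    def rnn_step(x, h_prev):
        # recurrent: the output also depends on the previous hidden state
        return np.tanh(W_x @ x + W_h @ h_prev)

    xs = [np.random.randn(d_in) for _ in range(5)]   # a length-5 input sequence
    h = np.zeros(d_h)
    for x in xs:
        h = rnn_step(x, h)   # h carries information about everything seen so far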

5
Q

what do we put into our RNNs

A

embeddings

the internal representation, the arrow between the squares in the diagram (the hidden state passed from one step to the next)

represents the sentence up to this point
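
A minimal PyTorch sketch (vocabulary size and dimensions are made up): token ids are turned into embeddings, and the hidden state handed from step to step is the internal representation of the sentence so far.

    import torch
    import torch.nn as nn

    vocab_size, emb_dim, hid_dim = 1000, 32, 64   # made-up sizes
    embedding = nn.Embedding(vocab_size, emb_dim)
    cell = nn.GRUCell(emb_dim, hid_dim)

    token_ids = torch.tensor([5, 42, 7, 99])      # one sentence as token ids
    h = torch.zeros(1, hid_dim)                   # initial hidden state
    for i in range(len(token_ids)):
        emb = embedding(token_ids[i:i+1])         # (1, emb_dim): the embedding goes into the RNN
        h = cell(emb, h)                          # h is the arrow between the squares:
                                                  # the sentence represented up to this point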

6
Q

what differentiates conditional language models from regular language models

A

P(y1…yn) = product over t of P(yt | y<t)

P(y1…yn | x) = product over t of P(yt | y<t, x)

conditioned on x

if I have a question and an image, they can both be in the input and we can condition on that input
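
The two factorizations written out in LaTeX:

    P(y_1, \dots, y_n) = \prod_{t=1}^{n} P(y_t \mid y_{<t})

    P(y_1, \dots, y_n \mid x) = \prod_{t=1}^{n} P(y_t \mid y_{<t}, x)

The only difference is the extra conditioning on x at every time step.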

7
Q

explain how an RNN encoder goes to classification

A

once you feed in all the input, take the hidden layer (vector H) and then map it using a linear layer to the number of labels

then push it through a softmax to get a probability distribution over the labels
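
A sketch of that pipeline in PyTorch (sizes and label count are made up): feed in the whole input, take the final hidden vector H, map it with a linear layer to the number of labels, then softmax.

    import torch
    import torch.nn as nn

    vocab_size, emb_dim, hid_dim, num_labels = 1000, 32, 64, 3   # made-up sizes
    embedding = nn.Embedding(vocab_size, emb_dim)
    encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
    classifier = nn.Linear(hid_dim, num_labels)    # map hidden vector to the number of labels

    token_ids = torch.tensor([[5, 42, 7, 99]])     # one input sequence
    _, h = encoder(embedding(token_ids))           # h: hidden state after all the input is fed in
    logits = classifier(h[-1])                     # (1, num_labels)
    probs = torch.softmax(logits, dim=-1)          # probability distribution over the labels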

8
Q

what is an autoregressive model

A

models where information from previous time steps is used to predict the output at the current time step
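
A schematic sketch (next_token_distribution is a toy stand-in, not a real model): each predicted token is appended to the history, so earlier time steps feed the prediction of the current one.

    # toy stand-in for a model that returns P(y_t | y_<t) as a dict {token: probability}
    def next_token_distribution(previous_tokens):
        if previous_tokens[-1] == "<s>":
            return {"the": 0.7, "a": 0.3}
        return {"cat": 0.4, "dog": 0.3, "</s>": 0.3}

    generated = ["<s>"]                              # start-of-sequence token
    while generated[-1] != "</s>" and len(generated) < 10:
        dist = next_token_distribution(generated)    # conditioned on the previous time steps
        next_token = max(dist, key=dist.get)         # pick the most probable next token
        generated.append(next_token)                 # the prediction feeds back in as input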

9
Q

how is seq2seq optimized

A

simultaneously: the encoder and decoder are optimized jointly, as one system
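
A sketch of what "simultaneously" looks like in training code (the modules and data are made-up placeholders, not a full seq2seq system): one loss, one backward pass, one optimizer updating encoder and decoder parameters together.

    import torch
    import torch.nn as nn

    # placeholder encoder/decoder; in a real system these would be seq2seq modules
    encoder = nn.GRU(32, 64, batch_first=True)
    decoder = nn.Linear(64, 1000)                 # stand-in "decoder" producing vocabulary logits

    params = list(encoder.parameters()) + list(decoder.parameters())
    optimizer = torch.optim.Adam(params)          # one optimizer over both parts
    loss_fn = nn.CrossEntropyLoss()

    src = torch.randn(1, 4, 32)                   # fake source embeddings
    target = torch.tensor([7])                    # fake target token id

    optimizer.zero_grad()
    _, h = encoder(src)                           # encode
    logits = decoder(h[-1])                       # decode (here: one prediction)
    loss = loss_fn(logits, target)                # one loss for the whole pipeline
    loss.backward()                               # gradients flow through decoder AND encoder
    optimizer.step()                              # both parts are updated simultaneously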

10
Q

why is backpropagation called end-to-end

A

it simultaneously teaches the decoder part to generate what I want while teaching the encoder part to produce a representation that becomes useful in the decoding process
