XLM Flashcards

1
Q

XLM - What is an autoregressive model?

A

A model in which the output variable depends linearly on its own previous values and on a stochastic term (an imperfectly predictable term).
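As a worked form, the standard AR(p) equation (textbook notation, not taken from the XLM paper):

```latex
X_t = c + \sum_{i=1}^{p} \varphi_i X_{t-i} + \varepsilon_t
```

Here the \varphi_i are the linear weights on the previous values and \varepsilon_t is the stochastic (white-noise) term.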

2
Q

XLM - How does the autoregressive model translate to NLP?

A

CLM - causal language modelling - predicting the next word based on the previous words.
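A minimal sketch of the CLM objective as a next-token prediction loss; this is a generic PyTorch illustration with assumed tensor shapes, not the XLM training code:

```python
import torch
import torch.nn.functional as F

def clm_loss(logits, token_ids):
    """Causal LM loss: at each position t, score the prediction of token t+1.
    logits: (batch, seq_len, vocab_size), token_ids: (batch, seq_len)."""
    shifted_logits = logits[:, :-1, :]   # predictions for positions 1..T-1
    shifted_targets = token_ids[:, 1:]   # the "next words" to be predicted
    return F.cross_entropy(
        shifted_logits.reshape(-1, shifted_logits.size(-1)),
        shifted_targets.reshape(-1),
    )

# Toy usage: random logits over a vocabulary of 100 tokens.
logits = torch.randn(2, 8, 100)
tokens = torch.randint(0, 100, (2, 8))
print(clm_loss(logits, tokens))
```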

3
Q

XLM - What is MLM?

A

Masked language modelling - predicting masked words in sentences.
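A minimal sketch of MLM-style input corruption; the 15% mask rate is the common BERT/XLM choice, but the token ids and the simplified "always replace with [MASK]" rule here are assumptions for illustration, not the paper's exact recipe:

```python
import torch

def mask_tokens(token_ids, mask_token_id, mask_prob=0.15):
    """Randomly replace ~mask_prob of the tokens with [MASK]; the model is
    trained to predict the original tokens at exactly those positions."""
    token_ids = token_ids.clone()
    is_masked = torch.rand(token_ids.shape) < mask_prob
    # -100 is the conventional "ignore this position in the loss" label.
    labels = torch.where(is_masked, token_ids, torch.full_like(token_ids, -100))
    token_ids[is_masked] = mask_token_id
    return token_ids, labels

# Toy usage: vocabulary of 100 tokens, with id 99 assumed to be [MASK].
tokens = torch.randint(0, 99, (2, 10))
masked_input, labels = mask_tokens(tokens, mask_token_id=99)
print(masked_input)
print(labels)
```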

4
Q

XLM - What is TLM?

A

Translation language modelling - the input is two sentences that are translations of each other, one in each language.
Words are masked in both, and the model tries to predict them using context from either language.
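A minimal sketch of how a TLM training example could be assembled: the parallel sentences are concatenated and masking is applied across both halves, so a masked word in one language can be recovered from the other. The token ids and the single-separator convention are assumptions for illustration; the paper additionally uses language embeddings and resets the positions of the second sentence, which this sketch leaves out.

```python
import torch

def build_tlm_example(src_ids, tgt_ids, sep_id, mask_token_id, mask_prob=0.15):
    """Concatenate a sentence and its translation, then mask tokens in both halves."""
    pair = torch.cat([src_ids, torch.tensor([sep_id]), tgt_ids])
    is_masked = torch.rand(pair.shape) < mask_prob
    is_masked &= pair != sep_id                     # never mask the separator
    labels = torch.where(is_masked, pair, torch.full_like(pair, -100))
    inputs = pair.clone()
    inputs[is_masked] = mask_token_id
    return inputs, labels

# Toy usage with made-up token ids for a sentence and its translation.
en = torch.tensor([12, 45, 7, 33])
fr = torch.tensor([81, 9, 27, 54, 3])
inputs, labels = build_tlm_example(en, fr, sep_id=2, mask_token_id=99)
print(inputs, labels)
```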

5
Q

XLM - What is their new unsupervised method for learning cross-lingual representations?

A

Pretrain a language model with MLM or CLM on monolingual data from both languages, use it to initialize the encoder and the decoder of a translation model, and train with back-translation iterations.
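A rough PyTorch sketch of the "pretrain, then initialize the translation model" step; the tiny model sizes, the use of encoder-only blocks for both halves, and all names are illustrative assumptions, not the authors' implementation:

```python
import torch.nn as nn

def make_transformer(dim=32, heads=4, layers=2):
    layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
    return nn.TransformerEncoder(layer, num_layers=layers)

# Pretend this was pretrained with MLM or CLM on the monolingual corpora.
pretrained_lm = make_transformer()

# Initialization step: the MT model's encoder and decoder body both start
# from the pretrained LM weights. In a real seq2seq decoder the cross-attention
# layers have no pretrained counterpart and would stay randomly initialized.
encoder = make_transformer()
decoder_body = make_transformer()
encoder.load_state_dict(pretrained_lm.state_dict())
decoder_body.load_state_dict(pretrained_lm.state_dict())
```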

6
Q

XLM - What is a key ingredient for unsupervised MT?

A

Pretraining

7
Q

XLM - What is the back-translation method taken from Lample et al. 2018?

A

Create two models: source->target and target->source translation.
Translate a source sentence into a synthetic target sentence, then use that pair to train the target->source model; do the same in the other direction.
Repeat the process iteratively until saturation.
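A toy sketch of that iterative loop; the "models" and the training step are placeholders (simple callables), since the point here is only the data flow of back-translation, not a real MT system:

```python
def back_translation_rounds(model_s2t, model_t2s, train_step,
                            mono_src, mono_tgt, rounds=3):
    """One round: generate synthetic pairs in both directions, then train each
    model on the pairs produced by the opposite direction."""
    for _ in range(rounds):
        # source -> synthetic target, used to train the target -> source model
        synthetic_tgt = [model_s2t(s) for s in mono_src]
        train_step(model_t2s, inputs=synthetic_tgt, targets=mono_src)

        # target -> synthetic source, used to train the source -> target model
        synthetic_src = [model_t2s(t) for t in mono_tgt]
        train_step(model_s2t, inputs=synthetic_src, targets=mono_tgt)
    return model_s2t, model_t2s

# Toy usage with placeholder "models" and a no-op training step.
s2t = lambda sentence: sentence.upper()
t2s = lambda sentence: sentence.lower()
train = lambda model, inputs, targets: None
back_translation_rounds(s2t, t2s, train, ["a source sentence"], ["une phrase cible"], rounds=1)
```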
