W10 L1: Large Language Models Flashcards

1
Q

What is the pre-train fine-tune problem and its solution?

A

Traditional machine learning involved training models from scratch for every new task.

The solution is to pre-train on a simple task first, then fine-tune for the specific task.

2
Q

What is the pre-train fine-tune paradigm?

A

● The idea of using a different (and simple) training objective to learn linguistic priors, before then training (fine-tuning) on the downstream task, is the pre-train fine-tune paradigm (a minimal code sketch follows below).
● Notice that the pre-training task is self-supervised:
○ No manual annotation is required for this.
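A minimal sketch of the paradigm using Hugging Face transformers, assuming a pre-trained BERT checkpoint and a toy two-label downstream task (the checkpoint name, labels, and example texts are illustrative, not from the lecture):

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Start from a model pre-trained with a self-supervised objective
# (masked language modelling), then attach a task-specific head.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Tiny hypothetical labelled dataset for the downstream task.
texts = ["I loved this film", "Worst purchase ever"]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few fine-tuning steps on the downstream objective
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

Only the fine-tuning step needs manual labels; the pre-training that produced the bert-base-uncased checkpoint required none.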

3
Q

Will language modeling lead to bias?

A

Yes.

Because it will only store the most frequent aspects of the data, which are limited and biased.

This also leads to hallucination.

4
Q

How do we evaluate our LLM?

A

Train, then test on a held-out test set using an evaluation metric (a minimal sketch follows below).
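For illustration only, a minimal evaluation sketch; accuracy is used as the metric and the labels are made up, since the appropriate metric depends on the task:

def accuracy(predictions, gold):
    # Fraction of test-set items where the prediction matches the gold label.
    correct = sum(p == g for p, g in zip(predictions, gold))
    return correct / len(gold)

# Hypothetical held-out test-set labels and model predictions.
gold_labels = [1, 0, 1, 1]
model_preds = [1, 0, 0, 1]
print(accuracy(model_preds, gold_labels))  # 0.75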

5
Q

Among the Transformer, GPT, and BERT, which is an encoder, which is a decoder, and which is both?

A

Transformer: encoder and decoder.

GPT: decoder only.

BERT: encoder only.
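As an illustration, these three answers map onto Hugging Face transformers model classes roughly as follows; the checkpoints (t5-small standing in for an encoder-decoder model, gpt2, bert-base-uncased) are assumptions, not from the lecture:

from transformers import (
    AutoModelForSeq2SeqLM,   # encoder-decoder, like the original Transformer
    AutoModelForCausalLM,    # decoder-only (GPT family)
    AutoModelForMaskedLM,    # encoder-only (BERT family)
)

encoder_decoder = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
decoder_only = AutoModelForCausalLM.from_pretrained("gpt2")
encoder_only = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")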

6
Q

Why can’t you train an Encoder-Decoder model and “break” it?

A

● Encoder-only architectures are used for classification:
○ BERT (Bidirectional Encoder Representations from Transformers), RoBERTa, …
● Decoder-only models are used for generation (see the usage sketch below):
○ GPT, …
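A hedged sketch of that usage split with the transformers pipeline API; the checkpoints (a sentiment-tuned DistilBERT for classification, GPT-2 for generation) are illustrative choices, not from the lecture:

from transformers import pipeline

# Encoder-only model behind a classification pipeline.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("This lecture was great"))

# Decoder-only model behind a generation pipeline.
generator = pipeline("text-generation", model="gpt2")
print(generator("Large language models are", max_new_tokens=20))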
7
Q

How should I phrase my input for an LLM?

A

BAD:
● Where in London should I buy a house?

Better:
● This is the drop in interest rates, this is the increase in availability in …
Based on the content provided, where should I buy a house? Provide examples from the lecture content. [chain of thought]

Best:
● Same as above, but give an example of the kind of reasoning required if what you want is more complex (see the prompt-assembly sketch below).
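For illustration, a sketch of assembling the “better” prompt pattern in Python; the context string and wording are placeholders rather than lecture content:

# Supply the relevant context, ask the question, and request step-by-step
# reasoning with supporting examples (chain of thought).
context = (
    "This is the drop in interest rates: ...\n"
    "This is the increase in availability: ..."
)
question = "Based on the content provided, where should I buy a house?"
prompt = (
    f"{context}\n\n"
    f"{question}\n"
    "Provide examples from the content above and reason step by step."
)
print(prompt)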
