W4 GANs Flashcards


1
Q

What is a GAN?

A

Generative adversarial networks (GANs) are models that learn the characteristics of a dataset and try to create new samples that mimic the training dataset.

e.g. AI face generation

2
Q

What are GANs useful for?

A

Generating synthetic data for privacy reasons, and also generating more data if you don't have enough data to train on.

3
Q

How do the discriminator and generator models work in a GAN?

A

The generator tries to trick the discriminator into thinking the images it makes are real, while the discriminator tries to sniff out the fakes.

4
Q

What machine learning technique/network do GANs use?

A

Neural networks!

The hidden layers of the network represent the dimensions of the photo.

5
Q

How are generated samples packaged for the discriminator, and what is the architecture like?

A

They are mixed in with real samples, and the discriminator tries to find the generated (fake) ones.

Architecturally, it is two neural networks competing with each other.
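
A minimal sketch of that two-network setup (PyTorch assumed; the 100-dim noise vector, flattened 28x28 images, and layer sizes are illustrative choices, not details from the slides):

import torch
import torch.nn as nn

latent_dim = 100      # size of the random noise vector z (assumed)
img_dim = 28 * 28     # flattened image size (assumed)

# Generator: maps noise z to a fake image
G = nn.Sequential(
    nn.Linear(latent_dim, 128),
    nn.ReLU(),
    nn.Linear(128, img_dim),
    nn.Tanh(),        # fake image pixels scaled to [-1, 1]
)

# Discriminator: maps an image to the probability that it is real
D = nn.Sequential(
    nn.Linear(img_dim, 128),
    nn.LeakyReLU(0.2),
    nn.Linear(128, 1),
    nn.Sigmoid(),     # probability in (0, 1)
)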

6
Q

What kind of game do the generator and discriminator play?

A

The generator and discriminator play a zero-sum game. The generator tries to create data that fools the discriminator, while the discriminator aims to distinguish real from fake data. The equilibrium is reached when the generator produces data indistinguishable from real data, and the discriminator has a 50% accuracy.

They play a two-player minimax game whose value function (see slide 12) is the sum of two expected log terms: one is the log of D(x) on real data, and the other is the complement log(1 - D(G(z))) on generated data:

min_G max_D V(D, G) = E_x[log D(x)] + E_z[log(1 - D(G(z)))]
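
A small sketch of that value function as code (assuming the PyTorch G and D defined earlier; the function name and the epsilon for numerical stability are my own):

import torch

def value_function(D, G, real_imgs, z):
    # V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))]
    eps = 1e-8                                     # avoid log(0)
    real_term = torch.log(D(real_imgs) + eps).mean()
    fake_term = torch.log(1 - D(G(z)) + eps).mean()
    return real_term + fake_term                   # D tries to maximize this, G tries to minimize it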

7
Q

What does the discriminator's loss function consist of?

A

It consists of two components: an estimate of the probability that the real sample is real, and an estimate of the probability that the generated sample is real.

It is -E1[log D(x)] - E2[log(1 - D(G(z)))]
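
A sketch of that loss written with binary cross-entropy (PyTorch assumed; nn.BCELoss with target 1 for real samples and 0 for fake samples gives exactly these two terms):

import torch
import torch.nn as nn

bce = nn.BCELoss()

def discriminator_loss(D, G, real_imgs, z):
    real_preds = D(real_imgs)                                   # should be near 1
    fake_preds = D(G(z).detach())                               # should be near 0; detach so G is not updated
    real_loss = bce(real_preds, torch.ones_like(real_preds))    # -E1[log D(x)]
    fake_loss = bce(fake_preds, torch.zeros_like(fake_preds))   # -E2[log(1 - D(G(z)))]
    return real_loss + fake_loss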

8
Q

How is error calculated in the discriminator's objective function for the probability that the real sample is real?

A

error = -ln(prediction)

Example: if the label is 1 but the discriminator predicts 0.1, the error is large:
-ln(0.1) ≈ 2.3

But if the prediction is 0.9, the error is small:
-ln(0.9) ≈ 0.1
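
A quick numeric check of those two values (plain Python, just to confirm the arithmetic):

import math

print(-math.log(0.1))   # ~2.30: confident but wrong on a real sample -> large error
print(-math.log(0.9))   # ~0.11: confident and correct -> small error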

9
Q

How is error calculated in the discriminator's objective function for the probability that the generated image is fake?

A

error = -ln(1 - prediction)

Example: if the label is 0 and the discriminator predicts 0.1, the error is small:
-ln(1 - 0.1) ≈ 0.1

But if the prediction is 0.9, the error is large:
-ln(1 - 0.9) ≈ 2.3
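
And the mirror-image check for the fake-sample term (plain Python):

import math

print(-math.log(1 - 0.1))   # ~0.11: prediction near 0 on a fake sample -> small error
print(-math.log(1 - 0.9))   # ~2.30: prediction near 1 on a fake sample -> large error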

10
Q

What are the steps to train a GAN?

A

Step 1: Train the discriminator
1.1 Train the discriminator on real images to predict the label as real (label = 1).
1.2 Train the discriminator on fake images generated by the generator to predict the label as fake (label = 0).

Step 2: Train the generator
2.1 Freeze the discriminator's weights (stop training the discriminator).
2.2 Train the generator to produce fake images that the discriminator classifies as real (label = 1), trying to "fool" the discriminator (see the sketch after these steps).
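
A sketch of one training iteration following these two steps (PyTorch assumed; G, D, latent_dim, and dataloader are placeholders for the networks and data defined elsewhere, and the learning rates are illustrative):

import torch
import torch.nn as nn

bce = nn.BCELoss()
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)

for real_imgs in dataloader:
    batch = real_imgs.size(0)
    z = torch.randn(batch, latent_dim)

    # Step 1: train the discriminator on real (label 1) and fake (label 0) images
    opt_D.zero_grad()
    loss_real = bce(D(real_imgs), torch.ones(batch, 1))
    loss_fake = bce(D(G(z).detach()), torch.zeros(batch, 1))   # detach: don't update G in this step
    (loss_real + loss_fake).backward()
    opt_D.step()

    # Step 2: freeze the discriminator update and train the generator so its fakes are classified as real (label 1)
    opt_G.zero_grad()
    loss_G = bce(D(G(z)), torch.ones(batch, 1))
    loss_G.backward()
    opt_G.step()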

11
Q

What does the loss function of the generator look like?

A

-E[log D(G(z))] (generator)

-E1[log D(x)] - E2[log(1 - D(G(z)))]

The first loss function directly trains the generator to fool the discriminator into classifying generated images as real (maximizing D(G(z))).
The second loss function (the traditional one) balances both the discriminator’s and generator’s objectives, encouraging the generator to generate convincing fake images while ensuring the discriminator can distinguish real from fake.
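
A sketch of the first (generator-only) loss, -E[log D(G(z))], written as BCE against the "real" label so gradients push the generator to fool D (PyTorch assumed; the function name is illustrative):

import torch
import torch.nn as nn

bce = nn.BCELoss()

def generator_loss(D, G, z):
    fake_preds = D(G(z))                                   # no detach here: gradients must flow into G
    return bce(fake_preds, torch.ones_like(fake_preds))    # -E[log D(G(z))]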

12
Q

When is Nash equilibrium reached in a GAN?

A

When the generator makes such good data that the discriminator can't tell the difference anymore.

Thus the discriminator will say there is a 50/50 chance that any given sample is real or fake.

The performance of both players is optimal given the strategy of the other player.
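
A quick numeric check of that 50/50 point (plain Python): if the discriminator outputs 0.5 for every sample, both of its error terms from the earlier cards equal ln 2, so it is doing no better than guessing.

import math

real_term = -math.log(0.5)       # -ln(D(x))        ~0.693
fake_term = -math.log(1 - 0.5)   # -ln(1 - D(G(z))) ~0.693
print(real_term + fake_term)     # ~1.386, the discriminator's loss at equilibrium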
