Generative AI Fundamentals Flashcards

1
Q

Who created the perceptron, and when?

A

Frank Rosenblatt in 1958.

2
Q

What significant AI advancement occurred in the 1980s?

A

Backpropagation was successfully used to recognize handwritten digits and to learn the pronunciation of written words (early in training, the output sounded like babbling).

3
Q

Which event in 1997 demonstrated the power of AI?

A

IBM’s Deep Blue defeated world chess champion Garry Kasparov.

4
Q

What enabled the explosion of deep learning in the 2010s?

A

Developments in image recognition, natural language processing, and machine translation, together with the introduction of the transformer architecture in 2017.

5
Q

What defines the generative AI boom in the 2020s?

A

The ability to generate realistic text and images.

6
Q

What are Large Language Models (LLMs)?

A

AI models designed to understand and generate human language using vast amounts of text data.

7
Q

What are Variational Autoencoders (VAEs) used for?

A

Creating new images by encoding data into latent space and decoding it back.

8
Q

Define latent space.

A

A compressed representation of data that captures its most essential features for reconstruction or generation.
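A toy sketch of the idea: when two coordinates are perfectly correlated, one number suffices to reconstruct both. Real VAEs learn the mapping; the rule y = 2x below is an assumption made up for this example.

```python
# Toy latent-space sketch: compress correlated 2-D points into one number
# (the latent code) and reconstruct them from it. The y = 2x relationship
# is a made-up assumption standing in for learned structure.
def encode(point):            # (x, 2x) -> latent code x
    x, _y = point
    return x

def decode(z):                # latent code -> reconstructed point
    return (z, 2 * z)

original = (3.0, 6.0)
z = encode(original)          # compressed: one number instead of two
print(decode(z))              # → (3.0, 6.0)
```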

9
Q

What is the difference between parameters and hyperparameters?

A

Parameters are learned during training (e.g., weights and biases), while hyperparameters are set before training (e.g., learning rate).
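The distinction can be sketched with a one-weight model: the learning rate is fixed before training starts, while the weight is what training changes. The model y = w * x and the data pair below are assumptions for illustration.

```python
import random

# Hyperparameter: chosen BEFORE training begins.
learning_rate = 0.1

# Parameter: learned DURING training by gradient descent on y = w * x.
w = random.uniform(-1, 1)            # weight, updated by training
for _ in range(200):
    x, y_true = 2.0, 6.0             # single training pair; target is w = 3
    grad = 2 * (w * x - y_true) * x  # d/dw of the squared error
    w -= learning_rate * grad        # update uses the hyperparameter
print(round(w, 2))                   # → 3.0
```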

10
Q

What is autoregressive text generation?

A

A method where the AI predicts the next word in a sentence based on the previous words.
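The loop below sketches the autoregressive idea: each new word is predicted from the words generated so far. The bigram lookup table is a made-up stand-in for a real language model's learned probabilities.

```python
# Toy autoregressive generation: each next word depends only on the
# output so far. The bigram table is invented for illustration.
bigram = {"the": "cat", "cat": "sat", "sat": "on", "on": "the"}

def generate(start, length):
    words = [start]
    for _ in range(length - 1):
        nxt = bigram.get(words[-1])   # predict next word from context
        if nxt is None:
            break
        words.append(nxt)             # feed prediction back as context
    return " ".join(words)

print(generate("the", 5))  # → the cat sat on the
```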

11
Q

How do diffusion models generate images?

A

By starting with random noise and gradually refining it to create a realistic image.
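A minimal sketch of the reverse (denoising) process: start from pure noise and refine it a little at each step. A real diffusion model learns its denoiser; here a hand-written rule that nudges values toward a fixed "clean" target plays that role, which is an assumption made purely for illustration.

```python
import random

# Toy reverse diffusion: begin with random noise, refine gradually.
target = [0.2, 0.8, 0.5]                    # stand-in for a real image
x = [random.gauss(0, 1) for _ in target]    # step 0: pure noise

for step in range(50):                      # gradual refinement
    # Each step removes a fraction of the remaining "noise".
    x = [xi + 0.2 * (ti - xi) for xi, ti in zip(x, target)]

print([round(v, 2) for v in x])             # → [0.2, 0.8, 0.5]
```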

12
Q

Which generative AI model uses latent space decoding?

A

Variational Autoencoders (VAEs).

13
Q

What are Generative Adversarial Networks (GANs)?

A

Models with two neural networks (generator and discriminator) that train competitively to produce realistic data.
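The two-player structure can be sketched with scalar stand-ins for the networks. Everything below is an assumption for brevity: the "generator" is one multiplication, the "discriminator" is a fixed scoring function, and the update is simple hill-climbing rather than the real alternating gradient game.

```python
# Structural GAN sketch: a generator adjusts itself so the
# discriminator scores its fakes as more "real".
real_mean = 4.0                            # "real" data lives near 4

def generator(z, w):                       # maps noise z to a fake sample
    return w * z

def discriminator(x):                      # scores how "real" x looks
    return 1.0 / (1.0 + (x - real_mean) ** 2)

w = 0.5
for _ in range(1000):
    z = 1.0                                # fixed noise for determinism
    # Keep the change only if it fools the discriminator better.
    if discriminator(generator(z, w + 0.01)) > discriminator(generator(z, w)):
        w += 0.01
print(round(generator(1.0, w), 1))         # → 4.0
```

In a real GAN the discriminator is also trained, pushing back each round, which is what makes the setup adversarial.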

14
Q

Who introduced GANs, and when?

A

Ian Goodfellow in 2014.

15
Q

What are RNNs best suited for?

A

Generating sequential data like text and audio.

16
Q

How do RNNs process data?

A

Step-by-step, updating a hidden state with each input.
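The update can be sketched with scalars (real RNNs use vectors and weight matrices; the constants below are made up for the example):

```python
import math

# One RNN step: the new hidden state mixes the current input with the
# previous hidden state, so steps must run in order.
W_in, W_hid, bias = 0.5, 0.8, 0.1          # made-up weights

def rnn_step(x, h):
    return math.tanh(W_in * x + W_hid * h + bias)

h = 0.0                         # initial hidden state
for x in [1.0, 0.5, -0.3]:      # process the sequence step by step
    h = rnn_step(x, h)          # h carries context forward
print(round(h, 3))
```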

17
Q

Why are transformer-based models dominant in text generation?

A

They process sequences in parallel, allowing efficient and scalable training.

18
Q

What distinguishes transformers from RNNs?

A

Transformers process sequences in parallel, whereas RNNs process them step-by-step.
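The contrast can be sketched in a few lines (toy arithmetic, not real architectures): the RNN-style pass must run in order because each step needs the previous hidden state, while the transformer-style pass computes every position from the full input at once, so positions could be processed in parallel. The uniform average below is an assumed stand-in for attention.

```python
seq = [1.0, 2.0, 3.0, 4.0]

# Sequential (RNN-style): step t cannot start before step t-1 finishes.
h = 0.0
rnn_out = []
for x in seq:
    h = 0.5 * h + x                 # depends on the previous h
    rnn_out.append(h)

# Parallel-friendly (transformer-style): each position depends only on
# the whole input, never on another position's output.
avg = sum(seq) / len(seq)           # stand-in for attention over seq
transformer_out = [x + avg for x in seq]

print(rnn_out)          # → [1.0, 2.5, 4.25, 6.125]
print(transformer_out)  # → [3.5, 4.5, 5.5, 6.5]
```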

19
Q

What are the key features of GANs, RNNs, and transformers?

A

GANs focus on realistic data creation, RNNs handle sequential data, and transformers excel in parallel processing for text tasks.

20
Q

What major dataset was released in the 2000s to support deep learning?

A

ImageNet, a large labeled image dataset.
