Generative AI Fundamentals Flashcards
Who created the perceptron, and when?
Frank Rosenblatt in 1958.
What significant AI advancement occurred in the 1980s?
Backpropagation was successfully applied to tasks such as recognizing handwritten digits and learning to pronounce words.
Which event in 1997 demonstrated the power of AI?
IBM’s Deep Blue defeated world chess champion Garry Kasparov.
What enabled the explosion of deep learning in the 2010s?
Advances in image recognition, natural language processing, and machine translation, along with the introduction of the transformer architecture in 2017.
What defines the generative AI boom in the 2020s?
The ability to generate realistic text and images.
What are Large Language Models (LLMs)?
AI models designed to understand and generate human language using vast amounts of text data.
What are Variational Autoencoders (VAEs) used for?
Creating new images by encoding data into latent space and decoding it back.
Define latent space.
A compressed representation of data that captures its most essential features for reconstruction or generation.
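A minimal sketch of the idea (a toy example, not a real encoder): for points that all lie on the line y = 2x, a single latent number is enough to reconstruct the full point, so the one-dimensional latent space captures everything essential about the data.

```python
# Toy latent space: every data point lies on the line y = 2x,
# so one number (the latent code) fully describes a 2-D point.

def encode(point):
    # Compress a 2-D point to a 1-D latent code.
    x, y = point
    return x

def decode(z):
    # Reconstruct the full 2-D point from the latent code.
    return (z, 2 * z)

original = (3.0, 6.0)
reconstructed = decode(encode(original))  # round-trips through latent space
```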
What is the difference between parameters and hyperparameters?
Parameters are learned during training (e.g., weights and biases), while hyperparameters are set before training (e.g., learning rate).
What is autoregressive text generation?
A method in which the model generates text one token at a time, predicting each next word from the words that came before it.
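The loop can be sketched with a toy stand-in for the model (a hypothetical bigram lookup table; a real LLM would output a probability distribution over the whole vocabulary):

```python
# Toy autoregressive generation: the "model" is a bigram table that
# maps the previous word to the most likely next word.

bigram_next = {
    "<s>": "the",   # <s> marks the start of the sequence
    "the": "cat",
    "cat": "sat",
    "sat": "<e>",   # <e> marks the end of the sequence
}

def generate(max_len=10):
    tokens = ["<s>"]
    # Each new token is predicted from the token that precedes it.
    while tokens[-1] != "<e>" and len(tokens) < max_len:
        tokens.append(bigram_next[tokens[-1]])
    return tokens[1:-1]  # drop the start/end markers

sentence = " ".join(generate())  # "the cat sat"
```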
How do diffusion models generate images?
By starting with random noise and gradually refining it to create a realistic image.
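A one-dimensional caricature of the reverse process (purely illustrative: a real diffusion model learns to predict the noise to remove at each step, whereas this "denoiser" simply nudges the sample toward a known target):

```python
import random

# Start from pure noise and take many small denoising steps.
def denoise_step(x, target, strength=0.1):
    return x + strength * (target - x)  # one small refinement

random.seed(0)
x = random.gauss(0.0, 1.0)   # begin with random noise
for _ in range(100):         # gradually refine toward the "image"
    x = denoise_step(x, target=5.0)
```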
Which generative AI model uses latent space decoding?
Variational Autoencoders (VAEs).
What are Generative Adversarial Networks (GANs)?
Models with two neural networks (generator and discriminator) that train competitively to produce realistic data.
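The competitive dynamic can be caricatured in one dimension (a hypothetical toy, with the networks replaced by single numbers): the discriminator picks the best threshold separating real from fake samples, and the generator shifts its output across that threshold, which drives the fake toward the real data.

```python
# Toy adversarial loop: real data sit at 4.0; the generator's entire
# "network" is one number (its output sample).

REAL = 4.0

def discriminator_threshold(fake):
    # Best separating boundary between the real and fake samples.
    return (REAL + fake) / 2.0

def generator_step(fake, boundary, lr=0.2):
    # Move the fake sample toward the side the discriminator calls real.
    direction = 1.0 if REAL > boundary else -1.0
    return fake + lr * direction

fake = 0.0
for _ in range(100):
    boundary = discriminator_threshold(fake)  # discriminator adapts
    fake = generator_step(fake, boundary)     # generator counters
# fake ends up oscillating tightly around the real data at 4.0
```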
Who introduced GANs, and when?
Ian Goodfellow in 2014.
What are RNNs best suited for?
Generating sequential data like text and audio.