Domain 2: Gen AI Fundamentals (24%) Flashcards

1
Q

____ is a subset of deep learning. Like deep learning, this is a multipurpose technology that helps to generate new original content rather than finding or classifying existing content.

A

Generative AI

2
Q

_____ look for statistical patterns in modalities such as natural language and images.

A

Gen AI foundation models

3
Q

_____ are very large and complex neural network models with billions of parameters that are learned during the training phase or pre-training.

A

Gen AI foundation models

4
Q

The more parameters a model has, the more _____ it has, so the model can perform more advanced tasks.

A

memory

5
Q

Gen AI models are built with _____, _____, _____, and _____ all working together.

A

neural networks, system resources, data, and prompts

6
Q

The current core element of generative AI is the _____.

A

transformer network

7
Q

_____ are pre-trained on massive amounts of text data from the internet, and they use this pre-training process to build up a broad knowledge base.

A

Large Language Models (LLMs)

8
Q

A _____ is a natural language text that requests the generative AI to perform a specific task.

A

prompt

9
Q

The process of reducing the size of one model (known as the teacher) into a smaller model (known as the student) that emulates the original model’s predictions as faithfully as possible.

A

distillation

10
Q

A prompt that contains more than one example demonstrating how the large language model should respond.

A

few-shot prompting
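
As an illustration only (the task and review texts below are hypothetical, not from the exam material), a few-shot prompt might look like this Python sketch:

```python
# Illustrative few-shot prompt: two worked examples show the model the
# expected input/output pattern before the real query.
few_shot_prompt = """Classify the sentiment of each review as Positive or Negative.

Review: "The battery lasts all day and the screen is gorgeous."
Sentiment: Positive

Review: "It stopped working after a week and support never replied."
Sentiment: Negative

Review: "Setup was painless and it just works."
Sentiment:"""

print(few_shot_prompt)
```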

11
Q

A second, task-specific training pass performed on a pre-trained model to refine its parameters for a specific use case.

A

fine tuning

12
Q

A form of fine-tuning that improves a generative AI model’s ability to follow instructions, which involves training a model on a series of instruction prompts, typically covering a wide variety of tasks. The resulting model generates useful responses to zero-shot prompts across a variety of tasks.

A

instruction tuning

13
Q

An algorithm for performing parameter efficient tuning that fine-tunes only a subset of a large language model’s parameters.

A

Low-Rank Adaptation (LoRA)
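
A minimal NumPy sketch of the low-rank idea (layer sizes and rank are assumed toy values): the pre-trained weight matrix W stays frozen, and only the two small matrices A and B are trained, so far fewer parameters are updated.

```python
import numpy as np

d, k, r = 512, 512, 8                 # toy layer dimensions and low rank (assumed)
W = np.random.randn(d, k)             # frozen pre-trained weight matrix
A = np.random.randn(r, k) * 0.01      # small trainable matrix
B = np.zeros((d, r))                  # small trainable matrix, initialized to zero

W_eff = W + B @ A                     # effective weight used during fine-tuning

print("trainable params:", A.size + B.size, "vs full fine-tuning:", W.size)
```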

14
Q

A system that picks the ideal model for a specific inference query.

A

model cascading

15
Q

The algorithm that determines the ideal model for inference in model cascading. It is typically a machine learning model that gradually learns how to pick the best model for a given input, and it could sometimes be a simpler, non-machine learning algorithm.

A

model router
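
A toy sketch of a model router using a simple non-ML heuristic (the card above notes a router can sometimes be a simpler, non-machine-learning algorithm); the model names are hypothetical:

```python
def route_model(prompt: str) -> str:
    """Pick a model for an inference query (hypothetical model names)."""
    looks_hard = len(prompt.split()) > 100 or "def " in prompt
    return "large-model-v2" if looks_hard else "small-model-v1"

print(route_model("What is the capital of France?"))                 # small-model-v1
print(route_model("Explain this function: def f(x): return x * x"))  # large-model-v2
```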

16
Q

A prompt that contains exactly one example demonstrating how the large language model should respond.

A

one-shot prompting

17
Q

A set of techniques to fine-tune a large pre-trained language model (PLM) more efficiently than full fine-tuning. It typically fine-tunes far fewer parameters than full fine-tuning, yet generally produces a large language model that performs as well (or almost as well) as a large language model built from full fine-tuning.

A

parameter-efficient tuning

18
Q

Models or model components (such as an embedding vector) that have already been trained. Sometimes, you’ll feed pre-trained embedding vectors into a neural network. Other times, your model will train the embedding vectors themselves rather than rely on the pre-trained embeddings.

A

pre-trained model

19
Q

The initial training of a model on a large dataset. Some models are clumsy giants and must typically be refined through additional training.

A

pre-training

20
Q

Any text entered as input to a large language model to condition the model to behave in a certain way. These can be as short as a phrase or arbitrarily long (for example, the entire text of a novel).

A

prompt

21
Q

A capability of certain models that enables them to adapt their behavior in response to arbitrary text input (prompts). In this paradigm, a large language model responds to a prompt by generating text.

A

prompt-based learning

22
Q

The art of creating prompts that elicit the desired responses from a large language model. Humans perform this work; writing well-structured prompts is an essential part of ensuring useful responses from a large language model.

A

prompt engineering

23
Q

Using feedback from human raters to improve the quality of a model’s responses. The system can then adjust its future responses based on that feedback.

A

Reinforcement Learning from Human Feedback (RLHF)

24
Q

An optional part of a prompt that identifies a target audience for a generative AI model’s response. Without it, a large language model provides an answer that may or may not be useful for the person asking the question. With it, a large language model can answer in a way that’s more appropriate and more helpful for a specific target audience.

A

role prompting
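
An illustrative role prompt (the wording is hypothetical); the first line identifies the target audience so the answer is pitched appropriately:

```python
role_prompt = (
    "You are a patient teacher explaining ideas to a 10-year-old.\n"  # role / audience
    "Explain what a neural network is in two sentences."              # the actual task
)
print(role_prompt)
```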

25
Q

A technique for tuning a large language model for a particular task without resource-intensive fine-tuning. Instead of retraining all the weights in the model, this automatically adjusts a prompt to achieve the same goal.

A

soft prompt tuning

26
Q

A hyperparameter that controls the degree of randomness of a model’s output. Higher values result in more random output, while lower values result in less random output.

A

temperature
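
A small worked sketch, assuming NumPy and made-up next-token logits, showing how temperature rescales logits before the softmax: higher values flatten the distribution (more random sampling), lower values sharpen it.

```python
import numpy as np

def softmax_with_temperature(logits, temperature):
    """Convert raw logits into sampling probabilities at a given temperature."""
    scaled = np.array(logits) / temperature
    exp = np.exp(scaled - np.max(scaled))     # subtract the max for numerical stability
    return exp / exp.sum()

logits = [2.0, 1.0, 0.1]                      # made-up scores for three candidate tokens
print(softmax_with_temperature(logits, 0.5))  # sharper: top token dominates
print(softmax_with_temperature(logits, 1.0))  # baseline distribution
print(softmax_with_temperature(logits, 2.0))  # flatter: more random output
```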

27
Q

A prompt that does not provide an example of how you want the large language model to respond.

A

zero-shot prompting

28
Q

The process that a trained machine learning model uses to draw conclusions from brand-new data.

A

inference

29
Q

The output text that a model generates in response to a prompt, which fulfills the purpose of that prompt.

A

completion

30
Q

The amount of textual information that the AI can take into account at any given time when processing language.

A

context window

31
Q

The smallest units of text that an AI model processes.

A

tokens

32
Q

This breaks down text into tokens to make it manageable for computational models.

A

tokenizer
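
A minimal sketch of tokenization and a context window. A naive word/punctuation split is assumed here purely for illustration; real LLM tokenizers use subword schemes such as byte-pair encoding.

```python
import re

def tokenize(text: str) -> list[str]:
    """Naive tokenizer: split text into words and punctuation marks."""
    return re.findall(r"\w+|[^\w\s]", text)

CONTEXT_WINDOW = 8  # assumed tiny window for illustration

tokens = tokenize("Tokens are the smallest units of text that an AI model processes.")
print(tokens)
print(len(tokens), "tokens total")
print(tokens[:CONTEXT_WINDOW], "fit in the context window")
```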

33
Q

Providing examples inside the context window is called _____. With this, you can help LLMs learn more about the task you are asking them to perform by including examples or additional data in the prompt.

A

in-context learning

34
Q

The input that you send into your generative model is called the _____, which consists of instructions and content.

A

prompt

35
Q

T/F: The larger a model is, the more likely it is to work without additional in-context learning or further training. Because the model’s capability increases with size, it has supported the development of larger and larger models.

A

True

36
Q

LLMs encode a deep statistical representation of a language. This understanding is developed during the _____ phase when the model learns from vast amounts of unstructured data.

A

pre-training

37
Q

During _____, the model weights get updated to minimize the loss of the training objective, and the encoder generates an embedding or vector representation for each token.

A

pre-training

38
Q

Why do you need to process training data collected from public sites?

A

To increase quality, address bias, and remove other harmful content.

39
Q

_____ work with one data modality. LLMs are an example of this type of generative AI because the input and the output, or completion, are text.

A

Unimodal models

40
Q

_____ means adding another modality, such as image, video, or audio. These models can understand diverse data sources and can provide more robust forecasts.

A

Multimodal

41
Q

These models can process and generate multiple types of data, and they can also perform these operations in combination with each other. This collaborative capability adds cross-modal reasoning, translation, search, and creation that more closely mirrors human intelligence.

A

Multimodal models

42
Q

_____ use cases include marketing, image captioning, product design, customer service, chatbots, and avatars.

A

Multimodal generative AI

43
Q
  1. image captioning, where the model is generating text descriptions of images
  2. visual question answering, where the model answers questions about image content
  3. text to image synthesis, which is generating images from textual descriptions.
A

examples of multimodal tasks

44
Q

T/F: Models such as DALL-E, Stable Diffusion, and Midjourney can create realistic and diverse images from natural language prompts. There are also diffusion models, which support a variety of tasks for multimodal models such as image generation, upscaling, and inpainting.

A

True

45
Q

_____ are a class of generative models that learn to reverse a gradual noising process. _____-based architectures offer a higher degree of control over the quality and diversity of generated images.

A

Diffusion models

46
Q

The _____ process begins by sampling from a basic, usually Gaussian, distribution. This initial simple sample undergoes a series of reversible, incremental modifications, where each step introduces a controlled amount of complexity through a Markov chain.

A

forward diffusion

47
Q

The process of training a neural network to recover the original data by reversing the noising process. Denoising an image to reconstruct the original image.

A

reverse diffusion
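
A toy NumPy sketch of the noising (forward) side of diffusion, using a made-up 1-D signal and an assumed fixed noise schedule: each Markov-chain step mixes in a controlled amount of Gaussian noise, and reverse diffusion trains a network to undo these steps.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 8)          # toy "image" represented as a 1-D signal
betas = np.full(10, 0.1)               # assumed fixed noise schedule

# Forward diffusion: a Markov chain that adds a little Gaussian noise at each step.
samples = [x]
for beta in betas:
    x = np.sqrt(1.0 - beta) * x + np.sqrt(beta) * rng.normal(size=x.shape)
    samples.append(x)

print("step 0 :", np.round(samples[0], 2))
print("step 10:", np.round(samples[-1], 2))  # close to pure Gaussian noise
```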

48
Q

_____ is a deep learning model used for converting text to images. It can generate high-quality, photo-realistic images. Instead of using the pixel space of the image, it uses a reduced-definition latent space.

A

Stable Diffusion

49
Q

T/F: Diffusion models tend to produce higher quality outputs with more diversity and consistency, and they’re more stable and easier to train.

A

True

50
Q

Stable Diffusion for image generation, Whisper for speech recognition and translation, and AudioLM for audio generation.

A

Examples of diffusion models

51
Q

_____ are a type of generative AI that can be applied to many different problem domains or tasks without being fine-tuned.

A

Large language models, LLMs

52
Q
  1. Text generation
  2. Text summarization
  3. Info extraction
  4. Question answering
  5. Classification
  6. Identifying harmful content
  7. Chatbots
  8. Dev tool for source code generation
A

Use cases for Gen AI

53
Q

_____ and _____ offer pre-trained models for text, image, and audio generation that can be fine-tuned for specific use cases.

A

Amazon Bedrock and Amazon Titan

54
Q

_____ and _____, formerly known as Amazon CodeWhisperer, support code generation and completion.

A

SageMaker and Amazon Q Developer

55
Q

_____ and _____ offer virtual production and 3D content creation.

A

Amazon Nimble Studio and Amazon Sumerian

56
Q

_____ is an application of generative AI that accelerates software development. Models can generate functional code snippets and even entire programs from natural language descriptions or examples, helping to automate routine programming tasks, suggest code completions, and even translate code between different languages.

A

Code generation

57
Q

_____ generates real-time code suggestions that range from snippets to full functions, based on your comments and existing code. AWS handles the underlying infrastructure, data management, model training, and inference, so you can focus on your specific use cases and applications.

A

Amazon Q Developer

58
Q

A _____ is a deep learning architecture. It trains two neural networks to compete against each other to generate more authentic new data from a given training dataset. It is called adversarial because it trains two different networks and pits them against each other. One network generates new data by taking an input data sample and modifying it as much as possible. The other network tries to predict whether the generated data output belongs in the original dataset.

A

generative adversarial network (GAN)

59
Q

A _____ is a type of neural network that learns to reproduce its input and also maps data to a latent space.

A

variational autoencoder (VAE)

60
Q

_____ are a type of neural network architecture that transforms or changes an input sequence into an output sequence. They do this by learning context and tracking relationships between sequence components.

A

Transformers
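
A compact NumPy sketch of scaled dot-product attention, the core transformer operation, with random toy matrices: every position's output is a weighted mix of all positions, which is how the model learns context and tracks relationships across the sequence.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core transformer operation: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                          # similarity of every token pair
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                                       # context-aware representations

rng = np.random.default_rng(0)
seq_len, d_k = 4, 8                                          # toy sequence of 4 tokens
Q, K, V = (rng.normal(size=(seq_len, d_k)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)           # (4, 8)
```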

61
Q

The _____ stage of the AI cycle is to define objectives, collect data, process data, select your model, and train and develop your model.

A

first

62
Q

The _____ stage of the AI cycle is to develop your model by using feature engineering, building, testing, validating, optimizing, and scaling.

A

second

63
Q

The _____ stage of the AI cycle is to deploy and maintain, which includes model evaluation, deployment, feedback, updates, security, and scalability.

A

third

64
Q

Foundation model lifecycle

A
  1. data selection
  2. model selection
  3. pre-training
  4. fine-tuning
  5. evaluation
  6. deployment
  7. feedback
65
Q
  1. scope
  2. train model or work with existing
  3. assess model’s performance and additional training
  4. reinforcement learning from human feedback
  5. evaluation (metrics)
  6. prompt engineering then fine tune
  7. revisit and evaluate
  8. deploy
  9. consider needed infrastructure
A

Foundation model lifecycle

66
Q

What are the advantages of Gen AI?

A

adaptability, responsiveness, and simplicity

67
Q

_____ are incorrect or misleading results that AI models generate. These errors can be caused by a variety of factors, including insufficient training data, incorrect assumptions made by the model, or biases in the data used to train the model.

A

AI hallucinations

68
Q

T/F: The higher the interpretability of a machine learning model, the easier it is to comprehend the model’s predictions.

A

True

69
Q

_____ can be applied to interpret models that have low complexity or simple relationships between the input variables and the predictions. The simple relationships that give high model interpretability can also lead to lower performance.

A

Intrinsic analysis

70
Q

Cross-domain performance, efficiency, conversion rate, average revenue per user, accuracy, customer lifetime value

A

business metrics for Gen AI

71
Q

_____ are trained on huge, unlabeled, broad datasets and they underpin the capabilities of generative AI. As a result, they are considerably larger than traditional machine learning models, which are generally used for more specific functions. They are used as the baseline starting point for developing and creating models, and they can be used to interpret and understand language, have conversational messaging, and create and generate images.

A

Foundation models

72
Q

AWS provides Amazon’s business metric analysis ML solution which uses _____ and _____.

A

Amazon Lookout for Metrics and Amazon Forecast

73
Q

FM output quality metrics are:

A

relevance, accuracy, coherence, and appropriateness

74
Q

Task completion rates and reductions in manual effort that make a direct contribution to operational productivity

A

Efficiency metrics

75
Q

Accuracy and credibility metrics:

A

low error rate

76
Q

Accessibility, lower barrier to entry, efficiency, cost-effectiveness, speed to market, and ability to meet business objectives.

A

advantages of using AWS generative AI services to build applications

77
Q

The process of fine-tuning and training a pre-trained model on a new dataset without training from scratch is also known as _____.

A

transfer learning

78
Q

T/F: You can accelerate learning for some types of models by using transfer learning, where you use a pre-trained model as your starting point when training on a new dataset.

A

True

79
Q

_____ helps you find projects that are already built for quick starts, with datasets, models, algorithm types, and solutions based on industry best practices.

A

SageMaker JumpStart

80
Q

_____ has specialized hardware and associated firmware that are designed to enforce security restrictions. These restrictions ensure that no one can access your workloads or data running on Amazon Elastic Compute Cloud (Amazon EC2) instances. This protection applies to all Nitro-based applications and instances.

A

The AWS Nitro System

81
Q

At AWS, securing AI infrastructure means _____, such as AI model weights and data processed with those models, by any unauthorized person.

A

zero access to sensitive AI data

82
Q

3 layers of Gen AI stack

A

bottom: tools for building/training LLMs and FMs
middle: access to all the models + tools to build/scale apps
top: apps that use LLMs and FMs to write/debug code, generate content, derive insights, and take actions.

83
Q

What are the three critical components of any AI system?

A

Input, model, and output.

84
Q

Pricing models for LLMs:

A
  1. host LLMs on your own infrastructure
  2. pay by token
85
Q

_____ are the units that vendors use to price calls in their APIs.

A

Tokens
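
A worked pay-by-token arithmetic sketch; the per-1,000-token prices below are placeholders assumed for illustration, not real vendor pricing.

```python
# Placeholder prices, expressed in USD per 1,000 tokens (assumed values).
PRICE_PER_1K_INPUT = 0.0005
PRICE_PER_1K_OUTPUT = 0.0015

input_tokens = 1200    # tokens in the prompt
output_tokens = 300    # tokens in the completion

cost = (input_tokens / 1000) * PRICE_PER_1K_INPUT + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT
print(f"Estimated cost for this call: ${cost:.6f}")
```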

86
Q

What’s the benefit of using pay-by-token?

A

scalability

87
Q

Why does AWS Global Infrastructure add high availability and fault tolerance?

A

It is composed of multiple physically separated and isolated Availability Zones (AZs) within each Region, so AWS ensures redundancy and fault tolerance, safeguarding against service disruptions.

88
Q

_____ is a model hub that helps you quickly deploy foundation models that are available within the service and integrate them into your applications. It supports fine-tuning and deploying models, and it also helps you move quickly into production to operate at scale.

A

SageMaker JumpStart

89
Q

_____ is a managed AWS service that lets you use and access a number of different foundation models (FMs) using APIs. It adds the capability to import custom weights for supported model architectures and to serve the custom model by using on-demand mode. It lets you experiment by running model inference against different base foundation models supported within the service, to help you align your use cases with the highest accuracy.

A

Amazon Bedrock
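
A minimal boto3 sketch of invoking a model through the Bedrock runtime API. The request and response payload shapes vary by model family; this example assumes the Amazon Titan text format, a Region where the model is available, and that AWS credentials and model access are already configured.

```python
import json
import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "inputText": "Summarize what a foundation model is in one sentence.",
    "textGenerationConfig": {"maxTokenCount": 200, "temperature": 0.5},
}

response = bedrock_runtime.invoke_model(
    modelId="amazon.titan-text-express-v1",   # one of the Amazon Titan text models
    body=json.dumps(body),
    contentType="application/json",
    accept="application/json",
)

result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```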

90
Q

The _____ foundation model provided by Amazon acts as a general-purpose foundation model and is a great option if you require text generation capabilities.

A

Amazon Titan

91
Q

_____ is a playground built on Amazon Bedrock. You can use it to build generative AI applications to learn fundamental techniques, such as understanding how a foundation model responds to different prompts. Some examples of applications that you can build with this are creating playlists, trivia games, recipes, and more.

A

PartyRock

92
Q

Remember that with generative AI, you can use _____, and that data is stored as embeddings. These embeddings are vectors that can be compressed, stored and indexed for advanced searches.

A

vector databases
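
A tiny sketch of the idea behind vector search, assuming NumPy and made-up 4-dimensional embeddings (real embedding models produce hundreds or thousands of dimensions): stored vectors are compared to the query vector by cosine similarity.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity of two embedding vectors, ranging roughly from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Made-up embeddings; a real system would get these from an embedding model.
docs = {
    "refund policy": np.array([0.9, 0.1, 0.0, 0.2]),
    "shipping times": np.array([0.1, 0.8, 0.3, 0.0]),
}
query = np.array([0.85, 0.15, 0.05, 0.1])

best = max(docs, key=lambda name: cosine_similarity(query, docs[name]))
print("Most similar document:", best)
```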
