Answers (aws-certified-ai-practitioner, Topic 1, 11-03-2025, Pages 1-10) Flashcards

1
Q

A company wants to develop a large language model (LLM) application by using Amazon Bedrock and customer data that is uploaded to Amazon S3. The company’s security policy states that each team can access data for only the team’s own customers. Which solution will meet these requirements?

A

Create an Amazon Bedrock custom service role for each team that has access to only the team’s customer data.

Answer: D. Justification: Creating a separate Amazon Bedrock custom service role for each team, scoped to only that team's customer data in Amazon S3, enforces the security policy that each team can access only its own customers' data.
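The per-team scoping can be sketched as an IAM policy document attached to each team's Bedrock service role. This is a minimal sketch assuming a layout where each team's customer data lives under its own S3 prefix; the bucket name and prefix convention are hypothetical.

```python
def team_s3_policy(bucket: str, team_prefix: str) -> dict:
    """Build an IAM policy document for one team's Bedrock service role.

    The role may list only the team's prefix and read only objects under
    it, so each team sees only its own customers' data. The bucket name
    and prefix layout are assumptions for illustration.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {   # listing the bucket is allowed, but only for this team's prefix
                "Effect": "Allow",
                "Action": "s3:ListBucket",
                "Resource": f"arn:aws:s3:::{bucket}",
                "Condition": {"StringLike": {"s3:prefix": f"{team_prefix}/*"}},
            },
            {   # object reads are limited to the same prefix
                "Effect": "Allow",
                "Action": "s3:GetObject",
                "Resource": f"arn:aws:s3:::{bucket}/{team_prefix}/*",
            },
        ],
    }
```

Splitting `ListBucket` (bucket ARN plus `s3:prefix` condition) from `GetObject` (object ARN) matters: the `s3:prefix` condition key only exists on list requests, so folding both actions into one conditioned statement would deny the object reads.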

2
Q

An AI company periodically evaluates its systems and processes with the help of independent software vendors (ISVs). The company needs to receive email message notifications when an ISV’s compliance reports become available. Which AWS service can the company use to meet this requirement?

A

AWS Artifact

Answer: B. Justification: AWS Artifact provides on-demand access to AWS and ISV compliance reports and can send email notifications when new reports become available.

3
Q

Which AWS service or feature can help an AI development team quickly deploy and consume a foundation model (FM) within the team’s VPC?

A

Amazon SageMaker JumpStart

Answer: B. Justification: Amazon SageMaker JumpStart provides pre-built solutions and models, including foundation models, that can be quickly deployed within a team’s VPC.

4
Q

A company is using an Amazon Bedrock base model to summarize documents for an internal use case. The company trained a custom model to improve the summarization quality. Which action must the company take to use the custom model through Amazon Bedrock?

A

Purchase Provisioned Throughput for the custom model.

Answer: D. Justification: Amazon Bedrock does not offer on-demand inference for custom models, so the company must purchase Provisioned Throughput for the custom model before it can invoke it.

5
Q

A company wants to use a large language model (LLM) on Amazon Bedrock for sentiment analysis. The company wants to classify the sentiment of text passages as positive or negative. Which prompt engineering strategy meets these requirements?

A

Provide examples of text passages with corresponding positive or negative labels in the prompt followed by the new text passage to be classified.

Answer: A. Justification: This is few-shot prompting: labeled examples in the prompt show the model the classification task, so it can assign a positive or negative label to the new passage.
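The strategy can be sketched as a small prompt builder; the example passages and labels below are invented for illustration, and the resulting string is what would go into the body of a Bedrock model invocation.

```python
def few_shot_sentiment_prompt(examples, new_text):
    """Assemble a few-shot classification prompt: labeled examples
    first, then the unlabeled passage for the model to complete."""
    blocks = [f"Text: {text}\nSentiment: {label}" for text, label in examples]
    # The trailing "Sentiment:" cues the model to emit only the label.
    blocks.append(f"Text: {new_text}\nSentiment:")
    return "\n\n".join(blocks)
```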

6
Q

A company uses Amazon SageMaker for its ML pipeline in a production environment. The company has large input data sizes up to 1 GB and processing times up to 1 hour. The company needs near real-time latency. Which SageMaker inference option meets these requirements?

A

Asynchronous inference

Answer: A. Justification: Asynchronous inference queues requests, supports large payloads (up to 1 GB) and long processing times (up to one hour), and delivers results as soon as processing completes, which fits the company's payload size, processing time, and near real-time latency needs.
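A minimal sketch of the invocation side: asynchronous endpoints take the payload from S3 rather than inline, via the SageMaker Runtime call `invoke_endpoint_async`. The endpoint name and S3 URI are placeholders.

```python
def async_invoke_params(endpoint_name: str, input_s3_uri: str) -> dict:
    """Parameters for sagemaker-runtime invoke_endpoint_async.

    The payload is staged in S3 (up to 1 GB) instead of being sent in the
    request body, and the endpoint writes its result back to S3 once
    processing (up to an hour) finishes. Names here are placeholders.
    """
    return {
        "EndpointName": endpoint_name,
        "InputLocation": input_s3_uri,     # S3 URI of the request payload
        "ContentType": "application/json",
        "InvocationTimeoutSeconds": 3600,  # allow up to 1 hour of processing
    }

# The dict would be passed to the runtime client, e.g.:
#   boto3.client("sagemaker-runtime").invoke_endpoint_async(**params)
```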

7
Q

A law firm wants to build an AI application by using large language models (LLMs). The application will read legal documents and extract key points from the documents. Which solution meets these requirements?

A

Develop a summarization chatbot.

Answer: C. Justification: A summarization chatbot built on an LLM can read legal documents and extract their key points, which matches the law firm's requirement.
