Study at your own risk! Flashcards
These questions are from an unverified question dump. Answers may not always be correct. The purpose of these questions is to get used to the style of the questions seen on the test and to THINK CRITICALLY about the questions asked! If an answer feels wrong, look it up!
A company is developing a web application on AWS using containers. At any given time, the company needs three instances of the web application to be running. The application must be able to scale to keep up with increases in demand. Although management is cost-conscious, they agree that the application should be highly available.
What recommendations should a solutions architect make?
A. Create an Amazon Elastic Container Service (Amazon ECS) cluster using the Amazon EC2 launch type with three container instances in one Availability Zone. Create a task definition for the web application. Place one task for each container instance.
B. Create an Amazon Elastic Container Service (Amazon ECS) cluster using the Amazon EC2 launch type with one container instance in two different Availability Zones. Create a task definition for the web application. Place two tasks on one container instance and one task on the remaining container instance.
C. Create an Amazon Elastic Container Service (Amazon ECS) cluster using the Fargate launch type with one container instance in three different Availability Zones. Create a task definition for the web application. Create an ECS service with a desired count of three tasks.
D. Create an Amazon Elastic Container Service (Amazon ECS) cluster using the Fargate launch type. Create a task definition for the web application. Create an ECS service with a desired count of three tasks.
D. Create an Amazon Elastic Container Service (Amazon ECS) cluster using the Fargate launch type. Create a task definition for the web application. Create an ECS service with a desired count of three tasks.
A company outsources its marketplace analytics management to a third-party partner. The vendor requires limited programmatic access to resources in the company's account. All the necessary policies have been created to grant appropriate access.
Which new component provides the vendor the MOST SECURE access to the account?
A. Stop the instance outside the application’s availability window. Start up the instance again when required.
B. Hibernate the instance outside the application’s availability window. Start up the instance again when required.
C. Use Auto Scaling to scale down the instance outside the application’s availability window. Scale up the instance when required.
D. Terminate the instance outside the application’s availability window. Launch the instance by using a preconfigured Amazon Machine Image (AMI) when required.
B. Hibernate the instance outside the application’s availability window. Start up the instance again when required.
A company wants to migrate its accounting system from an on-premises data center to an AWS Region. Data security and an immutable audit log are the top priorities, and all AWS activities must be subject to compliance auditing. The company has already enabled AWS CloudTrail but wants to make sure it meets these requirements.
What safeguards and security controls should a solutions architect implement to protect and secure CloudTrail? (Choose two.)
A. Enable CloudTrail log file validation.
B. Install the CloudTrail Processing Library.
C. Enable logging of Insights events in CloudTrail.
D. Enable custom logging from the on-premises resources.
E. Create an AWS Config rule to monitor whether CloudTrail is configured to use server-side encryption with AWS KMS managed encryption keys (SSE-KMS).
A. Enable CloudTrail log file validation.
C. Enable logging of Insights events in CloudTrail.
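A minimal boto3 sketch of option A, with option E's SSE-KMS setting shown alongside through the KmsKeyId parameter. The trail name and key alias are placeholders. Log file validation produces signed digest files that can later be checked with the aws cloudtrail validate-logs CLI command.

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

# Trail name and KMS key alias are placeholders.
cloudtrail.update_trail(
    Name="management-events-trail",
    EnableLogFileValidation=True,     # produce signed digest files for integrity checks
    KmsKeyId="alias/cloudtrail-logs", # encrypt delivered log files with SSE-KMS
)
```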
A company maintains a searchable repository of items on its website. The data is stored in a table with more than 10 million rows in an Amazon RDS for MySQL database. The database is stored on a 2 TB General Purpose SSD (gp2) volume. The company's website receives millions of updates to this data every day. The company found that some operations were taking 10 seconds or longer and determined that database storage performance was the bottleneck.
Which option satisfies the performance requirement?
A. Change the storage type to Provisioned IOPS SSD (io1).
B. Change the instance to a memory-optimized instance class.
C. Change the instance to a burstable performance DB instance class.
D. Enable Multi-AZ RDS read replicas with MySQL native asynchronous replication.
A. Change the storage type to Provisioned IOPS SSD (io1).
A company that currently hosts a web application on premises is ready to move to AWS and launch a newer version of the application. The company must route requests to either the AWS-hosted application or the on-premises application based on the URL query string. The on-premises application is not reachable from the internet, and a VPN connection is established between the VPC and the company's data center. The company wants to deploy this application behind an Application Load Balancer (ALB).
Which of the following solutions meets these criteria?
A. Use AWS Snowball Edge devices to process and store the images.
B. Upload the images to Amazon Simple Queue Service (Amazon SQS) during intermittent connectivity to EC2 instances.
C. Configure Amazon Kinesis Data Firehose to create multiple delivery streams aimed separately at the S3 buckets for storage and the EC2 instances for processing the images.
D. Use AWS Storage Gateway pre-installed on a hardware appliance to cache the images locally for Amazon S3 to process the images when connectivity becomes available.
A. Use AWS Snowball Edge devices to process and store the images.
A meteorological startup has created a custom web application to sell weather data to its members online. The company currently uses Amazon DynamoDB to store its data and wants to build a new service that notifies the managers of four internal teams whenever a new weather event is recorded. The company does not want this new service to affect the performance of the current application.
What should a solutions architect do to ensure these goals are met with the MINIMUM possible operational overhead?
A. Create a DynamoDB table in on-demand capacity mode.
B. Create a DynamoDB table with a global secondary Index.
C. Create a DynamoDB table with provisioned capacity and auto scaling.
D. Create a DynamoDB table in provisioned capacity mode, and configure it as a global table.
A. Create a DynamoDB table in on-demand capacity mode.
A company uses an application hosted on AWS to serve content to its subscribers worldwide. The application runs on multiple Amazon EC2 instances in a private subnet behind an Application Load Balancer (ALB). Because of a recent change in copyright regulations, the chief information officer (CIO) wants to block access for users in certain countries.
Which course of action will satisfy these criteria?
A. Modify the ALB security group to deny incoming traffic from blocked countries.
B. Modify the security group for EC2 instances to deny incoming traffic from blocked countries.
C. Use Amazon CloudFront to serve the application and deny access to blocked countries.
D. Use ALB listener rules to return access denied responses to incoming traffic from blocked countries.
C. Use Amazon CloudFront to serve the application and deny access to blocked countries.
A company runs its web application on AWS using seven Amazon EC2 instances. The company requires that DNS queries return the IP addresses of all healthy EC2 instances.
Which routing policy should be used to meet this requirement?
A. Simple routing policy
B. Latency routing policy
C. Multi-value routing policy
D. Geolocation routing policy
C. Multi-value routing policy
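To make the multivalue answer concept concrete, here is a hedged boto3 sketch that creates several records with the same name, each with MultiValueAnswer enabled and its own health check, so Route 53 returns only the IP addresses of healthy instances. The hosted zone ID, health check IDs, and IP addresses are placeholders.

```python
import boto3

route53 = boto3.client("route53")

# One record set per instance, all sharing the same name, each with its own
# SetIdentifier and health check; Route 53 returns up to eight healthy values.
changes = []
for i, (ip, health_check_id) in enumerate(
    [("198.51.100.1", "hc-1"), ("198.51.100.2", "hc-2"), ("198.51.100.3", "hc-3")]
):
    changes.append({
        "Action": "UPSERT",
        "ResourceRecordSet": {
            "Name": "www.example.com",
            "Type": "A",
            "SetIdentifier": f"web-{i}",
            "MultiValueAnswer": True,
            "TTL": 60,
            "ResourceRecords": [{"Value": ip}],
            "HealthCheckId": health_check_id,
        },
    })

route53.change_resource_record_sets(
    HostedZoneId="Z1234567890ABC",   # placeholder hosted zone
    ChangeBatch={"Changes": changes},
)
```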
Each day, a company collects data from millions of customers, totaling around 1 TB. The company provides its customers with usage reports covering the last 12 months. To meet regulatory and auditing requirements, all usage data must be retained for at least five years.
Which storage option is MOST cost-effective?
A. Store the data in Amazon S3 Standard. Set a lifecycle rule to transition the data to S3 Glacier Deep Archive after 1 year. Set a lifecycle rule to delete the data after 5 years.
B. Store the data in Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA). Set a lifecycle rule to transition the data to S3 Glacier after 1 year. Set the lifecycle rule to delete the data after 5 years.
C. Store the data in Amazon S3 Standard. Set a lifecycle rule to transition the data to S3 Standard-Infrequent Access (S3 Standard-IA) after 1 year. Set a lifecycle rule to delete the data after 5 years.
D. Store the data in Amazon S3 Standard. Set a lifecycle rule to transition the data to S3 One Zone-Infrequent Access (S3 One Zone-IA) after 1 year. Set a lifecycle rule to delete the data after 5 years.
A. Store the data in Amazon S3 Standard. Set a lifecycle rule to transition the data to S3 Glacier Deep Archive after 1 year. Set a lifecycle rule to delete the data after 5 years.
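A small boto3 sketch of the lifecycle configuration that answer A describes: transition to S3 Glacier Deep Archive after one year and expire after five. The bucket name is a placeholder.

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="usage-data-archive",   # placeholder bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},       # apply to every object
                "Transitions": [
                    {"Days": 365, "StorageClass": "DEEP_ARCHIVE"}
                ],
                "Expiration": {"Days": 1825},   # delete after roughly 5 years
            }
        ]
    },
)
```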
A company runs a fleet of web servers that use an Amazon RDS for PostgreSQL DB instance. After a routine compliance review, the company establishes a standard that requires all production databases to have a recovery point objective (RPO) of less than one second.
Which solution satisfies these criteria?
A. Enable a Multi-AZ deployment for the DB instance.
B. Enable auto scaling for the DB instance in one Availability Zone.
C. Configure the DB instance in one Availability Zone, and create multiple read replicas in a separate Availability Zone.
D. Configure the DB instance in one Availability Zone, and configure AWS Database Migration Service (AWS DMS) change data capture (CDC) tasks.
A. Enable a Multi-AZ deployment for the DB instance.
A company is developing an application on Amazon EC2 instances that generates temporary transactional data. The application requires access to data storage that can provide configurable and consistent IOPS.
What recommendations should a solutions architect make?
A. Provision an EC2 instance with a Throughput Optimized HDD (st1) root volume and a Cold HDD (sc1) data volume.
B. Provision an EC2 instance with a Throughput Optimized HDD (st1) volume that will serve as the root and data volume.
C. Provision an EC2 instance with a General Purpose SSD (gp2) root volume and Provisioned IOPS SSD (io1) data volume.
D. Provision an EC2 instance with a General Purpose SSD (gp2) root volume. Configure the application to store its data in an Amazon S3 bucket.
C. Provision an EC2 instance with a General Purpose SSD (gp2) root volume and Provisioned IOPS SSD (io1) data volume.
Prior to implementing a new workload, a solutions architect must review and update the company’s existing IAM policies. The solutions architect created the following policy:
{ "Version": "2012-10-17", "Statement": [{ "Effect": "Deny", "NotAction": "s3:PutObject", "Resource": "*" "Condition": {"BoolIfExists": {"aws:MultiFactorAuthPresent: "false"}} }] }
What is the policy’s net effect?
A. Users will be allowed all actions except s3:PutObject if multi-factor authentication (MFA) is enabled.
B. Users will be allowed all actions except s3:PutObject if multi-factor authentication (MFA) is not enabled.
C. Users will be denied all actions except s3:PutObject if multi-factor authentication (MFA) is enabled.
D. Users will be denied all actions except s3:PutObject if multi-factor authentication (MFA) is not enabled.
D. Users will be denied all actions except s3:PutObject if multi-factor authentication (MFA) is not enabled.
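One way to sanity-check the policy's net effect is the IAM policy simulator. The hedged boto3 sketch below evaluates a non-S3 action without MFA and should report an explicit deny, consistent with answer D; the action chosen for the test is arbitrary.

```python
import json
import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "NotAction": "s3:PutObject",
        "Resource": "*",
        "Condition": {"BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}},
    }],
}

# Simulate a non-S3 action with MFA absent: the Deny/NotAction statement applies.
result = iam.simulate_custom_policy(
    PolicyInputList=[json.dumps(policy)],
    ActionNames=["ec2:StartInstances"],
    ContextEntries=[{
        "ContextKeyName": "aws:MultiFactorAuthPresent",
        "ContextKeyValues": ["false"],
        "ContextKeyType": "boolean",
    }],
)
print(result["EvaluationResults"][0]["EvalDecision"])  # expected: "explicitDeny"
```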
To allow near-real-time processing, a web application must persist order data to Amazon S3. A solutions architect must design a scalable and fault-tolerant architecture.
Which solutions satisfy these criteria? (Select two.)
A. Write the order event to an Amazon DynamoDB table. Use DynamoDB Streams to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.
B. Write the order event to an Amazon Simple Queue Service (Amazon SQS) queue. Use the queue to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.
C. Write the order event to an Amazon Simple Notification Service (Amazon SNS) topic. Use the SNS topic to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.
D. Write the order event to an Amazon Simple Queue Service (Amazon SQS) queue. Use an Amazon EventBridge (Amazon CloudWatch Events) rule to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.
E. Write the order event to an Amazon Simple Notification Service (Amazon SNS) topic. Use an Amazon EventBridge (Amazon CloudWatch Events) rule to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.
A. Write the order event to an Amazon DynamoDB table. Use DynamoDB Streams to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.
C. Write the order event to an Amazon Simple Notification Service (Amazon SNS) topic. Use the SNS topic to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.
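As an illustration of answer C, here is a hedged sketch of a Lambda handler subscribed to the SNS topic that writes each order payload to Amazon S3. The bucket name and the order_id field are assumptions.

```python
import json
import uuid
import boto3

s3 = boto3.client("s3")
BUCKET = "order-events-bucket"  # placeholder bucket name

def handler(event, context):
    # For an SNS trigger, each record carries the message body in Sns.Message.
    for record in event["Records"]:
        order = json.loads(record["Sns"]["Message"])
        key = f"orders/{order.get('order_id', uuid.uuid4())}.json"
        s3.put_object(
            Bucket=BUCKET,
            Key=key,
            Body=json.dumps(order).encode("utf-8"),
            ContentType="application/json",
        )
    return {"written": len(event["Records"])}
```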
A company in the us-east-1 Region offers a photo hosting service. Users in many countries can upload and view photos through the application. Some photos are viewed heavily for months, while others are viewed rarely for less than a week. The application supports photo uploads of up to 20 MB. The service determines which photos to show each user based on the photo metadata.
Which solution provides the appropriate user access MOST cost-effectively?
A. Store the photos in Amazon DynamoDB. Turn on DynamoDB Accelerator (DAX) to cache frequently viewed items.
B. Store the photos in the Amazon S3 Intelligent-Tiering storage class. Store the photo metadata and its S3 location in DynamoDB.
C. Store the photos in the Amazon S3 Standard storage class. Set up an S3 Lifecycle policy to move photos older than 30 days to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Use the object tags to keep track of metadata.
D. Store the photos in the Amazon S3 Glacier storage class. Set up an S3 Lifecycle policy to move photos older than 30 days to the S3 Glacier Deep Archive storage class. Store the photo metadata and its S3 location in Amazon Elasticsearch Service (Amazon ES).
B. Store the photos in the Amazon S3 Intelligent-Tiering storage class. Store the photo metadata and its S3 location in DynamoDB.
A business is creating a website that will store static photos in an Amazon S3 bucket. The company’s goal is to reduce both latency and cost for all future requests.
What service configuration should a solutions architect recommend?
A. Deploy a NAT server in front of Amazon S3.
B. Deploy Amazon CloudFront in front of Amazon S3.
C. Deploy a Network Load Balancer in front of Amazon S3.
D. Configure Auto Scaling to automatically adjust the capacity of the website.
B. Deploy Amazon CloudFront in front of Amazon S3.
A company uses Amazon DynamoDB with provisioned throughput for the database tier of its ecommerce website. During flash sales, customers can experience periods of delay when the database cannot handle the volume of transactions, and the company loses transactions as a result. The database performs normally during regular periods.
Which approach resolves the company’s performance issue?
A. Switch DynamoDB to on-demand mode during flash sales.
B. Implement DynamoDB Accelerator for fast in-memory performance.
C. Use Amazon Kinesis to queue transactions for processing to DynamoDB.
D. Use Amazon Simple Queue Service (Amazon SQS) to queue transactions to DynamoDB.
B. Implement DynamoDB Accelerator for fast in-memory performance.
A large media company hosts a web application on AWS. The company wants to start caching confidential media files so that it can provide reliable access to them for users worldwide. The content is stored in Amazon S3 buckets. The company must deliver the content quickly, regardless of where the requests originate.
Which solution will satisfy these criteria?
A. Use AWS DataSync to connect the S3 buckets to the web application.
B. Deploy AWS Global Accelerator to connect the S3 buckets to the web application.
C. Deploy Amazon CloudFront to connect the S3 buckets to CloudFront edge servers.
D. Use Amazon Simple Queue Service (Amazon SQS) to connect the S3 buckets to the web application.
C. Deploy Amazon CloudFront to connect the S3 buckets to CloudFront edge servers.
A web application is deployed in the AWS Cloud. It consists of a two-tier architecture with a web layer and a database layer. The web server is vulnerable to cross-site scripting (XSS) attacks.
What should a solutions architect do to remediate the vulnerability?
A. Create a Classic Load Balancer. Put the web layer behind the load balancer and enable AWS WAF.
B. Create a Network Load Balancer. Put the web layer behind the load balancer and enable AWS WAF.
C. Create an Application Load Balancer. Put the web layer behind the load balancer and enable AWS WAF.
D. Create an Application Load Balancer. Put the web layer behind the load balancer and use AWS Shield Standard.
C. Create an Application Load Balancer. Put the web layer behind the load balancer and enable AWS WAF.
A company is preparing to store sensitive data in Amazon S3. For compliance purposes, the data must be encrypted at rest, encryption key usage must be audited, and keys must be rotated annually.
Which solution meets these requirements and is the MOST operationally efficient?
A. Server-side encryption with customer-provided keys (SSE-C)
B. Server-side encryption with Amazon S3 managed keys (SSE-S3)
C. Server-side encryption with AWS KMS (SSE-KMS) customer master keys (CMKs) with manual rotation
D. Server-side encryption with AWS KMS (SSE-KMS) customer master keys (CMKs) with automatic rotation
D. Server-side encryption with AWS KMS (SSE-KMS) customer master keys (CMKs) with automatic rotation
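A brief boto3 sketch of answer D: create a customer managed KMS key, enable automatic rotation, and make SSE-KMS the bucket's default encryption. Key usage is then recorded in CloudTrail automatically. The bucket name and key description are placeholders.

```python
import boto3

kms = boto3.client("kms")
s3 = boto3.client("s3")

# Create a customer managed key and turn on automatic annual rotation.
key = kms.create_key(Description="S3 data-at-rest key")
key_id = key["KeyMetadata"]["KeyId"]
kms.enable_key_rotation(KeyId=key_id)

# Make SSE-KMS with that key the bucket default.
s3.put_bucket_encryption(
    Bucket="sensitive-data-bucket",   # placeholder bucket
    ServerSideEncryptionConfiguration={
        "Rules": [{
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": key_id,
            }
        }]
    },
)
```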
Management needs a summary of AWS billed items broken down by user as part of their budget planning process. Budgets for departments will be created using the data. A solutions architect must ascertain the most effective method of obtaining this report data.
Which solution satisfies these criteria?
A. Run a query with Amazon Athena to generate the report.
B. Create a report in Cost Explorer and download the report.
C. Access the bill details from the billing dashboard and download the bill.
D. Modify a cost budget in AWS Budgets to alert with Amazon Simple Email Service (Amazon SES).
B. Create a report in Cost Explorer and download the report.
A solutions architect must create a system for archiving client case files. The files are critical corporate assets. The file count will increase over time.
Multiple application servers running on Amazon EC2 instances must be able to access the files concurrently. There must be built-in redundancy in the solution.
Which solution satisfies these criteria?
A. Amazon Elastic File System (Amazon EFS)
B. Amazon Elastic Block Store (Amazon EBS)
C. Amazon S3 Glacier Deep Archive
D. AWS Backup
A. Amazon Elastic File System (Amazon EFS)
A company must provide its employees with secure access to confidential and sensitive files. The company wants to ensure that only authorized users can access the files and that the files are downloaded securely to employees' devices.
The files are stored on an on-premises Windows file server. However, as remote traffic increases, the file server is running out of capacity.
Which solution will satisfy these criteria?
A. Migrate the file server to an Amazon EC2 instance in a public subnet. Configure the security group to limit inbound traffic to the employees’ IP addresses.
B. Migrate the files to an Amazon FSx for Windows File Server file system. Integrate the Amazon FSx file system with the on-premises Active Directory. Configure AWS Client VPN.
C. Migrate the files to Amazon S3, and create a private VPC endpoint. Create a signed URL to allow download.
D. Migrate the files to Amazon S3, and create a public VPC endpoint. Allow employees to sign on with AWS Single Sign-On.
B. Migrate the files to an Amazon FSx for Windows File Server file system. Integrate the Amazon FSx file system with the on-premises Active Directory. Configure AWS Client VPN.
A legal firm must make hundreds of files available to the general public. The files must be publicly readable, and no one may modify or delete them before a specified future date.
Which solution meets these requirements in the MOST secure way?
A. Upload all files to an Amazon S3 bucket that is configured for static website hosting. Grant read-only IAM permissions to any AWS principals that access the S3 bucket until the designated date.
B. Create a new Amazon S3 bucket with S3 Versioning enabled. Use S3 Object Lock with a retention period in accordance with the designated date. Configure the S3 bucket for static website hosting. Set an S3 bucket policy to allow read-only access to the objects.
C. Create a new Amazon S3 bucket with S3 Versioning enabled. Configure an event trigger to run an AWS Lambda function in case of object modification or deletion. Configure the Lambda function to replace the objects with the original versions from a private S3 bucket.
D. Upload all files to an Amazon S3 bucket that is configured for static website hosting. Select the folder that contains the files. Use S3 Object Lock with a retention period in accordance with the designated date. Grant read-only IAM permissions to any AWS principals that access the S3 bucket.
B. Create a new Amazon S3 bucket with S3 Versioning enabled. Use S3 Object Lock with a retention period in accordance with the designated date. Configure the S3 bucket for static website hosting. Set an S3 bucket policy to allow read-only access to the objects.
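To illustrate the Object Lock portion of answer B, here is a hedged boto3 sketch that creates a bucket with Object Lock enabled (which also turns on versioning) and applies a compliance-mode retention date to an object. The bucket name, object key, and retention date are placeholders.

```python
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = "public-legal-files"   # placeholder; Object Lock must be enabled at creation

# Object Lock can only be turned on when the bucket is created.
s3.create_bucket(Bucket=BUCKET, ObjectLockEnabledForBucket=True)

# Apply a compliance-mode retention date so that nobody, including the root
# user, can overwrite or delete the object version before that date.
s3.put_object(Bucket=BUCKET, Key="filings/case-001.pdf", Body=b"...")
s3.put_object_retention(
    Bucket=BUCKET,
    Key="filings/case-001.pdf",
    Retention={
        "Mode": "COMPLIANCE",
        "RetainUntilDate": datetime(2026, 1, 1, tzinfo=timezone.utc),
    },
)
```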
A company connects its on-premises servers to AWS through a 10 Gbps AWS Direct Connect connection. The workloads using the connection are critical. The company requires a disaster recovery strategy with maximum resiliency that maintains at least the current connection bandwidth.
What recommendations should a solutions architect make?
A. Set up a new Direct Connect connection in another AWS Region.
B. Set up a new AWS managed VPN connection in another AWS Region.
C. Set up two new Direct Connect connections: one in the current AWS Region and one in another Region.
D. Set up two new AWS managed VPN connections: one in the current AWS Region and one in another Region.
C. Set up two new Direct Connect connections: one in the current AWS Region and one in another Region.
A business has two virtual private clouds (VPCs) labeled Management and Production. The Management VPC connects to a single device in the data center using VPNs via a customer gateway. The Production VPC is connected to AWS through two AWS Direct Connect connections via a virtual private gateway. Both the Management and Production VPCs communicate with one another through a single VPC peering connection.
What should a solutions architect do to minimize the architecture’s single point of failure?
A. Add a set of VPNs between the Management and Production VPCs.
B. Add a second virtual private gateway and attach it to the Management VPC.
C. Add a second set of VPNs to the Management VPC from a second customer gateway device.
D. Add a second VPC peering connection between the Management VPC and the Production VPC.
C. Add a second set of VPNs to the Management VPC from a second customer gateway device.
A company hosts a near-real-time streaming application on AWS. As the data is ingested, a job runs on the data that takes 30 minutes to complete. The workload frequently experiences high latency because of the large volume of incoming data. A solutions architect must design a scalable and serverless solution to improve performance.
Which actions should the solutions architect do in combination? (Select two.)
A. Use Amazon Kinesis Data Firehose to ingest the data.
B. Use AWS Lambda with AWS Step Functions to process the data.
C. Use AWS Database Migration Service (AWS DMS) to ingest the data.
D. Use Amazon EC2 instances in an Auto Scaling group to process the data.
E. Use AWS Fargate with Amazon Elastic Container Service (Amazon ECS) to process the data.
A. Use Amazon Kinesis Data Firehose to ingest the data.
E. Use AWS Fargate with Amazon Elastic Container Service (Amazon ECS) to process the data.
A media company uses Amazon Elastic Block Store (Amazon EBS) volumes to store video content. A certain video file has become popular, and a large number of people around the world are now watching it. As a result, costs have increased.
Which step will result in a cost reduction without jeopardizing user accessibility?
A. Change the EBS volume to Provisioned IOPS (PIOPS).
B. Store the video in an Amazon S3 bucket and create an Amazon CloudFront distribution.
C. Split the video into multiple, smaller segments so users are routed to the requested video segments only.
D. Clear an Amazon S3 bucket in each Region and upload the videos so users are routed to the nearest S3 bucket.
B. Store the video in an Amazon S3 bucket and create an Amazon CloudFront distribution.
An image hosting company stores its objects in Amazon S3 buckets. The company wants to avoid accidentally exposing the objects in the S3 buckets to the public. All S3 objects across the entire AWS account must remain private.
Which solution will satisfy these criteria?
A. Use Amazon GuardDuty to monitor S3 bucket policies. Create an automatic remediation action rule that uses an AWS Lambda function to remediate any change that makes the objects public.
B. Use AWS Trusted Advisor to find publicly accessible S3 buckets. Configure email notifications in Trusted Advisor when a change is detected. Manually change the S3 bucket policy if it allows public access.
C. Use AWS Resource Access Manager to find publicly accessible S3 buckets. Use Amazon Simple Notification Service (Amazon SNS) to invoke an AWS Lambda function when a change is detected. Deploy a Lambda function that programmatically remediates the change.
D. Use the S3 Block Public Access feature on the account level. Use AWS Organizations to create a service control policy (SCP) that prevents IAM users from changing the setting. Apply the SCP to the account.
D. Use the S3 Block Public Access feature on the account level. Use AWS Organizations to create a service control policy (SCP) that prevents IAM users from changing the setting. Apply the SCP to the account.
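A short boto3 sketch of answer D: enable the account-level S3 Block Public Access settings, plus an example SCP body (attached through AWS Organizations) that would prevent IAM users from turning them back off. The account lookup and policy wording are illustrative assumptions.

```python
import boto3

s3control = boto3.client("s3control")
account_id = boto3.client("sts").get_caller_identity()["Account"]

# Turn on all four Block Public Access settings for the whole account.
s3control.put_public_access_block(
    AccountId=account_id,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Example SCP body that stops IAM users from changing the account setting.
scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "Action": "s3:PutAccountPublicAccessBlock",
        "Resource": "*",
    }],
}
```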
A company’s website stores transactional data on an Amazon RDS MySQL Multi-AZ DB instance. Other internal systems query this database instance to get data for batch processing. When internal systems request data from the RDS DB instance, the RDS DB instance drastically slows down. This has an effect on the website’s read and write performance, resulting in poor response times for users.
Which approach will result in an increase in website performance?
A. Use an RDS PostgreSQL DB instance instead of a MySQL database.
B. Use Amazon ElastiCache to cache the query responses for the website.
C. Add an additional Availability Zone to the current RDS MySQL Multi-AZ DB instance.
D. Add a read replica to the RDS DB instance and configure the internal systems to query the read replica.
D. Add a read replica to the RDS DB instance and configure the internal systems to query the read replica.
Currently, a company’s legacy application relies on an unencrypted Amazon RDS MySQL database with a single instance. All current and new data in this database must be encrypted to comply with new compliance standards.
How is this to be achieved?
A. Create an Amazon S3 bucket with server-side encryption enabled. Move all the data to Amazon S3. Delete the RDS instance.
B. Enable RDS Multi-AZ mode with encryption at rest enabled. Perform a failover to the standby instance to delete the original instance.
C. Take a Snapshot of the RDS instance. Create an encrypted copy of the snapshot. Restore the RDS instance from the encrypted snapshot.
D. Create an RDS read replica with encryption at rest enabled. Promote the read replica to master and switch the application over to the new master. Delete the old RDS instance.
C. Take a Snapshot of the RDS instance. Create an encrypted copy of the snapshot. Restore the RDS instance from the encrypted snapshot.
A marketing firm stores CSV data in an Amazon S3 bucket for statistical analysis. An application running on an Amazon EC2 instance needs permission to process the CSV data stored in the S3 bucket.
Which step will provide the MOST SECURE access to the S3 bucket for the EC2 instance?
A. Attach a resource-based policy to the S3 bucket.
B. Create an IAM user for the application with specific permissions to the S3 bucket.
C. Associate an IAM role with least privilege permissions to the EC2 instance profile.
D. Store AWS credentials directly on the EC2 instance for applications on the instance to use for API calls.
C. Associate an IAM role with least privilege permissions to the EC2 instance profile.
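For reference, a hedged boto3 sketch of answer C: a least-privilege role, an instance profile, and the association with the EC2 instance. The role, bucket, and instance ID are placeholders.

```python
import json
import boto3

iam = boto3.client("iam")
ec2 = boto3.client("ec2")

trust = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}
least_privilege = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject"],
        "Resource": "arn:aws:s3:::marketing-csv-bucket/*",   # placeholder bucket
    }],
}

iam.create_role(RoleName="CsvProcessingRole",
                AssumeRolePolicyDocument=json.dumps(trust))
iam.put_role_policy(RoleName="CsvProcessingRole",
                    PolicyName="csv-bucket-access",
                    PolicyDocument=json.dumps(least_privilege))
iam.create_instance_profile(InstanceProfileName="CsvProcessingProfile")
iam.add_role_to_instance_profile(InstanceProfileName="CsvProcessingProfile",
                                 RoleName="CsvProcessingRole")

# Attach the profile to the running instance (instance ID is a placeholder).
ec2.associate_iam_instance_profile(
    IamInstanceProfile={"Name": "CsvProcessingProfile"},
    InstanceId="i-0123456789abcdef0",
)
```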
On a cluster of Amazon Linux EC2 instances, a business runs an application. The organization is required to store all application log files for seven years for compliance purposes.
The log files will be evaluated by a reporting program, which will need concurrent access to all files.
Which storage system best satisfies these criteria in terms of cost-effectiveness?
A. Amazon Elastic Block Store (Amazon EBS)
B. Amazon Elastic File System (Amazon EFS)
C. Amazon EC2 instance store
D. Amazon S3
D. Amazon S3
A company hosts a training site on a fleet of Amazon EC2 instances. The company anticipates that its new course, which consists of hundreds of training videos on the web, will be extremely popular when it launches in one week.
What should a solutions architect do to ensure that the predicted server load is kept to a minimum?
A. Store the videos in Amazon ElastiCache for Redis. Update the web servers to serve the videos using the ElastiCache API.
B. Store the videos in Amazon Elastic File System (Amazon EFS). Create a user data script for the web servers to mount the EFS volume.
C. Store the videos in an Amazon S3 bucket. Create an Amazon CloudFront distribution with an origin access identity (OAI) of that S3 bucket. Restrict Amazon S3 access to the OAI.
D. Store the videos in an Amazon S3 bucket. Create an AWS Storage Gateway file gateway to access the S3 bucket. Create a user data script for the web servers to mount the file gateway.
C. Store the videos in an Amazon S3 bucket. Create an Amazon CloudFront distribution with an origin access identity (OAI) of that S3 bucket. Restrict Amazon S3 access to the OAI.
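To show the "restrict Amazon S3 access to the OAI" part of answer C, here is a hedged sketch of a bucket policy that allows only the CloudFront origin access identity to read objects. The OAI ID and bucket name are placeholders; the OAI itself is created on the CloudFront distribution.

```python
import json
import boto3

s3 = boto3.client("s3")

oai_id = "E2EXAMPLEOAI"   # placeholder OAI ID
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowCloudFrontOAIReadOnly",
        "Effect": "Allow",
        "Principal": {
            "AWS": f"arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity {oai_id}"
        },
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::training-videos-bucket/*",
    }],
}
s3.put_bucket_policy(Bucket="training-videos-bucket", Policy=json.dumps(policy))
```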
A company has decided to migrate its three-tier web application from on premises to the AWS Cloud. The new database must be able to scale storage capacity dynamically and perform table joins.
Which AWS service satisfies these criteria?
A. Amazon Aurora
B. Amazon RDS for SQL Server
C. Amazon DynamoDB Streams
D. Amazon DynamoDB on-demand
A. Amazon Aurora
A company runs a production application on a fleet of Amazon EC2 instances. The application reads data from an Amazon SQS queue and processes the messages in parallel. The message volume is unpredictable, and traffic is often intermittent. The application must process messages continuously without any downtime.
Which option best fits these criteria in terms of cost-effectiveness?
A. Use Spot Instances exclusively to handle the maximum capacity required.
B. Use Reserved Instances exclusively to handle the maximum capacity required.
C. Use Reserved Instances for the baseline capacity and use Spot Instances to handle additional capacity.
D. Use Reserved Instances for the baseline capacity and use On-Demand Instances to handle additional capacity.
D. Use Reserved Instances for the baseline capacity and use On-Demand Instances to handle additional capacity.
A startup has developed an application that collects data from Internet of Things (IoT) sensors installed in automobiles. The data is sent through Amazon Kinesis Data Firehose and stored in Amazon S3. The data produces billions of S3 objects each year. Each morning, the startup retrains a set of machine learning (ML) models using data from the previous 30 days.
Four times a year, the startup analyzes and trains other ML models using data from the previous 12 months. The data must be available with minimal delay for up to one year. After one year, the data must be retained for archival purposes.
Which storage system best satisfies these criteria in terms of cost-effectiveness?
A. Use the S3 Intelligent-Tiering storage class. Create an S3 Lifecycle policy to transition objects to S3 Glacier Deep Archive after 1 year.
B. Use the S3 Intelligent-Tiering storage class. Configure S3 Intelligent-Tiering to automatically move objects to S3 Glacier Deep Archive after 1 year.
C. Use the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Create an S3 Lifecycle policy to transition objects to S3 Glacier Deep Archive after 1 year.
D. Use the S3 Standard storage class. Create an S3 Lifecycle policy to transition objects to S3 Standard-Infrequent Access (S3 Standard-IA) after 30 days, and then to S3 Glacier Deep Archive after 1 year.
D. Use the S3 Standard storage class. Create an S3 Lifecycle policy to transition objects to S3 Standard-Infrequent Access (S3 Standard-IA) after 30 days, and then to S3 Glacier Deep Archive after 1 year.
A company requires data storage on Amazon S3. A compliance requirement stipulates that when objects are modified, their original state must be preserved. Additionally, data older than five years must be retained for auditing purposes.
What should a solutions architect recommend as the MOST cost-effective solution?
A. Enable object-level versioning and S3 Object Lock in governance mode
B. Enable object-level versioning and S3 Object Lock in compliance mode
C. Enable object-level versioning. Enable a lifecycle policy to move data older than 5 years to S3 Glacier Deep Archive
D. Enable object-level versioning. Enable a lifecycle policy to move data older than 5 years to S3 Standard-Infrequent Access (S3 Standard-IA)
C. Enable object-level versioning. Enable a lifecycle policy to move data older than 5 years to S3 Glacier Deep Archive
Multiple Amazon EC2 instances are used to host an application. The program reads messages from an Amazon SQS queue, writes them to an Amazon RDS database, and then removes them from the queue. The RDS table sometimes contains duplicate entries. There are no duplicate messages in the SQS queue.
How can a solutions architect ensure that messages are processed only once?
A. Use the CreateQueue API call to create a new queue.
B. Use the AddPermission API call to add appropriate permissions.
C. Use the ReceiveMessage API call to set an appropriate wait time.
D. Use the ChangeMessageVisibility API call to increase the visibility timeout.
D. Use the ChangeMessageVisibility API call to increase the visibility timeout.
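A minimal boto3 sketch of answer D: after receiving a message, the consumer extends its visibility timeout so the write to the database and the delete can finish before the message becomes visible to another consumer. The queue URL and timeout value are placeholders.

```python
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/orders"  # placeholder

messages = sqs.receive_message(
    QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=10
).get("Messages", [])

for msg in messages:
    # Give this consumer enough time to write to RDS and delete the message
    # before SQS makes it visible to another consumer again.
    sqs.change_message_visibility(
        QueueUrl=QUEUE_URL,
        ReceiptHandle=msg["ReceiptHandle"],
        VisibilityTimeout=300,   # seconds; must exceed worst-case processing time
    )
    # ... process the message and write it to the database ...
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```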
A company has just announced the worldwide launch of its retail website. The website is hosted on multiple Amazon EC2 instances behind an Elastic Load Balancer. The instances run in an Auto Scaling group across multiple Availability Zones.
The company wants to provide its customers with customized content based on the device they use to access the website.
Which steps should a solutions architect perform in combination to satisfy these requirements? (Select two.)
A. Configure Amazon CloudFront to cache multiple versions of the content.
B. Configure a host header in a Network Load Balancer to forward traffic to different instances.
C. Configure a Lambda@Edge function to send specific objects to users based on the User-Agent header.
D. Configure AWS Global Accelerator. Forward requests to a Network Load Balancer (NLB). Configure the NLB to set up host-based routing to different EC2 instances.
E. Configure AWS Global Accelerator. Forward requests to a Network Load Balancer (NLB). Configure the NLB to set up path-based routing to different EC2 instances.
A. Configure Amazon CloudFront to cache multiple versions of the content.
C. Configure a Lambda@Edge function to send specific objects to users based on the User-Agent header.
To encourage experimentation and agility, a company allows developers to attach existing IAM policies to existing IAM roles. However, the security operations team is concerned that the developers could attach the existing administrator policy, which would allow them to circumvent any other security policies.
How should a solutions architect address this concern?
A. Create an Amazon SNS topic to send an alert every time a developer creates a new policy.
B. Use service control policies to disable IAM activity across all accounts in the organizational unit.
C. Prevent the developers from attaching any policies and assign all IAM duties to the security operations team.
D. Set an IAM permissions boundary on the developer IAM role that explicitly denies attaching the administrator policy.
D. Set an IAM permissions boundary on the developer IAM role that explicitly denies attaching the administrator policy.
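A hedged boto3 sketch of answer D: a permissions boundary whose explicit deny blocks attaching the AdministratorAccess managed policy, applied to the developer role. The policy and role names are placeholders.

```python
import json
import boto3

iam = boto3.client("iam")

# Boundary policy: allow normal work but explicitly deny attaching the
# AdministratorAccess managed policy to any user, group, or role.
boundary = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "*", "Resource": "*"},
        {
            "Effect": "Deny",
            "Action": ["iam:AttachRolePolicy", "iam:AttachUserPolicy",
                       "iam:AttachGroupPolicy"],
            "Resource": "*",
            "Condition": {
                "ArnEquals": {
                    "iam:PolicyARN": "arn:aws:iam::aws:policy/AdministratorAccess"
                }
            },
        },
    ],
}

created = iam.create_policy(
    PolicyName="DeveloperBoundary",
    PolicyDocument=json.dumps(boundary),
)
iam.put_role_permissions_boundary(
    RoleName="developer-role",                     # placeholder role name
    PermissionsBoundary=created["Policy"]["Arn"],
)
```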
A startup has developed a three-tier web application. The front end consists entirely of static content. The application layer is made up of microservices. User data is stored as JSON documents that must be accessible with low latency. The startup expects little regular traffic in the first year, with monthly traffic spikes. The startup team must keep operational overhead costs to a minimum.
What should a solutions architect suggest as a means of achieving this?
A. Use Amazon S3 static website hosting to store and serve the front end. Use AWS Elastic Beanstalk for the application layer. Use Amazon DynamoDB to store user data.
B. Use Amazon S3 static website hosting to store and serve the front end. Use Amazon Elastic KubernetesService (Amazon EKS) for the application layer. Use Amazon DynamoDB to store user data.
C. Use Amazon S3 static website hosting to store and serve the front end. Use Amazon API Gateway and AWS Lambda functions for the application layer. Use Amazon DynamoDB to store user data.
D. Use Amazon S3 static website hosting to store and serve the front end. Use Amazon API Gateway and AWS Lambda functions for the application layer. Use Amazon RDS with read replicas to store user data.
C. Use Amazon S3 static website hosting to store and serve the front end. Use Amazon API Gateway and AWS Lambda functions for the application layer. Use Amazon DynamoDB to store user data.
An ecommerce website's web application is deployed on Amazon Elastic Container Service (Amazon ECS) container instances behind an Application Load Balancer (ALB). During periods of heavy usage, the website slows down and availability drops. A solutions architect uses Amazon CloudWatch alarms to receive notifications when an availability issue occurs so that resources can be scaled out. Company management wants a solution that automatically responds to such events.
Which solution satisfies these criteria?
A. Set up AWS Auto Scaling to scale out the ECS service when there are timeouts on the ALB. Set up AWS Auto Scaling to scale out the ECS cluster when the CPU or memory reservation is too high.
B. Set up AWS Auto Scaling to scale out the ECS service when the ALB CPU utilization is too high. Set up AWS Auto Scaling to scale out the ECS cluster when the CPU or memory reservation is too high.
C. Set up AWS Auto Scaling to scale out the ECS service when the service’s CPU utilization is too high. Set up AWS Auto Scaling to scale out the ECS cluster when the CPU or memory reservation is too high.
D. Set up AWS Auto Scaling to scale out the ECS service when the ALB target group CPU utilization is too high. Set up AWS Auto Scaling to scale out the ECS cluster when the CPU or memory reservation is too high.
C. Set up AWS Auto Scaling to scale out the ECS service when the service’s CPU utilization is too high. Set up AWS Auto Scaling to scale out the ECS cluster when the CPU or memory reservation is too high.
A company uses Site-to-Site VPN connections to provide secure access to AWS Cloud services from on premises. As traffic through the VPN connections to the Amazon EC2 instances increases, users are experiencing slower VPN connectivity.
Which approach will result in an increase in VPN throughput?
A. Use a transit gateway with equal cost multipath routing and add additional VPN tunnels
B. Configure a virtual private gateway with equal cost multipath routing and multiple channels
C. Implement multiple customer gateways for the same network to scale the throughput
D. Increase the number of tunnels in the VPN configuration to scale the throughput beyond the default limit
C. Implement multiple customer gateways for the same network to scale the throughput
A company hosts a website on Amazon EC2 Linux instances. Several of the instances are failing. Troubleshooting indicates that the failed instances are out of swap space. The operations team lead needs a monitoring solution for this.
What recommendations should a solutions architect make?
A. Configure an Amazon CloudWatch SwapUsage metric dimension. Monitor the SwapUsage dimension in the EC2 metrics in CloudWatch.
B. Use EC2 metadata to collect information, then publish it to Amazon CloudWatch custom metrics. Monitor SwapUsage metrics in CloudWatch.
C. Install an Amazon CloudWatch agent on the instances. Run an appropriate script on a set schedule. Monitor SwapUtilization metrics in CloudWatch.
D. Enable detailed monitoring in the EC2 console. Create an Amazon CloudWatch SwapUtilization custom metric. Monitor SwapUtilization metrics in CloudWatch.
C. Install an Amazon CloudWatch agent on the instances. Run an appropriate script on a set schedule. Monitor SwapUtilization metrics in CloudWatch.
A company runs an online transaction processing (OLTP) workload on AWS. The workload is deployed in a Multi-AZ configuration with an unencrypted Amazon RDS DB instance. The database is backed up daily.
What should a solutions architect do to ensure that the database and its snapshots are always encrypted moving forward?
A. Encrypt a copy of the latest DB snapshot. Replace existing DB instance by restoring the encrypted snapshot.
B. Create a new encrypted Amazon Elastic Block Store (Amazon EBS) volume and copy the snapshots to it. Enable encryption on the DB instance.
C. Copy the snapshots and enable encryption using AWS Key Management Service (AWS KMS). Restore encrypted snapshot to an existing DB instance.
D. Copy the snapshots to an Amazon S3 bucket that is encrypted using server-side encryption with AWS Key Management Service (AWS KMS) managed keys (SSE-KMS).
A. Encrypt a copy of the latest DB snapshot. Replace existing DB instance by restoring the encrypted snapshot.
A company runs an application that collects data from its customers through multiple Amazon EC2 instances. After processing, the data is uploaded to Amazon S3 for long-term storage. A review of the application shows that the EC2 instances are idle for long periods of time. A solutions architect must design a solution that maximizes utilization while minimizing costs.
Which solution satisfies these criteria?
A. Use Amazon EC2 in an Auto Scaling group with On-Demand instances.
B. Build the application to use Amazon Lightsail with On-Demand Instances.
C. Create an Amazon CloudWatch cron job to automatically stop the EC2 instances when there is no activity.
D. Redesign the application to use an event-driven design with Amazon Simple Queue Service (Amazon SQS) and AWS Lambda.
A. Use Amazon EC2 in an Auto Scaling group with On-Demand instances.
A company wants to move from many standalone AWS accounts to a consolidated, multi-account architecture. The company plans to create many new AWS accounts for its business units and must be able to authenticate access to these accounts with a single corporate directory service.
Which steps should a solutions architect advocate in order to satisfy these requirements? (Select two.)
A. Create a new organization in AWS Organizations with all features turned on. Create the new AWS accounts in the organization.
B. Set up an Amazon Cognito identity pool. Configure AWS Single Sign-On to accept Amazon Cognito authentication.
C. Configure a service control policy (SCP) to manage the AWS accounts. Add AWS Single Sign-On to AWS Directory Service.
D. Create a new organization in AWS Organizations. Configure the organization’s authentication mechanism to use AWS Directory Service directly.
E. Set up AWS Single Sign-On (AWS SSO) in the organization. Configure AWS SSO, and integrate it with the company’s corporate directory service.
D. Create a new organization in AWS Organizations. Configure the organization’s authentication mechanism to use AWS Directory Service directly.
E. Set up AWS Single Sign-On (AWS SSO) in the organization. Configure AWS SSO, and integrate it with the company’s corporate directory service.
A solutions architect is developing a daily data processing task that will take up to two hours to finish. If the task is stopped, it must be restarted from scratch.
What is the MOST cost-effective way for the solutions architect to solve this issue?
A. Create a script that runs locally on an Amazon EC2 Reserved Instance that is triggered by a cron job.
B. Create an AWS Lambda function triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
C. Use an Amazon Elastic Container Service (Amazon ECS) Fargate task triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
D. Use an Amazon Elastic Container Service (Amazon ECS) task running on Amazon EC2 triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
C. Use an Amazon Elastic Container Service (Amazon ECS) Fargate task triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
A company plans to host a survey website on AWS. The company anticipates a high volume of traffic, which results in the database being updated asynchronously. The company wants to avoid dropping writes to the database hosted on AWS.
How should the business’s application be written to handle these database requests?
A. Configure the application to publish to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe the database to the SNS topic.
B. Configure the application to subscribe to an Amazon Simple Notification Service (Amazon SNS) topic. Publish the database updates to the SNS topic.
C. Use Amazon Simple Queue Service (Amazon SQS) FIFO queues to queue the database connection until the database has resources to write the data.
D. Use Amazon Simple Queue Service (Amazon SQS) FIFO queues for capturing the writes and draining the queue as each write is made to the database.
D. Use Amazon Simple Queue Service (Amazon SQS) FIFO queues for capturing the writes and draining the queue as each write is made to the database.
A company runs an application on a large fleet of Amazon EC2 instances. The application reads and writes items to an Amazon DynamoDB table. The DynamoDB table grows continuously, but the application needs only data from the last 30 days. The company needs a solution that is cost-effective and quick to implement.
Which solution satisfies these criteria?
A. Use an AWS CloudFormation template to deploy the complete solution. Redeploy the CloudFormation stack every 30 days, and delete the original stack.
B. Use an EC2 instance that runs a monitoring application from AWS Marketplace. Configure the monitoring application to use Amazon DynamoDB Streams to store the timestamp when a new item is created in the table. Use a script that runs on the EC2 instance to delete items that have a timestamp that is older than 30 days.
C. Configure Amazon DynamoDB Streams to invoke an AWS Lambda function when a new item is created in the table. Configure the Lambda function to delete items in the table that are older than 30 days.
D. Extend the application to add an attribute that has a value of the current timestamp plus 30 days to each new item that is created in the table. Configure DynamoDB to use the attribute as the TTL attribute.
D. Extend the application to add an attribute that has a value of the current timestamp plus 30 days to each new item that is created in the table. Configure DynamoDB to use the attribute as the TTL attribute.
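A short boto3 sketch of answer D: enable TTL on an assumed expires_at attribute and stamp each new item with the current time plus 30 days. The table name and item attributes are placeholders.

```python
import time
import boto3

dynamodb = boto3.client("dynamodb")
TABLE = "application-items"   # placeholder table name

# One-time setup: tell DynamoDB which attribute holds the expiry epoch time.
dynamodb.update_time_to_live(
    TableName=TABLE,
    TimeToLiveSpecification={"Enabled": True, "AttributeName": "expires_at"},
)

# In the application, stamp every new item with now + 30 days.
expires_at = int(time.time()) + 30 * 24 * 60 * 60
dynamodb.put_item(
    TableName=TABLE,
    Item={
        "pk": {"S": "order#1234"},
        "payload": {"S": "..."},
        "expires_at": {"N": str(expires_at)},   # DynamoDB deletes the item after this time
    },
)
```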
A company previously migrated its data warehouse solution to AWS. The company also has an AWS Direct Connect connection. Users in the corporate office query the data warehouse through a visualization tool. Each query answered by the data warehouse is on average 50 MB, while each web page delivered by the visualization tool is about 500 KB. The data warehouse does not cache the result sets it returns.
Which approach results in the LOWEST outbound data transfer costs for the company?
A. Host the visualization tool on premises and query the data warehouse directly over the internet.
B. Host the visualization tool in the same AWS Region as the data warehouse. Access it over the internet.
C. Host the visualization tool on premises and query the data warehouse directly over a Direct Connect connection at a location in the same AWS Region.
D. Host the visualization tool in the same AWS Region as the data warehouse and access it over a Direct Connect connection at a location in the same Region.
D. Host the visualization tool in the same AWS Region as the data warehouse and access it over a Direct Connect connection at a location in the same Region.
A company is developing an application composed of many microservices and has decided to deploy it on AWS using container technology. The company needs a solution that requires minimal ongoing effort for maintenance and scaling, and it cannot manage any additional infrastructure.
Which steps should a solutions architect perform in combination to satisfy these requirements? (Select two.)
A. Deploy an Amazon Elastic Container Service (Amazon ECS) cluster.
B. Deploy the Kubernetes control plane on Amazon EC2 instances that span multiple Availability Zones.
C. Deploy an Amazon Elastic Container Service (Amazon ECS) service with an Amazon EC2 launch type. Specify a desired task number level of greater than or equal to 2.
D. Deploy an Amazon Elastic Container Service (Amazon ECS) service with a Fargate launch type. Specify a desired task number level of greater than or equal to 2.
E. Deploy Kubernetes worker nodes on Amazon EC2 instances that span multiple Availability Zones. Create a deployment that specifies two or more replicas for each microservice.
A. Deploy an Amazon Elastic Container Service (Amazon ECS) cluster.
D. Deploy an Amazon Elastic Container Service (Amazon ECS) service with a Fargate launch type. Specify a desired task number level of greater than or equal to 2.
The following policy was developed by an Amazon EC2 administrator and attached to an IAM group containing several users:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "ec2:TerminateInstances",
      "Resource": "*",
      "Condition": {
        "IpAddress": {"aws:SourceIp": "10.100.100.0/24"}
      }
    },
    {
      "Effect": "Deny",
      "Action": "ec2:*",
      "Resource": "*",
      "Condition": {
        "StringNotEquals": {"ec2:Region": "us-east-1"}
      }
    }
  ]
}
What impact does this policy have?
A. Users can terminate an EC2 instance in any AWS Region except us-east-1.
B. Users can terminate an EC2 instance with the IP address 10.100.100.1 in the us-east-1 Region.
C. Users can terminate an EC2 instance in the us-east-1 Region when the user’s source IP is 10.100.100.254.
D. Users cannot terminate an EC2 instance in the us-east-1 Region when the user’s source IP is 10.100.100.254.
C. Users can terminate an EC2 instance in the us-east-1 Region when the user’s source IP is 10.100.100.254.
A company's web application stores its data in an Amazon RDS for PostgreSQL DB instance. At the beginning of each month, during the financial closing period, accountants run large queries that negatively affect the database's performance because of heavy utilization. The company wants to minimize the impact of reporting on the web application.
What should a solutions architect do to minimize the impact on the database with the LEAST amount of effort?
A. Create a read replica and direct reporting traffic to the replica.
B. Create a Multi-AZ database and direct reporting traffic to the standby.
C. Create a cross-Region read replica and direct reporting traffic to the replica.
D. Create an Amazon Redshift database and direct reporting traffic to the Amazon Redshift database.
A. Create a read replica and direct reporting traffic to the replica.
A company has deployed a MySQL database on Amazon RDS. Because of increased transaction volume, the database support team is reporting slow reads against the DB instance and recommends adding a read replica.
Which combination of actions should a solutions architect take before deploying this change? (Select two.)
A. Enable binlog replication on the RDS primary node.
B. Choose a failover priority for the source DB instance.
C. Allow long-running transactions to complete on the source DB instance.
D. Create a global table and specify the AWS Regions where the table will be available.
E. Enable automatic backups on the source instance by setting the backup retention period to a value other than 0.
C. Allow long-running transactions to complete on the source DB instance.
E. Enable automatic backups on the source instance by setting the backup retention period to a value other than 0.
A company's website gives users access to historical performance reports. The website needs a solution that can scale to meet the company's global demand. The solution should be cost-effective, minimize the provisioning of infrastructure resources, and provide the fastest possible response time.
Which combination of services should a solutions architect recommend to meet these requirements?
A. Amazon CloudFront and Amazon S3
B. AWS Lambda and Amazon DynamoDB
C. Application Load Balancer with Amazon EC2 Auto Scaling
D. Amazon Route 53 with internal Application Load Balancers
A. Amazon CloudFront and Amazon S3
A solutions architect is designing a hybrid application using the AWS Cloud. AWS Direct Connect (DX) will be used to connect the on-premises data center to AWS. The application connectivity between AWS and the on-premises data center must be highly resilient.
Which DX configuration should be used to meet these requirements?
A. Configure a DX connection with a VPN on top of it.
B. Configure DX connections at multiple DX locations.
C. Configure a DX connection using the most reliable DX partner.
D. Configure multiple virtual interfaces on top of a DX connection.
B. Configure DX connections at multiple DX locations.
A financial institution hosts a web application on AWS. The application retrieves current stock prices using an Amazon API Gateway Regional API endpoint. The company's security team has noticed an increase in the number of API requests and is concerned that HTTP flood attacks could make the application unavailable.
A solutions architect must design a solution to protect against this type of attack.
Which method satisfies these criteria with the LEAST amount of operational overhead?
A. Create an Amazon CloudFront distribution in front of the API Gateway Regional API endpoint with a maximum TTL of 24 hours.
B. Create a Regional AWS WAF web ACL with a rate-based rule. Associate the web ACL with the API Gateway stage.
C. Use Amazon CloudWatch metrics to monitor the Count metric and alert the security team when the predefined rate is reached.
D. Create an Amazon CloudFront distribution with Lambda@Edge in front of the API Gateway Regional API endpoint. Create an AWS Lambda function to block requests from IP addresses that exceed the predefined rate.
B. Create a Regional AWS WAF web ACL with a rate-based rule. Associate the web ACL with the API Gateway stage.
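A hedged boto3 sketch of answer B: a Regional AWS WAF web ACL with a rate-based rule, associated with the API Gateway stage. The rate limit, names, and stage ARN are placeholders.

```python
import boto3

wafv2 = boto3.client("wafv2", region_name="us-east-1")

acl = wafv2.create_web_acl(
    Name="api-flood-protection",
    Scope="REGIONAL",                      # Regional scope for API Gateway
    DefaultAction={"Allow": {}},
    Rules=[{
        "Name": "rate-limit-per-ip",
        "Priority": 1,
        "Action": {"Block": {}},
        "Statement": {
            "RateBasedStatement": {"Limit": 2000, "AggregateKeyType": "IP"}
        },
        "VisibilityConfig": {
            "SampledRequestsEnabled": True,
            "CloudWatchMetricsEnabled": True,
            "MetricName": "rate-limit-per-ip",
        },
    }],
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "api-flood-protection",
    },
)

# Associate the web ACL with the API Gateway stage (stage ARN is a placeholder).
wafv2.associate_web_acl(
    WebACLArn=acl["Summary"]["ARN"],
    ResourceArn="arn:aws:apigateway:us-east-1::/restapis/abc123/stages/prod",
)
```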
A business wants to automate security assessments of its Amazon EC2 instances. The organization must verify and demonstrate that its development process meets security and compliance requirements.
What should a solutions architect do to meet these requirements?
A. Use Amazon Macie to automatically discover, classify and protect the EC2 instances.
B. Use Amazon GuardDuty to publish Amazon Simple Notification Service (Amazon SNS) notifications.
C. Use Amazon Inspector with Amazon CloudWatch to publish Amazon Simple Notification Service (Amazon SNS) notifications
D. Use Amazon EventBridge (Amazon CloudWatch Events) to detect and react to changes in the status of AWS Trusted Advisor checks.
C. Use Amazon Inspector with Amazon CloudWatch to publish Amazon Simple Notification Service (Amazon SNS) notifications
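A sketch of the notification side of the answer, assuming the Inspector assessments are already configured: an EventBridge (CloudWatch Events) rule forwards Inspector findings to an SNS topic. The topic ARN is hypothetical; the "aws.inspector" source matches Inspector Classic events, while Inspector v2 publishes under "aws.inspector2".
    import boto3

    events = boto3.client("events")
    sns_topic_arn = "arn:aws:sns:us-east-1:123456789012:inspector-findings"  # hypothetical

    # Forward Amazon Inspector findings to SNS through an EventBridge rule.
    events.put_rule(
        Name="inspector-findings-to-sns",
        EventPattern='{"source": ["aws.inspector"]}',
        State="ENABLED",
    )
    events.put_targets(
        Rule="inspector-findings-to-sns",
        Targets=[{"Id": "sns-notify", "Arn": sns_topic_arn}],
    )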
A corporation operates a highly secure application on Amazon EC2 that is backed by an Amazon RDS database. To meet compliance standards, all personally identifiable information (PII) must be encrypted at rest.
Which solution should a solutions architect recommend to meet this requirement with the MINIMUM number of infrastructure changes?
A. Deploy AWS Certificate Manager to generate certificates. Use the certificates to encrypt the database volume.
B. Deploy AWS CloudHSM, generate encryption keys, and use the customer master key (CMK) to encrypt database volumes.
C. Configure SSL encryption using AWS Key Management Service customer master keys (AWS KMS CMKs) to encrypt database volumes.
D. Configure Amazon Elastic Block Store (Amazon EBS) encryption and Amazon RDS encryption with AWS Key Management Service (AWS KMS) keys to encrypt instance and database volumes.
D. Configure Amazon Elastic Block Store (Amazon EBS) encryption and Amazon RDS encryption with AWS Key Management Service (AWS KMS) keys to encrypt instance and database volumes.
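A minimal boto3 sketch of the answer, using a hypothetical KMS key alias, instance sizing, and identifiers. Note that RDS encryption at rest can only be enabled when an instance is created; an existing unencrypted instance is typically restored from an encrypted snapshot copy instead.
    import boto3

    kms_key_id = "alias/pii-at-rest"  # hypothetical customer managed key

    ec2 = boto3.client("ec2")
    rds = boto3.client("rds")

    # Encrypted EBS volume for the EC2 instance (size and AZ are examples).
    ec2.create_volume(
        AvailabilityZone="us-east-1a",
        Size=100,
        Encrypted=True,
        KmsKeyId=kms_key_id,
    )

    # New RDS instance with storage encryption using the same KMS key.
    rds.create_db_instance(
        DBInstanceIdentifier="secure-app-db",
        DBInstanceClass="db.m5.large",
        Engine="mysql",
        AllocatedStorage=100,
        MasterUsername="admin",
        MasterUserPassword="REPLACE_ME",
        StorageEncrypted=True,
        KmsKeyId=kms_key_id,
    )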
The following IAM policy has been established by a solutions architect:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["lambda:*"],
      "Resource": "*"
    },
    {
      "Effect": "Deny",
      "Action": ["lambda:CreateFunction", "lambda:DeleteFunction"],
      "Resource": "*",
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": "220.100.16.0/20"
        }
      }
    }
  ]
}
Which actions will the policy permit?
A. An AWS Lambda function can be deleted from any network.
B. An AWS Lambda function can be created from any network.
C. An AWS Lambda function can be deleted from the 100.220.0.0/20 network.
D. An AWS Lambda function can be deleted from the 220.100.16.0/20 network.
C. An AWS Lambda function can be deleted from the 100.220.0.0/20 network.
A business operates a database on Amazon Aurora. The database is idle every evening. When user traffic spikes in the early morning hours, an application that performs heavy reads on the database experiences performance problems and read timeouts during these peak hours. Because the company does not have a dedicated operations team, it needs an automated solution to address the performance issues.
Which actions should a solutions architect take so that the database automatically scales to handle the increased read load? (Select two.)
A. Migrate the database to Aurora Serverless.
B. Increase the instance size of the Aurora database.
C. Configure Aurora Auto Scaling with Aurora Replicas.
D. Migrate the database to an Aurora multi-master cluster.
E. Migrate the database to an Amazon RDS for MySQL Multi-AZ deployment.
A. Migrate the database to Aurora Serverless.
C. Configure Aurora Auto Scaling with Aurora Replicas.
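A minimal boto3 sketch of the Aurora Auto Scaling part of the answer, assuming a hypothetical cluster identifier: Application Auto Scaling adds and removes Aurora Replicas based on reader CPU utilization.
    import boto3

    autoscaling = boto3.client("application-autoscaling")
    cluster_resource_id = "cluster:reporting-aurora"  # hypothetical Aurora cluster

    # Register the cluster's replica count as a scalable target.
    autoscaling.register_scalable_target(
        ServiceNamespace="rds",
        ResourceId=cluster_resource_id,
        ScalableDimension="rds:cluster:ReadReplicaCount",
        MinCapacity=1,
        MaxCapacity=8,
    )

    # Target-tracking policy on average reader CPU utilization.
    autoscaling.put_scaling_policy(
        PolicyName="aurora-read-scaling",
        ServiceNamespace="rds",
        ResourceId=cluster_resource_id,
        ScalableDimension="rds:cluster:ReadReplicaCount",
        PolicyType="TargetTrackingScaling",
        TargetTrackingScalingPolicyConfiguration={
            "TargetValue": 60.0,
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "RDSReaderAverageCPUUtilization"
            },
        },
    )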
An application running on an Amazon EC2 instance requires access to an Amazon DynamoDB table. The EC2 instance and the DynamoDB table are managed by the same AWS account. A solutions architect must configure the necessary permissions.
Which approach will provide the EC2 instance least privilege access to the DynamoDB table?
A. Create an IAM role with the appropriate policy to allow access to the DynamoDB table. Create an instance profile to assign this IAM role to the EC2 instance.
B. Create an IAM role with the appropriate policy to allow access to the DynamoDB table. Add the EC2 instance to the trust relationship policy document to allow it to assume the role.
C. Create an IAM user with the appropriate policy to allow access to the DynamoDB table. Store the credentials in an Amazon S3 bucket and read them from within the application code directly.
D. Create an IAM user with the appropriate policy to allow access to the DynamoDB table. Ensure that the application stores the IAM credentials securely on local storage and uses them to make the DynamoDB calls.
A. Create an IAM role with the appropriate policy to allow access to the DynamoDB table. Create an instance profile to assign this IAM role to the EC2 instance.
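A boto3 sketch of the answer, with hypothetical role, profile, and table names: a role scoped to the single DynamoDB table, wrapped in an instance profile that is attached to the EC2 instance.
    import json
    import boto3

    iam = boto3.client("iam")

    # Role that EC2 can assume, scoped to one DynamoDB table (ARN is hypothetical).
    assume_role_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }
    iam.create_role(
        RoleName="app-dynamodb-role",
        AssumeRolePolicyDocument=json.dumps(assume_role_policy),
    )
    iam.put_role_policy(
        RoleName="app-dynamodb-role",
        PolicyName="inventory-table-access",
        PolicyDocument=json.dumps({
            "Version": "2012-10-17",
            "Statement": [{
                "Effect": "Allow",
                "Action": ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"],
                "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/Inventory",
            }],
        }),
    )

    # The instance profile is what actually gets attached to the EC2 instance.
    iam.create_instance_profile(InstanceProfileName="app-dynamodb-profile")
    iam.add_role_to_instance_profile(
        InstanceProfileName="app-dynamodb-profile",
        RoleName="app-dynamodb-role",
    )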
A business uses Amazon EC2 instances to run an API-based inventory reporting application. The application stores its data in an Amazon DynamoDB table. The company’s distribution centers run an on-premises shipping application that calls the API to update inventory before generating shipping labels. The company has observed application outages each day, resulting in lost transactions.
What should a solutions architect recommend to increase the application’s resilience?
A. Modify the shipping application to write to a local database.
B. Modify the application APIs to run serverless using AWS Lambda
C. Configure Amazon API Gateway to call the EC2 inventory application APIs.
D. Modify the application to send inventory updates using Amazon Simple Queue Service (Amazon SQS).
D. Modify the application to send inventory updates using Amazon Simple Queue Service (Amazon SQS).
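A short boto3 sketch of the decoupling pattern in the answer, using a hypothetical queue URL and message format: the shipping application enqueues updates instead of calling the API directly, so updates survive temporary outages of the inventory application.
    import boto3

    sqs = boto3.client("sqs")
    queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/inventory-updates"  # hypothetical

    # Producer side: the shipping application sends inventory updates to the queue.
    sqs.send_message(
        QueueUrl=queue_url,
        MessageBody='{"sku": "ABC-123", "warehouse": "DC-7", "quantity_delta": -2}',
    )

    # Consumer side: the inventory application drains the queue at its own pace.
    messages = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=20)
    for message in messages.get("Messages", []):
        # ... apply the update to the DynamoDB table ...
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])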
An application runs on Amazon EC2 instances in private subnets. The application requires access to a table in Amazon DynamoDB.
What is the MOST SECURE way to access the table without allowing traffic to leave the AWS network?
A. Use a VPC endpoint for DynamoDB.
B. Use a NAT gateway in a public subnet.
C. Use a NAT instance in a private subnet.
D. Use the internet gateway attached to the VPC.
A. Use a VPC endpoint for DynamoDB.
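A minimal boto3 sketch of the answer, with hypothetical VPC and route table IDs: a gateway VPC endpoint for DynamoDB keeps the traffic on the AWS network, and the private subnets' route tables pick up the endpoint routes.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Gateway endpoint for DynamoDB; attach it to the private subnets' route tables.
    ec2.create_vpc_endpoint(
        VpcEndpointType="Gateway",
        VpcId="vpc-0abc1234",
        ServiceName="com.amazonaws.us-east-1.dynamodb",
        RouteTableIds=["rtb-0priv1234"],
    )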
A business runs an ASP.NET MVC application on a single Amazon EC2 instance. Because of a recent spike in usage, users experience slow response times during lunch hours. The company must resolve this issue with the least possible configuration.
What should a solutions architect recommend to meet these requirements?
A. Move the application to AWS Elastic Beanstalk. Configure load-based auto scaling and time-based scaling to handle scaling during lunch hours.
B. Move the application to Amazon Elastic Container Service (Amazon ECS). Create an AWS Lambda function to handle scaling during lunch hours.
C. Move the application to Amazon Elastic Container Service (Amazon ECS). Configure scheduled scaling for AWS Application Auto Scaling during lunch hours.
D. Move the application to AWS Elastic Beanstalk. Configure load-based auto scaling, and create an AWS Lambda function to handle scaling during lunch hours.
A. Move the application to AWS Elastic Beanstalk. Configure load-based auto scaling and time-based scaling to handle scaling during lunch hours.
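A rough sketch of the Elastic Beanstalk scaling settings from the answer, applied via boto3. The environment name, resource name, thresholds, and schedule are all hypothetical examples, and the option namespaces should be checked against the Elastic Beanstalk documentation before use.
    import boto3

    eb = boto3.client("elasticbeanstalk")

    # Scheduled (time-based) scaling ahead of the lunch-hour spike, plus
    # CPU-based triggers for load-based scaling.
    eb.update_environment(
        EnvironmentName="mvc-app-prod",
        OptionSettings=[
            {"ResourceName": "LunchScaleUp",
             "Namespace": "aws:autoscaling:scheduledaction",
             "OptionName": "MinSize", "Value": "2"},
            {"ResourceName": "LunchScaleUp",
             "Namespace": "aws:autoscaling:scheduledaction",
             "OptionName": "MaxSize", "Value": "6"},
            {"ResourceName": "LunchScaleUp",
             "Namespace": "aws:autoscaling:scheduledaction",
             "OptionName": "Recurrence", "Value": "0 11 * * *"},
            {"Namespace": "aws:autoscaling:trigger",
             "OptionName": "MeasureName", "Value": "CPUUtilization"},
            {"Namespace": "aws:autoscaling:trigger",
             "OptionName": "UpperThreshold", "Value": "70"},
            {"Namespace": "aws:autoscaling:trigger",
             "OptionName": "LowerThreshold", "Value": "30"},
        ],
    )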
A business is migrating its on-premises application to AWS. The application consists of application servers and a Microsoft SQL Server database. The database cannot be migrated to a different engine because the application’s .NET code relies on SQL Server features. The company wants to maximize availability while reducing operational and management overhead.
What should a solutions architect do to achieve this?
A. Install SQL Server on Amazon EC2 in a Multi-AZ deployment.
B. Migrate the data to Amazon RDS for SQL Server in a Multi-AZ deployment.
C. Deploy the database on Amazon RDS for SQL Server with Multi-AZ Replicas.
D. Migrate the data to Amazon RDS for SQL Server in a cross-Region Multi-AZ deployment.
B. Migrate the data to Amazon RDS for SQL Server in a Multi-AZ deployment.
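A minimal boto3 sketch of the target database from the answer; the identifier, edition, sizing, and credentials are hypothetical, and the actual data migration would be handled separately (for example with AWS DMS or native backup/restore).
    import boto3

    rds = boto3.client("rds")

    # Multi-AZ RDS for SQL Server instance as the migration target.
    rds.create_db_instance(
        DBInstanceIdentifier="accounting-sqlserver",
        Engine="sqlserver-se",
        LicenseModel="license-included",
        DBInstanceClass="db.m5.xlarge",
        AllocatedStorage=200,
        MasterUsername="admin",
        MasterUserPassword="REPLACE_ME",
        MultiAZ=True,
        StorageEncrypted=True,
    )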
A business uses Amazon Redshift to run analytics and generate customer reports. The company recently acquired an additional 50 terabytes of customer demographic data, which is stored in Amazon S3 as .csv files. The company needs a solution that efficiently joins this data with the existing data and visualizes the results.
What should a solutions architect recommend to meet these requirements?
A. Use Amazon Redshift Spectrum to query the data in Amazon S3 directly and join that data with the existing data in Amazon Redshift. Use Amazon QuickSight to build the visualizations.
B. Use Amazon Athena to query the data in Amazon S3. Use Amazon QuickSight to join the data from Athena with the existing data in Amazon Redshift and to build the visualizations.
C. Increase the size of the Amazon Redshift cluster, and load the data from Amazon S3. Use Amazon EMR Notebooks to query the data and build the visualizations in Amazon Redshift.
D. Export the data from the Amazon Redshift cluster into Apache Parquet files in Amazon S3. Use Amazon Elasticsearch Service (Amazon ES) to query the data. Use Kibana to visualize the results.
A. Use Amazon Redshift Spectrum to query the data in Amazon S3 directly and join that data with the existing data in Amazon Redshift. Use Amazon QuickSight to build the visualizations.
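A sketch of the Redshift Spectrum part of the answer, submitted through the Redshift Data API. It assumes the .csv files are already cataloged as a table in an AWS Glue database; the cluster, database, IAM role, and table names are all hypothetical, and the visualization itself would be built in Amazon QuickSight on top of queries like the join below.
    import boto3

    redshift_data = boto3.client("redshift-data")

    # External schema over the Glue Data Catalog database that describes the S3 data.
    create_schema_sql = """
    CREATE EXTERNAL SCHEMA IF NOT EXISTS demographics_ext
    FROM DATA CATALOG DATABASE 'demographics'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-spectrum-role'
    CREATE EXTERNAL DATABASE IF NOT EXISTS
    """

    # Join the S3-resident demographic data with the existing Redshift table.
    join_sql = """
    SELECT c.customer_id, c.total_spend, d.age_band, d.region
    FROM analytics.customers c
    JOIN demographics_ext.customer_demographics d
      ON c.customer_id = d.customer_id
    """

    redshift_data.batch_execute_statement(
        ClusterIdentifier="analytics-cluster",
        Database="analytics",
        DbUser="analytics_user",
        Sqls=[create_schema_sql, join_sql],
    )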