SSA Flashcards
A hospital has a mission-critical application that uses a RESTful API powered by Amazon API Gateway and AWS Lambda. Medical officers upload PDF reports to the system, which are then stored as static media content in an Amazon S3 bucket.
The security team wants to improve its visibility into cyber-attacks and ensure HIPAA (Health Insurance Portability and Accountability Act) compliance. The hospital is searching for a solution that continuously monitors object-level S3 API operations and identifies protected health information (PHI) in the reports, with minimal changes to the existing Lambda function.
Which of the following solutions will meet these requirements with the LEAST operational overhead?
Use Amazon Textract Medical with PII redaction turned on to extract and filter sensitive text from the PDF reports. Create a new Lambda function that calls the regular Amazon Comprehend API to identify the PHI from the extracted text.
Use Amazon Textract to extract the text from the PDF reports. Integrate Amazon Comprehend Medical with the existing Lambda function to identify the PHI from the extracted text.
Use Amazon Transcribe to read and analyze the PDF reports using the StartTranscriptionJob API operation.
Use Amazon SageMaker Ground Truth to label and detect protected health information (PHI) content with low-confidence predictions.
Use Amazon Rekognition to extract the text data from the PDF reports. Integrate the Amazon Comprehend Medical service with the existing Lambda functions to identify the PHI from the extracted text.
Use Amazon Textract to extract the text from the PDF reports. Integrate Amazon Comprehend Medical with the existing Lambda function to identify the PHI from the extracted text.
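A minimal boto3 sketch of the Textract plus Comprehend Medical flow (the bucket name, object key, and polling interval are illustrative placeholders; PDFs require Textract's asynchronous text-detection API):

import time
import boto3

textract = boto3.client("textract")
comprehend_medical = boto3.client("comprehendmedical")

# Start asynchronous text detection on the PDF report stored in S3.
job = textract.start_document_text_detection(
    DocumentLocation={"S3Object": {"Bucket": "hospital-reports", "Name": "report-001.pdf"}}
)

# Poll until the Textract job finishes (a sketch; production code would use an SNS notification instead).
while True:
    result = textract.get_document_text_detection(JobId=job["JobId"])
    if result["JobStatus"] != "IN_PROGRESS":
        break
    time.sleep(5)

text = " ".join(
    block["Text"] for block in result["Blocks"] if block["BlockType"] == "LINE"
)

# Comprehend Medical flags protected health information in the extracted text.
for entity in comprehend_medical.detect_phi(Text=text)["Entities"]:
    print(entity["Type"], entity["Text"], entity["Score"])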
A company has a web-based order processing system that currently uses a standard queue in Amazon SQS. The IT Manager noticed many cases in which an order was processed twice. This issue has caused significant processing problems and made customers unhappy. The manager has asked you to ensure that this issue does not recur.
What can you do to prevent this from happening again in the future? (Select TWO.)
Change the message size in SQS.
Alter the visibility timeout of SQS.
Alter the retention period in Amazon SQS.
Replace Amazon SQS and instead, use Amazon Simple Workflow service.
Use an Amazon SQS FIFO Queue instead.
Replace Amazon SQS and instead, use Amazon Simple Workflow service.
Use an Amazon SQS FIFO Queue instead.
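A minimal boto3 sketch of switching to a FIFO queue with deduplication (the queue name and message body are placeholders):

import boto3

sqs = boto3.client("sqs")

# FIFO queue names must end in ".fifo"; content-based deduplication lets SQS
# discard duplicate copies of the same order within the 5-minute dedup window.
queue = sqs.create_queue(
    QueueName="orders.fifo",
    Attributes={"FifoQueue": "true", "ContentBasedDeduplication": "true"},
)

sqs.send_message(
    QueueUrl=queue["QueueUrl"],
    MessageBody='{"order_id": "12345"}',
    MessageGroupId="orders",  # messages in the same group are delivered in order
)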
A company launched an EC2 instance in a newly created VPC. They noticed that the launched instance does not have an associated DNS hostname.
Which of the following options could be a valid reason for this issue?
The newly created VPC has an invalid CIDR block.
Amazon Route 53 is not enabled.
The DNS resolution and DNS hostnames options in the VPC configuration should be enabled.
The security group of the EC2 instance needs to be modified.
The DNS resolution and DNS hostnames options in the VPC configuration should be enabled.
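A minimal boto3 sketch of enabling both VPC DNS attributes (the VPC ID is a placeholder; each attribute must be modified in a separate call):

import boto3

ec2 = boto3.client("ec2")
vpc_id = "vpc-0123456789abcdef0"  # placeholder

# Both attributes must be enabled for instances in the VPC to receive DNS hostnames.
ec2.modify_vpc_attribute(VpcId=vpc_id, EnableDnsSupport={"Value": True})
ec2.modify_vpc_attribute(VpcId=vpc_id, EnableDnsHostnames={"Value": True})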
To save costs, your manager instructed you to analyze and review the setup of your AWS cloud infrastructure. You should also provide an estimate of how much the company will pay for all of the AWS resources it is using.
In this scenario, which of the following will incur costs? (Select TWO.)
A running EC2 Instance
A stopped On-Demand EC2 Instance
Public Data Set
Using an Amazon VPC
EBS Volumes attached to stopped EC2 Instances
A running EC2 Instance
EBS Volumes attached to stopped EC2 Instances
A tech company has an on-premises infrastructure. They are running low on storage and want to be able to extend their storage using the AWS cloud.
Which AWS service can help them achieve this requirement?
AWS Storage Gateway
Amazon EC2
Amazon SQS
Amazon Elastic Block Store (EBS)
AWS Storage Gateway
What is AWS Storage Gateway?
AWS Storage Gateway connects an on-premises software appliance with cloud-based storage to provide seamless integration, with data security features, between the on-premises environment and the AWS storage infrastructure.
A company has a set of Linux servers running on multiple On-Demand EC2 Instances. The Audit team wants to collect and process the application log files generated from these servers for their report.
Which of the following services is best to use in this case?
A single On-Demand Amazon EC2 instance for both storing and processing the log files
Amazon S3 Glacier for storing the application log files and Spot EC2 Instances for processing them.
Amazon S3 Glacier Deep Archive for storing the application log files and AWS ParallelCluster for processing the log files.
Amazon S3 for storing the application log files and Amazon Elastic MapReduce for processing the log files.
Amazon S3 for storing the application log files and Amazon Elastic MapReduce for processing the log files.
A company is using an Auto Scaling group that is configured to launch new t2.micro EC2 instances when there is a significant increase in the application's load. To cope with the demand, you now need to replace those instances with the larger t2.2xlarge instance type.
How would you implement this change?
Change the instance type of each EC2 instance manually.
Create a new version of the launch template with the new instance type and update the Auto Scaling Group.
Create another Auto Scaling Group and attach the new instance type.
Just change the instance type to t2.2xlarge in the current launch template.
Create a new version of the launch template with the new instance type and update the Auto Scaling Group.
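A minimal boto3 sketch of creating a new launch template version and pointing the Auto Scaling group at it (the template and group names are placeholders):

import boto3

ec2 = boto3.client("ec2")
autoscaling = boto3.client("autoscaling")

# Create a new launch template version that overrides only the instance type.
version = ec2.create_launch_template_version(
    LaunchTemplateName="web-app-template",
    SourceVersion="1",
    LaunchTemplateData={"InstanceType": "t2.2xlarge"},
)["LaunchTemplateVersion"]["VersionNumber"]

# Point the Auto Scaling group at the new version; newly launched instances use t2.2xlarge.
autoscaling.update_auto_scaling_group(
    AutoScalingGroupName="web-app-asg",
    LaunchTemplate={
        "LaunchTemplateName": "web-app-template",
        "Version": str(version),
    },
)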
A media company needs to configure an Amazon S3 bucket to serve static assets for the public-facing web application. Which methods ensure that all of the objects uploaded to the S3 bucket can be read publicly all over the Internet? (Select TWO.)
Create an IAM role to set the objects inside the S3 bucket to public read.
Configure the S3 bucket policy to set all objects to public read.
Configure the cross-origin resource sharing (CORS) of the S3 bucket to allow objects to be publicly accessible from all domains.
Do nothing. Amazon S3 objects are already public by default.
Grant public read access to the object when uploading it using the S3 Console.
Configure the S3 bucket policy to set all objects to public read.
Grant public read access to the object when uploading it using the S3 Console.
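A minimal boto3 sketch of both approaches (the bucket and object names are placeholders; S3 Block Public Access must also be turned off for public reads to take effect):

import json
import boto3

s3 = boto3.client("s3")

# Bucket policy that grants anonymous read access to every object in the bucket.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::my-static-assets/*",
    }],
}
s3.put_bucket_policy(Bucket="my-static-assets", Policy=json.dumps(policy))

# Alternatively, grant public read per object at upload time.
s3.put_object(
    Bucket="my-static-assets", Key="logo.png", Body=b"...", ACL="public-read"
)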
A company has hundreds of VPCs with multiple VPN connections to their data centers spanning 5 AWS Regions. As the number of its workloads grows, the company must be able to scale its networks across multiple accounts and VPCs to keep up. A Solutions Architect is tasked to interconnect all of the company’s on-premises networks, VPNs, and VPCs into a single gateway, which includes support for inter-region peering across multiple AWS regions.
Which of the following is the BEST solution that the architect should set up to support the required interconnectivity?
Set up an AWS Transit Gateway in each region to interconnect all networks within it. Then, route traffic between the transit gateways through a peering connection.
Set up an AWS Direct Connect Gateway to achieve inter-region VPC access to all of the AWS resources and on-premises data centers. Set up a link aggregation group (LAG) to aggregate multiple connections at a single AWS Direct Connect endpoint in order to treat them as a single, managed connection. Launch a virtual private gateway in each VPC and then create a public virtual interface for each AWS Direct Connect connection to the Direct Connect Gateway.
Set up an AWS VPN CloudHub for inter-region VPC access and a Direct Connect gateway for the VPN connections to the on-premises data centers. Create a virtual private gateway in each VPC, then create a private virtual interface for each AWS Direct Connect connection to the Direct Connect gateway.
Enable inter-region VPC peering that allows peering relationships to be established between multiple VPCs across different AWS regions. Set up a networking configuration that ensures that the traffic will always stay on the global AWS backbone and never traverse the public Internet.
Set up an AWS Transit Gateway in each region to interconnect all networks within it. Then, route traffic between the transit gateways through a peering connection.
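A minimal boto3 sketch of peering two transit gateways across regions (the gateway IDs, account ID, and regions are placeholders):

import boto3

ec2_use1 = boto3.client("ec2", region_name="us-east-1")

# Request a peering attachment between the us-east-1 and eu-west-1 transit gateways.
attachment = ec2_use1.create_transit_gateway_peering_attachment(
    TransitGatewayId="tgw-0aaa1111bbbb22222",
    PeerTransitGatewayId="tgw-0ccc3333dddd44444",
    PeerAccountId="111122223333",
    PeerRegion="eu-west-1",
)

# The peer region must accept the attachment before routes can be added to it.
ec2_euw1 = boto3.client("ec2", region_name="eu-west-1")
ec2_euw1.accept_transit_gateway_peering_attachment(
    TransitGatewayAttachmentId=attachment["TransitGatewayPeeringAttachment"][
        "TransitGatewayAttachmentId"
    ]
)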
A leading IT consulting company has an application in which an Amazon ECS cluster processes a large stream of financial data and then stores the results in a DynamoDB table. You have to design a solution that detects new entries in the DynamoDB table and automatically triggers a Lambda function to run tests that verify the processed data.
What solution can be easily implemented to alert the Lambda function of new entries while requiring minimal configuration change to your architecture?
Invoke the Lambda functions using SNS each time the ECS Cluster successfully processes financial data.
Use Systems Manager Automation to detect new entries in the DynamoDB table then automatically invoke the Lambda function for processing.
Use CloudWatch Alarms to trigger the Lambda function whenever a new entry is created in the DynamoDB table.
Enable DynamoDB Streams to capture table activity and automatically trigger the Lambda function.
Enable DynamoDB Streams to capture table activity and automatically trigger the Lambda function.
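A minimal sketch of the Lambda handler invoked by the table's stream, assuming the stream view type includes new images and an event source mapping wires the stream to the function (the validation routine is hypothetical):

# The handler receives batches of records in the standard DynamoDB Streams event shape.
def lambda_handler(event, context):
    for record in event["Records"]:
        # Only newly created items carry the INSERT event name.
        if record["eventName"] == "INSERT":
            new_item = record["dynamodb"]["NewImage"]
            verify_processed_data(new_item)


def verify_processed_data(item):
    # Placeholder for the tests that verify the processed financial data.
    print("validating", item)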
A company is using an On-Demand EC2 instance to host a legacy web application that uses an Amazon Instance Store-Backed AMI. The web application should be decommissioned as soon as possible, so you need to terminate the EC2 instance.
When the instance is terminated, what happens to the data on the root volume?
Data is automatically saved as an EBS snapshot.
Data is automatically saved as an EBS volume.
Data is automatically deleted.
Data is unavailable until the instance is restarted.
Data is automatically deleted.
A company conducts performance testing on a t3.large MySQL RDS DB instance twice a week. They use Performance Insights to analyze and fine-tune expensive queries. The company needs to reduce its operational expense in running the tests without compromising the tests’ integrity.
Which of the following is the most cost-effective solution?
Once the testing is completed, take a snapshot of the database and terminate it. Restore the database from the snapshot when necessary.
Stop the database once the test is done and restart it only when necessary.
Perform a mysqldump to get a copy of the database on a local machine. Use MySQL Workbench to analyze the queries.
Downgrade the database instance to t3.small.
Once the testing is completed, take a snapshot of the database and terminate it. Restore the database from the snapshot when necessary.
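A minimal boto3 sketch of the snapshot-and-restore cycle (the instance and snapshot identifiers are placeholders):

import boto3

rds = boto3.client("rds")

# After a test run: take a final snapshot and delete the instance to stop paying for compute.
rds.delete_db_instance(
    DBInstanceIdentifier="perf-test-db",
    SkipFinalSnapshot=False,
    FinalDBSnapshotIdentifier="perf-test-db-final",
)

# Before the next test run: restore a new instance from that snapshot.
rds.restore_db_instance_from_db_snapshot(
    DBInstanceIdentifier="perf-test-db",
    DBSnapshotIdentifier="perf-test-db-final",
    DBInstanceClass="db.t3.large",
)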
A popular augmented reality (AR) mobile game heavily uses a RESTful API hosted in AWS. The API uses Amazon API Gateway and a DynamoDB table with preconfigured read and write capacity. Based on your systems monitoring, the DynamoDB table begins to throttle requests during peak loads, which slows down the performance of the game.
Which of the following can you do to improve the performance of your app?
Create an SQS queue in front of the DynamoDB table.
Integrate an Application Load Balancer with your DynamoDB table.
Add the DynamoDB table to an Auto Scaling Group.
Use DynamoDB auto scaling.
Use DynamoDB auto scaling.
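A minimal boto3 sketch of DynamoDB auto scaling via Application Auto Scaling (the table name and capacity limits are placeholders):

import boto3

autoscaling = boto3.client("application-autoscaling")

# Register the table's write capacity as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="dynamodb",
    ResourceId="table/game-sessions",
    ScalableDimension="dynamodb:table:WriteCapacityUnits",
    MinCapacity=5,
    MaxCapacity=500,
)

# Target tracking keeps consumed capacity near 70% of provisioned capacity.
autoscaling.put_scaling_policy(
    PolicyName="game-sessions-write-scaling",
    ServiceNamespace="dynamodb",
    ResourceId="table/game-sessions",
    ScalableDimension="dynamodb:table:WriteCapacityUnits",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "DynamoDBWriteCapacityUtilization"
        },
    },
)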
A company decided to change its third-party data analytics tool to a cheaper solution. They sent a full data export in a CSV file that contains all of their analytics information. You then saved the CSV file to an S3 bucket for storage. Your manager asked you to do some validation on the provided data export.
In this scenario, what is the most cost-effective and easiest way to analyze export data using standard SQL?
Create a migration tool to load the CSV export file from S3 to a DynamoDB instance. Once the data has been loaded, run queries using DynamoDB.
Use mysqldump client utility to load the CSV export file from S3 to a MySQL RDS instance. Run some SQL queries once the data has been loaded to complete your validation.
To be able to run SQL queries, use Amazon Athena to analyze the export data file in S3.
Use a migration tool to load the CSV export file from S3 to a database that is designed for online analytic processing (OLAP) such as AWS RedShift. Run some queries once the data has been loaded to complete your validation.
To be able to run SQL queries, use Amazon Athena to analyze the export data file in S3.
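A minimal boto3 sketch of running a validation query with Athena, assuming an external table has already been defined over the CSV export (for example, via a CREATE EXTERNAL TABLE statement or an AWS Glue crawler); the database, table, and result bucket names are placeholders:

import boto3

athena = boto3.client("athena")

query = athena.start_query_execution(
    QueryString="SELECT COUNT(*) FROM analytics_export WHERE revenue IS NULL",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-query-results/"},
)
print("query id:", query["QueryExecutionId"])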
A company has a global news website hosted on a fleet of EC2 Instances. Lately, the load on the website has increased, resulting in slower response times for site visitors. This issue impacts the company's revenue, as some readers tend to leave the site if it does not load within 10 seconds.
Which of the following AWS services can be used to solve this problem? (Select TWO.)
Use Amazon CloudFront with the website as the custom origin.
For better read throughput, use AWS Storage Gateway to distribute the content across multiple regions.
Use Amazon ElastiCache for the website’s in-memory data store or cache.
Deploy the website to all regions in different VPCs for faster processing.
Use Amazon CloudFront with the website as the custom origin.
Use Amazon ElastiCache for the website’s in-memory data store or cache.
A company needs to integrate the Lightweight Directory Access Protocol (LDAP) directory service from its on-premises data center with its AWS VPC using IAM. The identity store currently in use is not compatible with SAML.
Which of the following provides the most valid approach to implement the integration?
Develop an on-premises custom identity broker application and use STS to issue short-lived AWS credentials.
Use AWS Single Sign-On (SSO) service to enable single sign-on between AWS and your LDAP.
Use an IAM policy that references the LDAP identifiers and AWS credentials.
Use IAM roles to rotate the IAM credentials whenever LDAP credentials are updated.
Develop an on-premises custom identity broker application and use STS to issue short-lived AWS credentials.
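A minimal sketch of the broker's STS call, assuming the broker has already authenticated the user against the on-premises LDAP directory (the federated user name and inline policy are placeholders):

import boto3

sts = boto3.client("sts")

# Issue short-lived credentials for the authenticated user, scoped by an inline policy.
federation = sts.get_federation_token(
    Name="ldap-user-jdoe",
    DurationSeconds=3600,
    Policy='{"Version":"2012-10-17","Statement":[{"Effect":"Allow",'
           '"Action":"s3:ListBucket","Resource":"*"}]}',
)
credentials = federation["Credentials"]  # AccessKeyId, SecretAccessKey, SessionToken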
A startup is planning to set up and govern a secure, compliant, multi-account AWS environment in preparation for its upcoming projects. The IT Manager requires the solution to have a dashboard for continuous detection of policy non-conformance and non-compliant resources across the enterprise, as well as to comply with the AWS multi-account strategy best practices.
Which of the following offers the easiest way to fulfill this task?
Use AWS Organizations to build a landing zone to automatically provision new AWS accounts. Utilize the AWS Personal Health Dashboard to see provisioned accounts across your enterprise. Enable preventive and detective guardrails for policy enforcement.
Launch new AWS member accounts using AWS CloudFormation StackSets. Use AWS Config to continuously track the configuration changes and set rules to monitor non-compliant resources. Set up a Multi-Account Multi-Region Data Aggregator to monitor compliance data for rules and accounts in an aggregated view.
Use AWS Service Catalog to launch new AWS member accounts. Configure AWS Service Catalog Launch Constraints to continuously track configuration changes and monitor non-compliant resources. Set up a Multi-Account Multi-Region Data Aggregator to monitor compliance data for rules and accounts in an aggregated view.
Use AWS Control Tower to launch a landing zone to automatically provision and configure new accounts through an Account Factory. Utilize the AWS Control Tower dashboard to monitor provisioned accounts across your enterprise. Set up preventive and detective guardrails for policy enforcement.
An organization plans to use an AWS Direct Connect connection to establish a dedicated connection between its on-premises network and AWS. The organization needs to launch a fully managed solution that will automate and accelerate the replication of data to and from various AWS storage services.
Which of the following solutions would you recommend?
Use an AWS Storage Gateway tape gateway to store data on virtual tape cartridges and asynchronously copy your backups to AWS.
Use an AWS DataSync agent to rapidly move the data over the Internet.
Use an AWS DataSync agent to rapidly move the data over a service endpoint.
Use an AWS Storage Gateway file gateway to store and retrieve files directly using the SMB file system protocol.
Use an AWS DataSync agent to rapidly move the data over a service endpoint.
What is AWS DataSync?
AWS DataSync automates and accelerates the replication of data between on-premises storage systems and AWS storage services.
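A minimal boto3 sketch of a DataSync task between an on-premises location (backed by a deployed DataSync agent) and an S3 location (the location ARNs and account ID are placeholders):

import boto3

datasync = boto3.client("datasync")

# A task copies data between a source location and a destination location.
task = datasync.create_task(
    SourceLocationArn="arn:aws:datasync:us-east-1:111122223333:location/loc-onprem",
    DestinationLocationArn="arn:aws:datasync:us-east-1:111122223333:location/loc-s3",
    Name="onprem-to-s3-replication",
)
datasync.start_task_execution(TaskArn=task["TaskArn"])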
A multinational bank is storing its confidential files in an S3 bucket. The security team recently performed an audit, and the report shows that multiple files have been uploaded without 256-bit Advanced Encryption Standard (AES) server-side encryption. For added protection, the encryption key must be automatically rotated every year. The solutions architect must ensure that there would be no other unencrypted files uploaded in the S3 bucket in the future.
Which of the following will meet these requirements with the LEAST operational overhead?
Create an S3 bucket policy that denies permissions to upload an object unless the request includes the "s3:x-amz-server-side-encryption": "AES256" header. Enable server-side encryption with Amazon S3-managed encryption keys (SSE-S3) and rely on the built-in key rotation feature of the SSE-S3 encryption keys.
Create a new customer-managed key (CMK) from the AWS Key Management Service (AWS KMS). Configure the default encryption behavior of the bucket to use the customer-managed key. Manually rotate the CMK each and every year.
Create an S3 bucket policy for the S3 bucket that rejects any object uploads unless the request includes the "s3:x-amz-server-side-encryption": "aws:kms" header. Enable the S3 Object Lock in compliance mode for all objects to automatically rotate the built-in AES256 customer-managed key of the bucket.
Create a Service Control Policy (SCP) for the S3 bucket that rejects any object uploads unless the request includes the "s3:x-amz-server-side-encryption": "AES256" header. Enable server-side encryption with Amazon S3-managed encryption keys (SSE-S3) and modify the built-in key rotation feature of the SSE-S3 encryption keys to rotate the key yearly.
Create an S3 bucket policy that denies permissions to upload an object unless the request includes the "s3:x-amz-server-side-encryption": "AES256" header. Enable server-side encryption with Amazon S3-managed encryption keys (SSE-S3) and rely on the built-in key rotation feature of the SSE-S3 encryption keys.
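A minimal boto3 sketch of this setup (the bucket name is a placeholder); with default bucket encryption enabled, objects uploaded without the header are still encrypted with SSE-S3:

import json
import boto3

s3 = boto3.client("s3")
bucket = "bank-confidential-files"  # placeholder

# Default every new object to SSE-S3 (AES-256).
s3.put_bucket_encryption(
    Bucket=bucket,
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
    },
)

# Deny any PutObject request whose encryption header asks for something other than AES256.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyNonAES256Uploads",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:PutObject",
        "Resource": f"arn:aws:s3:::{bucket}/*",
        "Condition": {
            "StringNotEquals": {"s3:x-amz-server-side-encryption": "AES256"}
        },
    }],
}
s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))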
A company launched a global news website that is deployed to AWS and is using MySQL RDS. The website has millions of viewers from all over the world, which means that the website has a read-heavy database workload. All database transactions must be ACID compliant to ensure data integrity.
In this scenario, which of the following is the best option to use to increase the read-throughput on the MySQL database?
Use SQS to queue up the requests
Enable Multi-AZ deployments
Enable Amazon RDS Standby Replicas
Enable Amazon RDS Read Replicas
Enable Amazon RDS Read Replicas
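A minimal boto3 sketch of adding a read replica (the instance identifiers and instance class are placeholders); read-heavy traffic is then directed at the replica endpoint while writes stay on the primary:

import boto3

rds = boto3.client("rds")

rds.create_db_instance_read_replica(
    DBInstanceIdentifier="news-db-replica-1",
    SourceDBInstanceIdentifier="news-db-primary",
    DBInstanceClass="db.r5.large",
)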
A food company bought 50 licenses of Windows Server to be used by the developers when launching Amazon EC2 instances to deploy and test applications. The developers are free to provision EC2 instances as long as there is a license available. The licenses are tied to the total CPU count of each virtual machine. The company wants to ensure that developers won’t be able to launch new instances once the licenses are exhausted. The company wants to receive notifications when all licenses are in use.
Which of the following options is the recommended solution to meet the company’s requirements?
Configure AWS Resource Access Manager (AWS RAM) to track and control the licenses used by AWS resources. Configure AWS RAM to provide available licenses for Amazon EC2 instances. Set up an Amazon SNS topic to send notifications and alerts once all the licenses are used.
Upload the licenses to AWS Systems Manager Fleet Manager to be encrypted and distributed to Amazon EC2 instances. Attach an IAM role to the EC2 instances to request a license from the Fleet Manager. Set up an Amazon SNS topic to send notifications and alerts once all the licenses are used.
Define license configuration rules on AWS Certificate Manager to track and control license usage. Enable the option to “Enforce certificate limit” to prevent going over the number of allocated licenses. Add an Amazon SQS queue with a visibility timeout configured to send notifications and alerts.
Define licensing rules on AWS License Manager to track and control license usage. Enable the option to “Enforce license limit” to prevent going over the number of allocated licenses. Add an Amazon SNS topic to send notifications and alerts.
Define licensing rules on AWS License Manager to track and control license usage. Enable the option to “Enforce license limit” to prevent going over the number of allocated licenses. Add an Amazon SNS topic to send notifications and alerts.
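A minimal boto3 sketch of a License Manager configuration with a hard limit (the name and counts are illustrative):

import boto3

license_manager = boto3.client("license-manager")

# Count licenses by vCPU and enforce a hard limit so that EC2 launches fail
# once the 50-license pool is exhausted.
config = license_manager.create_license_configuration(
    Name="windows-server-dev-licenses",
    LicenseCountingType="vCPU",
    LicenseCount=50,
    LicenseCountHardLimit=True,
)
print("license configuration ARN:", config["LicenseConfigurationArn"])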