AWS Cloud Practitioner Practice Exam Questions Flashcards
Which AWS Service can be used to mitigate a Distributed Denial of Service (DDoS) attack?
a. Amazon CloudWatch
b. AWS Systems Manager
c. AWS Shield
d. AWS Key Management Service (AWS KMS)
c. AWS Shield
AWS Shield is a managed Distributed Denial of Service (DDoS) protection service that safeguards applications running on AWS. AWS Shield provides always-on detection and automatic inline mitigations that minimize application downtime and latency, so there is no need to engage AWS Support to benefit from DDoS protection. There are two tiers of AWS Shield - Standard and Advanced.
All AWS customers benefit from the automatic protections of AWS Shield Standard, at no additional charge. AWS Shield Standard defends against most common, frequently occurring network and transport layer DDoS attacks that target your web site or applications. When you use AWS Shield Standard with Amazon CloudFront and Amazon Route 53, you receive comprehensive availability protection against all known infrastructure (Layer 3 and 4) attacks.
For higher levels of protection against attacks targeting your applications running on Amazon Elastic Compute Cloud (EC2), Elastic Load Balancing (ELB), Amazon CloudFront, AWS Global Accelerator and Amazon Route 53 resources, you can subscribe to AWS Shield Advanced. In addition to the network and transport layer protections that come with Standard, AWS Shield Advanced provides additional detection and mitigation against large and sophisticated DDoS attacks, near real-time visibility into attacks, and integration with AWS WAF, a web application firewall.
Which of the following statements are true about the AWS Shared Responsibility Model? (Select two)
a. AWS maintains the configuration of its infrastructure devices and is responsible for configuring the guest operating systems, databases, and applications
b. Amazon Elastic Compute Cloud (Amazon EC2) is categorized as Infrastructure as a Service (IaaS) and hence AWS will perform all of the necessary security configuration and management tasks
c. AWS trains AWS employees, but a customer must train their own employees
d. For abstracted services, such as Amazon S3, AWS operates the infrastructure layer, the operating system, platforms, encryption options, and appropriate permissions for accessing the S3 resources
e. AWS is responsible for patching and fixing flaws within the infrastructure, but customers are responsible for patching their guest operating system and applications
c. AWS trains AWS employees, but a customer must train their own employees
e. AWS is responsible for patching and fixing flaws within the infrastructure, but customers are responsible for patching their guest operating system and applications
“Security of the Cloud” is the responsibility of AWS - AWS is responsible for protecting the infrastructure that runs all of the services offered in the AWS Cloud. This infrastructure is composed of the hardware, software, networking, and facilities that run AWS Cloud services. As part of Patch Management, a Shared Control responsibility of the AWS Shared Responsibility Model, AWS is responsible for patching and fixing flaws within the infrastructure, but customers are responsible for patching their guest OS and applications.
“Security in the Cloud” is the responsibility of the customer. Customer responsibility will be determined by the AWS Cloud services that a customer selects. This determines the amount of configuration work the customer must perform as part of their security responsibilities.
As part of Awareness & Training, a Shared Control responsibility of the AWS Shared Responsibility Model, AWS trains AWS employees, but a customer must train their own employees.
AWS Shared Responsibility Model: https://aws.amazon.com/compliance/shared-responsibility-model/
Incorrect options:
AWS maintains the configuration of its infrastructure devices and is responsible for configuring the guest operating systems, databases, and applications - As part of Configuration Management, a Shared Control responsibility of the AWS Shared Responsibility Model, AWS maintains the configuration of its infrastructure devices, but a customer is responsible for configuring their own guest operating systems, databases, and applications.
Amazon Elastic Compute Cloud (Amazon EC2) is categorized as Infrastructure as a Service (IaaS) and hence AWS will perform all of the necessary security configuration and management tasks - A service such as Amazon Elastic Compute Cloud (Amazon EC2) is categorized as Infrastructure as a Service (IaaS) and, as such, requires the customer to perform all of the necessary security configuration and management tasks. Customers that deploy an Amazon EC2 instance are responsible for the management of the guest operating system (including updates and security patches), any application software or utilities installed by the customer on the instances, and the configuration of the AWS-provided firewall (called a security group) on each instance.
For abstracted services, such as Amazon S3, AWS operates the infrastructure layer, the operating system, platforms, encryption options, and appropriate permissions for accessing the S3 resources - For abstracted services, such as Amazon S3 and Amazon DynamoDB, AWS operates the infrastructure layer, the operating system, and platforms, and customers access the endpoints to store and retrieve data. Customers are responsible for managing their data (including encryption options), classifying their assets, and using IAM tools to apply the appropriate permissions.
Which of the following AWS authentication mechanisms supports an AWS Multi-Factor Authentication (AWS MFA) device that you can plug into a USB port on your computer?
a. SMS text message-based Multi-Factor Authentication (AWS MFA)
b. Hardware Multi-Factor Authentication (AWS MFA) device
c. U2F security key
d. Virtual Multi-Factor Authentication (AWS MFA) device
c. U2F security key
Universal 2nd Factor (U2F) Security Key is a device that you can plug into a USB port on your computer. U2F is an open authentication standard hosted by the FIDO Alliance. When you enable a U2F security key, you sign in by entering your credentials and then tapping the device instead of manually entering a code.
How to enable the U2F Security Key for your own IAM user: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_mfa_enable_u2f.html
Incorrect options:
Virtual Multi-Factor Authentication (AWS MFA) device - This is a software app that runs on a phone or other device and emulates a physical device. The device generates a six-digit numeric code based upon a time-synchronized one-time password algorithm. The user must type a valid code from the device on a second webpage during sign-in. Each virtual MFA device assigned to a user must be unique.
Hardware Multi-Factor Authentication (AWS MFA) device - This is a hardware device that generates a six-digit numeric code based upon a time-synchronized one-time password algorithm. The user must type a valid code from the device on a second webpage during sign-in. Each MFA device assigned to a user must be unique. A user cannot type a code from another user’s device to be authenticated.
SMS text message-based Multi-Factor Authentication (AWS MFA) - This is a type of MFA in which the IAM user settings include the phone number of the user’s SMS-compatible mobile device. When the user signs in, AWS sends a six-digit numeric code by SMS text message to the user’s mobile device. The user is required to type that code on a second webpage during sign-in.
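As a side note, MFA devices can also be associated with IAM users programmatically, not just through the console. Below is a minimal, illustrative boto3 sketch for a virtual MFA device; the device name, user name, and the two consecutive codes are placeholders, not values from this question.

```python
import boto3

iam = boto3.client("iam")

# Create a virtual MFA device (the response includes a seed/QR code to load
# into an authenticator app). The device name is a placeholder.
device = iam.create_virtual_mfa_device(VirtualMFADeviceName="example-user-mfa")
serial = device["VirtualMFADevice"]["SerialNumber"]

# Enable the device for a user by supplying two consecutive codes from the app.
# "example-user" and the codes below are placeholders for illustration only.
iam.enable_mfa_device(
    UserName="example-user",
    SerialNumber=serial,
    AuthenticationCode1="123456",
    AuthenticationCode2="789012",
)
```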
A company would like to optimize Amazon Elastic Compute Cloud (Amazon EC2) costs. Which of the following actions can help with this task? (Select TWO)
a. Build its own servers
b. Vertically scale the EC2 instances
c. Set up Auto Scaling groups to align the number of instances with the demand
d. Purchase Amazon EC2 Reserved instances (RIs)
e. Opt for a higher AWS Support plan
c. Set up Auto Scaling groups to align the number of instances with the demand
d. Purchase Amazon EC2 Reserved instances (RIs)
An Auto Scaling group contains a collection of Amazon EC2 instances that are treated as a logical grouping for automatic scaling and management. You can adjust its size to meet demand, either manually or by using automatic scaling.
AWS Auto Scaling can help you optimize your utilization and cost efficiencies when consuming AWS services so you only pay for the resources you need.
How AWS Auto Scaling works: https://aws.amazon.com/autoscaling/
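To make the cost-optimization point concrete, here is a minimal boto3 sketch that attaches a target-tracking scaling policy to an existing Auto Scaling group so the number of instances follows demand. The group name and target value are illustrative assumptions, not part of the question.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Assumes an Auto Scaling group named "web-asg" already exists; the name and
# target value below are placeholders.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="keep-cpu-near-50-percent",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,  # add/remove instances to hold average CPU around 50%
    },
)
```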
Amazon EC2 Reserved Instances (RI) provide a significant discount (up to 72%) compared to On-Demand pricing and provide a capacity reservation when used in a specific Availability Zone (AZ).
EC2 Pricing Options Overview: https://aws.amazon.com/ec2/pricing/
Incorrect options:
Vertically scale the EC2 instances - Vertically scaling EC2 instances (increasing a single machine's performance by adding CPUs, memory, and storage) has hard limits and is generally more expensive than scaling horizontally (adding more machines to the system).
Opt for a higher AWS Support plan - The AWS Support plans do not help with EC2 costs.
Build its own servers - Building your own servers is more expensive than using EC2 instances in the cloud; you are far more likely to increase costs than to reduce them.
Which of the following is the correct statement regarding the AWS Storage services?
a. Amazon Simple Storage Service (Amazon S3) is file based storage, Amazon Elastic Block Store (Amazon EBS) is block based storage and Amazon Elastic File System (Amazon EFS) is object based storage
b. Amazon Simple Storage Service (Amazon S3) is block based storage, Amazon Elastic Block Store (Amazon EBS) is object based storage and Amazon Elastic File System (Amazon EFS) is file based storage
c. Amazon Simple Storage Service (Amazon S3) is object based storage, Amazon Elastic Block Store (Amazon EBS) is file based storage and Amazon Elastic File System (Amazon EFS) is block based storage
d. Amazon Simple Storage Service (Amazon S3) is object based storage, Amazon Elastic Block Store (Amazon EBS) is block based storage and Amazon Elastic File System (Amazon EFS) is file based storage
d. Amazon Simple Storage Service (Amazon S3) is object based storage, Amazon Elastic Block Store (Amazon EBS) is block based storage and Amazon Elastic File System (Amazon EFS) is file based storage
Amazon Elastic File System (Amazon EFS) provides a simple, scalable, fully managed, elastic NFS file system.
Amazon Elastic Block Store (Amazon EBS) is an easy to use, high-performance block storage service designed for use with Amazon Elastic Compute Cloud (Amazon EC2) for both throughput and transaction-intensive workloads at any scale.
Amazon Simple Storage Service (Amazon S3) is an object storage service that offers industry-leading scalability, data availability, security, and performance.
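The distinction is easiest to see in how each service is accessed. As a small illustration (the bucket, key, volume ID, and instance ID below are assumptions), S3 objects are read and written whole over an API, whereas an EBS volume is attached to an EC2 instance as a block device and an EFS file system is mounted over NFS and used through normal file operations.

```python
import boto3

s3 = boto3.client("s3")

# Object storage: whole objects are written and read by key over an HTTP API.
# "example-bucket" and the key are placeholders.
s3.put_object(Bucket="example-bucket", Key="reports/2024.csv", Body=b"col1,col2\n1,2\n")
obj = s3.get_object(Bucket="example-bucket", Key="reports/2024.csv")
print(obj["Body"].read())

# Block storage: an EBS volume is attached to an instance and then used like a
# local disk (IDs are placeholders; in practice both must be in the same AZ).
ec2 = boto3.client("ec2")
ec2.attach_volume(
    VolumeId="vol-0123456789abcdef0",
    InstanceId="i-0123456789abcdef0",
    Device="/dev/sdf",
)
```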
Which of the following is the least effort way to encrypt data for AWS services only in your AWS account using AWS Key Management Service (KMS)?
a. Use AWS KMS APIs to encrypt data within your own application by using the AWS Encryption SDK
b. Use AWS managed master keys that are automatically created in your account for each service
c. Use AWS owned CMK in the service you wish to use encryption
d. Create your own customer managed keys (CMKs) in AWS KMS
b. Use AWS managed master keys that are automatically created in your account for each service
AWS KMS keys (KMS keys) are the primary resource in AWS KMS. You can use a KMS key to encrypt, decrypt, and re-encrypt data. It can also generate data keys that you can use outside of AWS KMS. AWS KMS is replacing the term customer master key (CMK) with AWS KMS key and KMS key.
AWS managed CMKs are CMKs in your account that are created, managed, and used on your behalf by an AWS service that is integrated with AWS KMS. Some AWS services support only an AWS managed CMK. Others use an AWS owned CMK or offer you a choice of CMKs. AWS managed CMK can be used only for your AWS account.
You can view the AWS managed CMKs in your account, view their key policies, and audit their use in AWS CloudTrail logs. However, you cannot manage these CMKs, rotate them, or change their key policies. And, you cannot use AWS managed CMKs in cryptographic operations directly; the service that creates them uses them on your behalf.
AWS managed CMKs appear on the AWS managed keys page of the AWS Management Console for AWS KMS. You can also identify most AWS managed CMKs by their aliases, which have the format aws/service-name, such as aws/redshift.
You do not pay a monthly fee for AWS managed CMKs. They can be subject to fees for use in excess of the free tier, but some AWS services cover these costs for you.
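To illustrate the "least effort" point, the AWS managed keys already present in an account can simply be listed and then referenced by their aws/service-name aliases. A minimal boto3 sketch (no names assumed; it only reads what already exists):

```python
import boto3

kms = boto3.client("kms")

# List key aliases and show the AWS managed ones (aliases starting with "aws/").
# These keys are created and used on your behalf by integrated services such as
# Amazon S3 (aws/s3) or Amazon Redshift (aws/redshift); you cannot rotate or
# manage them yourself.
for alias in kms.list_aliases()["Aliases"]:
    if alias["AliasName"].startswith("alias/aws/"):
        print(alias["AliasName"], alias.get("TargetKeyId"))
```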
Incorrect options:
Create your own customer managed keys (CMKs) in AWS KMS - The AWS KMS keys that you create are customer managed keys. Customer managed keys are KMS keys in your AWS account that you create, own, and manage. You have full control over these KMS keys, including establishing and maintaining their key policies, IAM policies, and grants, enabling and disabling them, rotating their cryptographic material, adding tags, creating aliases that refer to the KMS keys, and scheduling the KMS keys for deletion.
Customer managed keys (CMKs) incur a monthly fee and a fee for use in excess of the free tier, and they count against the AWS KMS quotas for your account.
Use AWS KMS APIs to encrypt data within your own application by using the AWS Encryption SDK - AWS KMS APIs can also be accessed directly through the AWS KMS Command Line Interface or AWS SDK for programmatic access. AWS KMS APIs can also be used indirectly to encrypt data within your own applications by using the AWS Encryption SDK. This requires code changes and is not the easiest way to achieve encryption.
Use AWS owned CMK in the service you wish to use encryption - AWS owned CMKs are a collection of CMKs that an AWS service owns and manages for use in multiple AWS accounts. Although AWS owned CMKs are not in your AWS account, an AWS service can use its AWS owned CMKs to protect the resources in your account. AWS owned CMK can be used for multiple AWS accounts.
You do not need to create or manage the AWS owned CMKs. However, you cannot view, use, track, or audit them. You are not charged a monthly fee or usage fee for AWS owned CMKs and they do not count against the AWS KMS quotas for your account.
A company wants to have control over creating and using its own keys for encryption on AWS services. Which of the following can be used for this use-case?
a. customer managed key (CMK)
b. AWS Secrets Manager
c. AWS owned key
d. AWS managed key
a. customer managed key (CMK)
An AWS KMS key is a logical representation of a cryptographic key. A KMS key contains metadata, such as the key ID, key spec, key usage, creation date, description, and key state. Most importantly, it contains a reference to the key material that is used when you perform cryptographic operations with the KMS key.
The KMS keys that you create are customer managed keys. Customer managed keys are KMS keys in your AWS account that you create, own, and manage. You have full control over these KMS keys, including establishing and maintaining their key policies, IAM policies, and grants, enabling and disabling them, rotating their cryptographic material, adding tags, creating aliases that refer to the KMS keys, and scheduling the KMS keys for deletion.
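A minimal boto3 sketch of creating a customer managed key, giving it an alias, and enabling automatic rotation; the description and alias are illustrative assumptions.

```python
import boto3

kms = boto3.client("kms")

# Create a symmetric customer managed key that you own and control.
key = kms.create_key(Description="Key for encrypting application data")
key_id = key["KeyMetadata"]["KeyId"]

# Optionally give it a friendly alias and enable automatic key rotation.
kms.create_alias(AliasName="alias/app-data-key", TargetKeyId=key_id)
kms.enable_key_rotation(KeyId=key_id)
```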
Which of the following AWS services have data encryption automatically enabled? (Select two)
a. Amazon Elastic File System (Amazon EFS)
b. Amazon Redshift
c. Amazon Simple Storage Service (Amazon S3)
d. Amazon Elastic Block Store (Amazon EBS)
e. AWS Storage Gateway
c. Amazon Simple Storage Service (Amazon S3)
e. AWS Storage Gateway
All Amazon S3 buckets have encryption configured by default, and objects are automatically encrypted by using server-side encryption with Amazon S3 managed keys (SSE-S3). This encryption setting applies to all objects in your Amazon S3 buckets.
AWS Storage Gateway is a hybrid cloud storage service that gives you on-premises access to virtually unlimited cloud storage. All data transferred between the gateway and AWS storage is encrypted using SSL (for all three types of gateways - File, Volume and Tape Gateways).
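You can confirm the default encryption that S3 applies by querying a bucket's encryption configuration. A small boto3 sketch, assuming a bucket name as a placeholder:

```python
import boto3

s3 = boto3.client("s3")

# Every bucket reports a default server-side encryption configuration;
# for buckets left at the default, the algorithm is SSE-S3 (AES256).
config = s3.get_bucket_encryption(Bucket="example-bucket")
for rule in config["ServerSideEncryptionConfiguration"]["Rules"]:
    print(rule["ApplyServerSideEncryptionByDefault"]["SSEAlgorithm"])  # e.g. "AES256"
```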
Which of the following is available across all AWS Support plans?
a. Third-Party Software Support
b. Enhanced Technical Support with unlimited cases and unlimited contacts
c. Full set of AWS Trusted Advisor best practice checks
d. AWS Health Dashboard – Your account health
d. AWS Health Dashboard – Your account health
The full set of AWS Trusted Advisor best practice checks, Enhanced Technical Support with unlimited cases and unlimited contacts, and Third-Party Software Support are available only with the Business and Enterprise Support plans.
AWS Health Dashboard – Your account health is available for all Support plans.
AWS Health Dashboard – Your account health provides alerts and remediation guidance when AWS is experiencing events that may impact you.
With AWS Health Dashboard – Your account health, alerts are triggered by changes in the health of your AWS resources, giving you event visibility, and guidance to help quickly diagnose and resolve issues.
You can check https://health.aws.amazon.com/health/home for current status information.
Exam Alert:
Please review the differences between the AWS Developer Support, AWS Business Support, AWS Enterprise On-Ramp Support and AWS Enterprise Support plans as you can expect at least a couple of questions on the exam: https://aws.amazon.com/premiumsupport/plans/
Which of the following options is NOT a feature of Amazon Inspector?
a. Analyze against unintended network accessibility
b. Inspect running operating systems (OS) against known vulnerabilities
c. Automate security assessments
d. Track configuration changes
d. Track configuration changes
Tracking configuration changes is a feature of AWS Config.
AWS Config is a service that enables you to assess, audit, and evaluate the configurations of your AWS resources. Config continuously monitors and records your AWS resource configurations and allows you to automate the evaluation of recorded configurations against desired configurations.
How AWS Config works: https://aws.amazon.com/config/
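As a small illustration of what "tracking configuration changes" looks like in practice, AWS Config records point-in-time configuration items for each resource that you can query later. A hedged boto3 sketch, assuming Config is already recording and using a placeholder security group ID:

```python
import boto3

config = boto3.client("config")

# Retrieve the recorded configuration history for a specific resource;
# each item is a point-in-time snapshot of that resource's configuration.
history = config.get_resource_config_history(
    resourceType="AWS::EC2::SecurityGroup",
    resourceId="sg-0123456789abcdef0",  # placeholder resource ID
)
for item in history["configurationItems"]:
    print(item["configurationItemCaptureTime"], item["configurationItemStatus"])
```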
Incorrect options:
Automate security assessments
Analyze against unintended network accessibility
Inspect running operating systems (OS) against known vulnerabilities
These options are all features of Amazon Inspector.
Amazon Inspector is an automated security assessment service that helps improve the security and compliance of applications deployed on AWS. Amazon Inspector automatically assesses applications for exposure, vulnerabilities, and deviations from best practices.
Amazon Inspector security assessments help you check for unintended network accessibility of your Amazon EC2 instances and for vulnerabilities on those EC2 instances.
Amazon Inspector also offers predefined software called an agent that you can optionally install in the operating system of the EC2 instances that you want to assess. The agent monitors the behavior of the EC2 instances, including network, file system, and process activity. It also collects a wide set of behavior and configuration data (telemetry).
Which AWS service is used to store and commit code privately and also offer features for version control?
a. AWS CodeStar
b. AWS CodeBuild
c. AWS CodePipeline
d. AWS CodeCommit
d. AWS CodeCommit
AWS CodeCommit is a fully-managed source control service that hosts secure Git-based repositories. It makes it easy for teams to collaborate on code in a secure and highly scalable ecosystem. CodeCommit eliminates the need to operate your own source control system or worry about scaling its infrastructure. You can use CodeCommit to securely store anything from source code to binaries, and it works seamlessly with your existing Git tools.
AWS CodeCommit eliminates the need to host, maintain, back up, and scale your own source control servers. The service automatically scales to meet the growing needs of your project. AWS CodeCommit automatically encrypts your files in transit and at rest. AWS CodeCommit is integrated with AWS Identity and Access Management (AWS IAM) allowing you to customize user-specific access to your repositories.
AWS CodeCommit supports all Git commands and works with your existing Git tools. You can keep using your preferred development environment plugins, continuous integration/continuous delivery systems, and graphical clients with CodeCommit.
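A small boto3 sketch of creating a CodeCommit repository and retrieving its HTTPS clone URL, which can then be used with standard Git tooling; the repository name and description are assumptions.

```python
import boto3

codecommit = boto3.client("codecommit")

# Create a private Git repository and print the HTTPS URL used to clone it
# with standard Git commands (git clone, git push, etc.).
repo = codecommit.create_repository(
    repositoryName="example-service",
    repositoryDescription="Source for the example service",
)
print(repo["repositoryMetadata"]["cloneUrlHttp"])
```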
Incorrect options:
AWS CodePipeline - AWS CodePipeline is a fully managed continuous delivery service that helps you automate your release pipelines for fast and reliable application and infrastructure updates. CodePipeline automates the build, test, and deploy phases of your release process every time there is a code change, based on the release model you define. This enables you to rapidly and reliably deliver features and updates.
AWS CodeBuild - AWS CodeBuild is a fully managed continuous integration service that compiles source code, runs tests, and produces software packages that are ready to deploy. With CodeBuild, you don’t need to provision, manage, and scale your own build servers. CodeBuild scales continuously and processes multiple builds concurrently, so your builds are not left waiting in a queue. You can get started quickly by using prepackaged build environments, or you can create custom build environments that use your own build tools. With CodeBuild, you are charged by the minute for the compute resources you use.
AWS CodeStar - AWS CodeStar is a cloud‑based development service that provides the tools you need to quickly develop, build, and deploy applications on AWS. With AWS CodeStar, you can set up your entire continuous delivery toolchain in minutes, allowing you to start releasing code faster. AWS CodeStar makes it easy for your whole team to work together securely, with built-in role-based policies that allow you to easily manage access and add owners, contributors, and viewers to your projects.
Each CodeStar project includes development tools, including AWS CodePipeline, AWS CodeCommit, AWS CodeBuild and AWS CodeDeploy, that can be used on their own and with existing AWS applications.
Which of the following options are the benefits of using AWS Elastic Load Balancing (ELB)? (Select TWO)
a. Storage
b. High availability
c. Less costly
d. Fault tolerance
e. Agility
b. High availability
d. Fault tolerance
Elastic Load Balancing (ELB) automatically distributes incoming application traffic across multiple targets, such as Amazon EC2 instances, containers, and IP addresses. It can handle the varying load of your application traffic in a single Availability Zone (AZ) or across multiple Availability Zones (AZs).
Elastic Load Balancing (ELB) offers three types of load balancers that all feature the high availability, automatic scaling, and robust security necessary to make your applications fault-tolerant: Application Load Balancer (best suited for HTTP and HTTPS traffic), Network Load Balancer (best suited for TCP traffic), and Classic Load Balancer.
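High availability and fault tolerance come from placing the load balancer (and its targets) in multiple Availability Zones, so that traffic keeps flowing if one AZ has issues. A hedged boto3 sketch; the subnet IDs and name are placeholders standing in for subnets in two different AZs.

```python
import boto3

elbv2 = boto3.client("elbv2")

# Create an Application Load Balancer spanning subnets in two different
# Availability Zones; if one AZ is impaired, traffic is served from the other.
lb = elbv2.create_load_balancer(
    Name="example-alb",
    Subnets=["subnet-aaaa1111", "subnet-bbbb2222"],  # placeholder subnets in different AZs
    Type="application",
    Scheme="internet-facing",
)
print(lb["LoadBalancers"][0]["DNSName"])
```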
Incorrect options:
Agility - Agility refers to new IT resources being only a click away, which means that you reduce the time to make those resources available to your developers from weeks to just minutes. AWS Elastic Load Balancing (ELB) does not help with agility.
Less costly - AWS Elastic Load Balancing (ELB) does not help with reducing costs.
Storage - AWS Elastic Load Balancing (ELB) does not offer storage benefits. It is not a storage-related service.
A company needs to keep sensitive data in its own data center due to compliance but would still like to deploy resources using AWS. Which Cloud deployment model does this refer to?
a. Public Cloud
b. Hybrid Cloud
c. Private Cloud
d. On-premises
b. Hybrid Cloud
A hybrid deployment is a way to connect infrastructure and applications between cloud-based resources and existing resources that are not located in the cloud. The most common method of hybrid deployment is between the cloud and existing on-premises infrastructure to extend, and grow, an organization’s infrastructure into the cloud while connecting cloud resources to the internal system.
Overview of Cloud Computing Deployment Models: https://aws.amazon.com/types-of-cloud-computing/
Incorrect options:
Public Cloud - A public cloud-based application is fully deployed in the cloud and all parts of the application run in the cloud. Applications in the cloud have either been created in the cloud or have been migrated from an existing infrastructure to take advantage of the benefits of cloud computing.
Private Cloud - Unlike a public cloud, a private cloud provides a business with IT services that are provisioned and customized to its precise needs, delivered securely and reliably over the business's own private IT infrastructure.
On-premises - This is not a cloud deployment model. An enterprise that opts for on-premises must build, upgrade, and scale its own IT infrastructure by investing in hardware, compatible software, and supporting services, and it must dedicate IT staff to continuously maintain, scale, and manage that infrastructure.
A company has defined a baseline that mentions the number of AWS resources to be used for different stages of application testing. However, the company realized that employees are not adhering to the guidelines and provisioning additional resources via API calls, resulting in higher testing costs.
Which AWS service will help the company raise alarms whenever the baseline resource numbers are crossed?
a. Amazon Detective
b. AWS CloudTrail Insights
c. AWS X-Ray
d. AWS Config
b. AWS CloudTrail Insights
AWS CloudTrail Insights helps AWS users identify and respond to unusual activity associated with write API calls by continuously analyzing CloudTrail management events.
Insights events are logged when AWS CloudTrail detects unusual write management API activity in your account. If you have CloudTrail Insights enabled, and CloudTrail detects unusual activity, Insights events are delivered to the destination S3 bucket for your trail. You can also see the type of insight and the incident time period when you view Insights events on the CloudTrail console. Unlike other types of events captured in a CloudTrail trail, Insights events are logged only when CloudTrail detects changes in your account’s API usage that differ significantly from the account’s typical usage patterns.
AWS CloudTrail Insights can help you detect unusual API activity in your AWS account by raising Insights events. CloudTrail Insights measures your normal patterns of API call volume, also called the baseline, and generates Insights events when the volume is outside normal patterns.
AWS CloudTrail Insights continuously monitors CloudTrail write management events, and uses mathematical models to determine the normal levels of API and service event activity for an account. CloudTrail Insights identifies behavior that is outside normal patterns, generates Insights events, and delivers those events to a /CloudTrail-Insight folder in the chosen destination S3 bucket for your trail. You can also access and view Insights events in the AWS Management Console for CloudTrail.
Identify and Respond to Unusual API Activity using AWS CloudTrail Insights: https://aws.amazon.com/blogs/aws/announcing-cloudtrail-insights-identify-and-respond-to-unusual-api-activity/
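Insights events are generated only after the feature is enabled on a trail. A minimal boto3 sketch of turning it on; the trail name is a placeholder.

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

# Enable Insights on an existing trail so CloudTrail starts flagging write
# management API call volumes that deviate from the learned baseline.
cloudtrail.put_insight_selectors(
    TrailName="management-events-trail",  # placeholder trail name
    InsightSelectors=[{"InsightType": "ApiCallRateInsight"}],
)
```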
Incorrect options:
AWS X-Ray - AWS X-Ray helps developers analyze and debug production, distributed applications, such as those built using a microservices architecture. With X-Ray, you can understand how your application and its underlying services are performing to identify and troubleshoot the root cause of performance issues and errors. X-Ray provides an end-to-end view of requests as they travel through your application, and shows a map of your application’s underlying components. X-Ray is not for tracking user actions when interacting with the AWS systems.
Amazon Detective - Amazon Detective simplifies the process of investigating security findings and identifying the root cause. Amazon Detective analyzes trillions of events from multiple data sources such as VPC Flow Logs, AWS CloudTrail logs, and Amazon GuardDuty findings and automatically creates a graph model that provides you with a unified, interactive view of your resources, users, and the interactions between them over time.
AWS Config - AWS Config is a service that enables you to assess, audit, and evaluate the configurations of your AWS resources. Config continuously monitors and records your AWS resource configurations and allows you to automate the evaluation of recorded configurations against desired configurations. With Config, you can review changes in configurations and relationships between AWS resources, dive into detailed resource configuration histories, and determine your overall compliance against the configurations specified in your internal guidelines. This enables you to simplify compliance auditing, security analysis, change management, and operational troubleshooting.
Which of the following are the best practices when using AWS Organizations? (Select TWO)
a. Restrict account privileges using Service Control Policies (SCP)
b. Never use tags for billing
c. Create AWS accounts per department
d. Disable AWS CloudTrail on several accounts
e. Do not use AWS Organizations to automate AWS account creation
a. Restrict account privileges using Service Control Policies (SCP)
c. Create AWS accounts per department
AWS Organizations helps you centrally govern your environment as you grow and scale your workloads on AWS. Whether you are a growing startup or a large enterprise, AWS Organizations help you to centrally manage billing; control access, compliance, and security; and share resources across your AWS accounts.
Using AWS Organizations, you can automate account creation, create groups of accounts to reflect your business needs, and apply policies for these groups for governance. You can also simplify billing by setting up a single payment method for all of your AWS accounts. Through integrations with other AWS services, you can use AWS Organizations to define central configurations and resource sharing across accounts in your organization. AWS Organizations is available to all AWS customers at no additional charge.
You should create separate accounts per department so that you can apply regulatory restrictions using Service Control Policies (SCPs), achieve better resource isolation, and benefit from separate per-account service limits.
AWS Organizations allows you to restrict what services and actions are allowed in your accounts. You can use the Service Control Policies (SCP) to apply permission guardrails on AWS Identity and Access Management (IAM) users and roles.
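To make the SCP guardrail idea concrete, here is a hedged boto3 sketch that creates a simple SCP and attaches it to an organizational unit; the policy name, denied action, and OU ID are illustrative assumptions.

```python
import json
import boto3

org = boto3.client("organizations")

# An example SCP that denies member accounts the ability to leave the
# organization; SCPs act as permission guardrails for every IAM user and role
# in the accounts (or OUs) they are attached to.
scp_document = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Deny", "Action": "organizations:LeaveOrganization", "Resource": "*"}
    ],
}

policy = org.create_policy(
    Name="DenyLeavingOrganization",
    Description="Prevent member accounts from leaving the organization",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp_document),
)

# Attach the SCP to an organizational unit (the OU ID below is a placeholder).
org.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="ou-examplerootid111-exampleouid111",
)
```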
Incorrect options:
Never use tags for billing - You should use tags standards to categorize AWS resources for billing purposes.
Disable AWS CloudTrail on several accounts - You should enable AWS CloudTrail to monitor activity on all accounts for governance, compliance, risk, and auditing purposes.
Do not use AWS Organizations to automate AWS account creation - AWS Organizations helps you simplify IT operations by automating AWS account creation and management. The AWS Organizations APIs enable you to create new accounts programmatically and to add new accounts to a group. The policies attached to the group are automatically applied to the new account.
Which of the following is the MOST cost-effective Amazon Elastic Compute Cloud (Amazon EC2) instance purchasing option for short-term, spiky and critical workloads on AWS Cloud?
a. Spot Instance
b. Dedicated Host
c. On-Demand Instance
d. Reserved Instance (RI)
c. On-Demand Instance
An On-Demand Instance is an instance that you pay for as you use it, with no long-term commitment and no upfront payment; you pay only for the seconds that your On-Demand Instances are running, at a fixed price per second. You have full control over its lifecycle: you decide when to launch, stop, hibernate, start, reboot, or terminate it. On-Demand Instances cannot be interrupted, so they are the best fit for short-term, spiky, and critical workloads.
Amazon EC2 Pricing Options Overview: https://aws.amazon.com/ec2/pricing/
Incorrect options:
Spot Instance - A Spot Instance is an unused EC2 instance that is available for less than the On-Demand price. Because Spot Instances enable you to request unused EC2 instances at steep discounts (up to 90%), you can lower your Amazon EC2 costs significantly. Spot Instances are well-suited for data analysis, batch jobs, background processing, and other flexible tasks that can be interrupted. These can be terminated at short notice, so these are not suitable for critical workloads that need to run at a specific point in time.
Reserved Instance (RI) - Reserved Instances (RI) provide you with significant savings (up to 75%) on your Amazon EC2 costs compared to On-Demand Instance pricing. Reserved Instances (RI) are not physical instances, but rather a billing discount applied to the use of On-Demand Instances in your account. You can purchase a Reserved Instance (RI) for a one-year or three-year commitment, with the three-year commitment offering a bigger discount. Reserved instances (RI) cannot be interrupted. Reserved instances (RI) are not the right choice for short-term workloads.
Dedicated Host - Amazon EC2 Dedicated Hosts allow you to use your eligible software licenses from vendors such as Microsoft and Oracle on Amazon EC2, so that you get the flexibility and cost-effectiveness of using your own licenses, but with the resiliency, simplicity, and elasticity of AWS. An Amazon EC2 Dedicated Host is a physical server fully dedicated to your use, which can help address corporate compliance requirements. Dedicated Hosts are not cost-efficient compared to On-Demand Instances, so this option is not correct.
An e-commerce company wants to review the Payment Card Industry (PCI) reports on AWS Cloud. Which AWS resource can be used to address this use-case?
a. AWS Trusted Advisor
b. AWS Secrets Manager
c. AWS Artifact
d. AWS Cost & Usage Report (AWS CUR)
c. AWS Artifact
AWS Artifact is your go-to, central resource for compliance-related information. It provides on-demand access to AWS security and compliance reports, such as Payment Card Industry (PCI) and Service Organization Control (SOC) reports, as well as select online agreements.
A retail company has multiple AWS accounts for each of its departments. Which of the following AWS services can be used to set up consolidated billing and a single payment method for these AWS accounts?
a. AWS Organizations
b. AWS Cost Explorer
c. AWS Budgets
d. AWS Secrets Manager
a. AWS Organizations
AWS Organizations helps you to centrally manage billing; control access, compliance, and security; and share resources across your AWS accounts. Using AWS Organizations, you can automate account creation, create groups of accounts to reflect your business needs, and apply policies for these groups for governance. You can also simplify billing by setting up a single payment method for all of your AWS accounts. AWS Organizations is available to all AWS customers at no additional charge.
Key Features of AWS Organizations: https://aws.amazon.com/organizations/
Which AWS service will help you deploy application code automatically to an Amazon Elastic Compute Cloud (Amazon EC2) instance?
a. AWS CloudFormation
b. AWS CodeBuild
c. AWS CodeDeploy
d. AWS Elastic Beanstalk
c. AWS CodeDeploy
AWS CodeDeploy is a service that automates application deployments to a variety of compute services including Amazon EC2, AWS Fargate, AWS Lambda, and on-premises instances. CodeDeploy fully automates your application deployments eliminating the need for manual operations. CodeDeploy protects your application from downtime during deployments through rolling updates and deployment health tracking.
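A small boto3 sketch of triggering a CodeDeploy deployment of a revision stored in S3 to an existing deployment group of EC2 instances; the application, group, bucket, and key names are all placeholders.

```python
import boto3

codedeploy = boto3.client("codedeploy")

# Deploy an application bundle previously uploaded to S3 to the EC2 instances
# in an existing deployment group; names and keys are illustrative only.
response = codedeploy.create_deployment(
    applicationName="example-app",
    deploymentGroupName="example-app-prod",
    revision={
        "revisionType": "S3",
        "s3Location": {
            "bucket": "example-deployments",
            "key": "example-app/app-1.0.0.zip",
            "bundleType": "zip",
        },
    },
)
print(response["deploymentId"])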
Per the AWS Shared Responsibility Model, management of which of the following AWS services is the responsibility of the customer?
a. Amazon Elastic Compute Cloud (Amazon EC2)
b. AWS Elastic Beanstalk
c. Amazon DynamoDB
d. Amazon Simple Storage Service (Amazon S3)
a. Amazon Elastic Compute Cloud (Amazon EC2)
Security and Compliance is a shared responsibility between AWS and the customer. This shared model can help relieve the customer’s operational burden as AWS operates, manages, and controls the components from the host operating system and virtualization layer down to the physical security of the facilities in which the service operates.
“Security of the Cloud” is the responsibility of AWS - AWS is responsible for protecting the infrastructure that runs all of the services offered in the AWS Cloud. This infrastructure is composed of the hardware, software, networking, and facilities that run AWS Cloud services.
“Security in the Cloud” is the responsibility of the customer. Customer responsibility will be determined by the AWS Cloud services that a customer selects. This determines the amount of configuration work the customer must perform as part of their security responsibilities. For example, a service such as Amazon Elastic Compute Cloud (Amazon EC2) is categorized as Infrastructure as a Service (IaaS) and, as such, requires the customer to perform all of the necessary security configuration and management tasks. Customers that deploy an Amazon EC2 instance are responsible for the management of the guest operating system (including updates and security patches), any application software or utilities installed by the customer on the instances, and the configuration of the AWS-provided firewall (called a security group) on each instance.
Incorrect options:
Amazon Simple Storage Service (Amazon S3)
Amazon DynamoDB
AWS Elastic Beanstalk
For abstracted services, such as Amazon S3, Amazon DynamoDB and for managed services such as AWS Elastic Beanstalk, AWS operates the infrastructure layer, the operating system, and platforms, and customers access the endpoints to store and retrieve data. Customers are responsible for managing their data (including encryption options), classifying their assets, and using IAM tools to apply the appropriate permissions.
A company wants to identify the optimal AWS resource configuration for its workloads so that the company can reduce costs and increase workload performance. Which of the following services can be used to meet this requirement?
a. AWS Cost Explorer
b. AWS Systems Manager
c. AWS Budgets
d. AWS Compute Optimizer
d. AWS Compute Optimizer
AWS Compute Optimizer recommends optimal AWS resources for your workloads to reduce costs and improve performance by using machine learning to analyze historical utilization metrics. Over-provisioning resources can lead to unnecessary infrastructure costs, and under-provisioning resources can lead to poor application performance. Compute Optimizer helps you choose optimal configurations for three types of AWS resources: Amazon EC2 instances, Amazon EBS volumes, and AWS Lambda functions, based on your utilization data.
Compute Optimizer recommends up to 3 options from 140+ EC2 instance types, as well as a wide range of EBS volume and Lambda function configuration options, to right-size your workloads. Compute Optimizer also projects what the CPU utilization, memory utilization, and run time of your workload would have been on recommended AWS resource options. This helps you understand how your workload would have performed on the recommended options before implementing the recommendations.
How Compute Optimizer works: https://aws.amazon.com/compute-optimizer/
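A hedged boto3 sketch of pulling EC2 rightsizing recommendations, assuming the account has already opted in to Compute Optimizer and it has gathered enough utilization data:

```python
import boto3

optimizer = boto3.client("compute-optimizer")

# List rightsizing findings for EC2 instances in the account; each
# recommendation includes the current instance type, a finding such as
# over-provisioned / under-provisioned / optimized, and suggested alternatives.
recs = optimizer.get_ec2_instance_recommendations()
for rec in recs["instanceRecommendations"]:
    current = rec["currentInstanceType"]
    finding = rec["finding"]
    options = [o["instanceType"] for o in rec["recommendationOptions"]]
    print(current, finding, options)
```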
Incorrect options:
AWS Cost Explorer - AWS Cost Explorer has an easy-to-use interface that lets you visualize, understand, and manage your AWS costs and usage over time. Cost Explorer Resource Rightsizing Recommendations and Compute Optimizer use the same recommendation engine. The Compute Optimizer recommendation engine delivers recommendations to help customers identify optimal EC2 instance types for their workloads. The Cost Explorer console and API surface a subset of these recommendations that may lead to cost savings, and augments them with customer-specific cost and savings information (e.g. billing information, available credits, RI, and Savings Plans) to help Cost Management owners quickly identify savings opportunities through infrastructure rightsizing. Compute Optimizer console and its API delivers all recommendations regardless of the cost implications.
A company would like to move its infrastructure to AWS Cloud. Which of the following should be included in the Total Cost of Ownership (TCO) estimate? (Select TWO)
a. Power/Cooling
b. Application advertising
c. Electronic equipment at office
d. Number of end-users
e. Server administration
a. Power/Cooling
e. Server administration
AWS Pricing Calculator lets you explore AWS services and create an estimate for the cost of your use cases on AWS. You can model your solutions before building them, explore the price points and calculations behind your estimate, and find the available instance types and contract terms that meet your needs. This enables you to make informed decisions about using AWS. You can plan your AWS costs and usage or price out by setting up a new set of instances and services. AWS Pricing Calculator can be accessed at https://calculator.aws/#/.
AWS Pricing Calculator compares the cost of your applications in an on-premises or traditional hosting environment to AWS: server, storage, network, and IT labor. Therefore, you need to include every element relevant to these points of comparison.
Server administration is included in the IT labor costs.
Power/Cooling are included in the server, storage, and network cost.
Incorrect options:
Application advertising - The application advertising is not relevant for a Total Cost of Ownership (TCO) estimate.
Number of end-users - The number of end-users is not relevant for a Total Cost of Ownership (TCO) estimate.
Electronic equipment at office - The electronic equipment at the office is not relevant for a Total Cost of Ownership (TCO) estimate.
Which of the following statements are true about Cost Allocation Tags in AWS Billing? (Select two)
a. Tags help in organizing resources and are a mandatory configuration item to run reports
b. Only user-defined tags need to be activated before they can appear in Cost Explorer or on a cost allocation report
c. For each resource, each tag key must be unique, but can have multiple values
d. For each resource, each tag key must be unique, and each tag key can have only one value
e. You must activate both AWS generated tags and user-defined tags separately before they can appear in Cost Explorer or on a cost allocation report
d. For each resource, each tag key must be unique, and each tag key can have only one value
e. You must activate both AWS generated tags and user-defined tags separately before they can appear in Cost Explorer or on a cost allocation report
A Cost Allocation Tag is a label that you or AWS assigns to an AWS resource. Each tag consists of a key and a value. For each resource, each tag key must be unique, and each tag key can have only one value. You can use tags to organize your resources, and cost allocation tags to track your AWS costs on a detailed level.
AWS provides two types of cost allocation tags: AWS generated tags and user-defined tags. AWS defines, creates, and applies the AWS generated tags for you, and you define, create, and apply user-defined tags. You must activate both types of tags separately before they can appear in Cost Explorer or on a cost allocation report.
AWS Cost Allocation Tags Overview: https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html
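User-defined tags are just ordinary resource tags applied with a consistent key/value scheme; a small boto3 sketch of tagging an EC2 instance is below (the instance ID and tag values are placeholders). Remember that the tags still have to be activated as cost allocation tags in the Billing console before they appear in Cost Explorer or on a cost allocation report.

```python
import boto3

ec2 = boto3.client("ec2")

# Apply user-defined tags to an instance; for each resource a tag key is
# unique and holds exactly one value (ID and values below are placeholders).
ec2.create_tags(
    Resources=["i-0123456789abcdef0"],
    Tags=[
        {"Key": "CostCenter", "Value": "marketing"},
        {"Key": "Environment", "Value": "test"},
    ],
)
```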
Incorrect options:
Tags help in organizing resources and are a mandatory configuration item to run reports - Tags definitely help organize resources as per an organization’s requirement; they are not mandatory though.
For each resource, each tag key must be unique, but can have multiple values - For each resource, each tag key must be unique, and each tag key can have only one value.
Only user-defined tags need to be activated before they can appear in Cost Explorer or on a cost allocation report - As explained above, both kinds of tags (user-defined and AWS generated) need to be activated separately before they can appear in report generation.
A multi-national corporation wants to get expert professional advice on migrating to AWS and managing their applications on AWS Cloud. Which of the following entities would you recommend for this engagement?
a. Concierge Support Team
b. AWS Trusted Advisor
c. APN Consulting Partner
d. APN Technology Partner
c. APN Consulting Partner
The AWS Partner Network (APN) is the global partner program for technology and consulting businesses that leverage Amazon Web Services to build solutions and services for customers.
APN Consulting Partners are professional services firms that help customers of all types and sizes design, architect, build, migrate, and manage their workloads and applications on AWS, accelerating their migration to AWS cloud.
APN Partner Types Overview: https://aws.amazon.com/partners/