Practice Test III - WhizLabs Flashcards

1
Q

AWS CodeDeploy is used to configure a deployment group to automatically roll back to the last known good revision when a deployment fails. During the rollback, the files required to deploy the earlier revision cannot be retrieved by CodeDeploy. Which of the following actions can be taken for a successful rollback? Choose 2.

A. Use Manual rollback instead of automatic rollback.
B. Manually add required files to instance.
C. Use an existing application revision.
D. Map CodeDeploy to access those files from S3 buckets.
E. Create a new application revision.

A

B & E. Manually add the required files to the instance; create a new application revision.

During an automatic rollback, AWS CodeDeploy tries to retrieve files that were part of previous revisions. If these files were deleted or are missing, you need to manually add them to the instance or create a new application revision.

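If a copy of the last known good bundle is still available in S3, you can also redeploy it yourself as a new deployment. A minimal boto3 sketch, assuming placeholder application, deployment group, bucket and key names:

import boto3

codedeploy = boto3.client("codedeploy")

# Redeploy a known-good revision stored in S3 as a brand-new deployment.
response = codedeploy.create_deployment(
    applicationName="my-app",                # placeholder
    deploymentGroupName="my-deploy-group",   # placeholder
    revision={
        "revisionType": "S3",
        "s3Location": {
            "bucket": "my-bucket",           # placeholder
            "key": "app-v1.zip",             # placeholder: last good bundle
            "bundleType": "zip",
        },
    },
    description="Manual rollback to last known good revision",
)
print(response["deploymentId"])              # rollbacks get new deployment IDs
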
2
Q

How does a CodeDeploy rollback work?

A

CodeDeploy rolls back deployments by redeploying a previously deployed revision of an application as a new deployment. These rolled-back deployments are technically new deployments, with new deployment IDs, rather than restored versions of a previous deployment.

3
Q

You have a legacy application that processes messages from an SQS queue. The application uses a single thread to poll multiple queues. Which of the following polling configurations will be the best option to avoid latency in processing messages?

A. Use short polling with default visibility timeout values.
B. Use long polling with higher visibility timeout values.
C. Use long polling with lower visibility timeout values.
D. Use short polling with higher visibility timeout values.

A

A. Use short polling with default visibility timeout values.

In this case, the application is polling multiple queues with a single thread. Long polling would wait for a message or for the timeout on each queue, which may delay the processing of messages in other queues that have messages waiting.

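As an illustration, here is a minimal single-threaded polling loop using short polling with boto3; the queue URLs and the process() handler are hypothetical:

import boto3

sqs = boto3.client("sqs")
queue_urls = [
    "https://sqs.us-east-1.amazonaws.com/123456789012/queue-a",  # placeholder
    "https://sqs.us-east-1.amazonaws.com/123456789012/queue-b",  # placeholder
]

def process(msg):
    # placeholder handler: replace with real message processing
    print(msg["Body"])

while True:
    for url in queue_urls:
        # WaitTimeSeconds=0 means short polling: an empty queue returns
        # immediately instead of blocking the thread for other queues.
        resp = sqs.receive_message(
            QueueUrl=url,
            MaxNumberOfMessages=10,
            WaitTimeSeconds=0,
        )
        for msg in resp.get("Messages", []):
            process(msg)
            sqs.delete_message(QueueUrl=url, ReceiptHandle=msg["ReceiptHandle"])
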
4
Q

What are the required configurations for setting up a bucket for static website hosting?

A

Enabling website hosting
Configuring index document support
Permissions required for website access

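A minimal boto3 sketch of those three steps, assuming a placeholder bucket name and that the bucket's public access block permits a public-read policy:

import json
import boto3

s3 = boto3.client("s3")
bucket = "my-site-bucket"  # placeholder

# Enable website hosting and configure the index (and error) document.
s3.put_bucket_website(
    Bucket=bucket,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# Grant the public read permissions required for website access.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{bucket}/*",
    }],
}
s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))
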
5
Q

You are developing an application that will make use of Kinesis Firehose for streaming records to S3. Your company policy mandates that all data be encrypted at rest. How can you achieve this with Kinesis Firehose? Choose 2.

A. Enable encryption on the Kinesis Data Firehose.
B. Install an SSL certificate in Kinesis Data Firehose.
C. Ensure that all data records are transferred via SSL.
D. Ensure that Kinesis streams are used to transfer the data from the producers.

A

A & D. Enable encryption on the Kinesis Data Firehose, and ensure that Kinesis streams are used to transfer the data from the producers.

If you have sensitive data, you can enable server-side encryption when you use Kinesis Data Firehose. However, this is only possible if you use a Kinesis stream as your data source. When you configure a Kinesis stream as the data source of a Kinesis Data Firehose delivery stream, Kinesis Data Firehose no longer stores the data at rest. Instead, the data is stored in the Kinesis stream.

Options B & C are invalid because this is used for encrypting data in transit.

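Since the data at rest lives in the source Kinesis stream in this setup, encryption is enabled on the stream itself. A boto3 sketch with a placeholder stream name:

import boto3

kinesis = boto3.client("kinesis")

# Server-side encryption on the stream that feeds the Firehose delivery stream.
kinesis.start_stream_encryption(
    StreamName="my-source-stream",   # placeholder
    EncryptionType="KMS",
    KeyId="alias/aws/kinesis",       # AWS managed key; a customer CMK also works
)
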
6
Q

What’s the difference between Kinesis Streams and Kinesis Firehose?

A

With Kinesis Streams, you can store the data for up to 7 days, but with Kinesis Data Firehose, you simply deliver the data directly to a destination such as S3. Use Kinesis Streams if you want to do custom processing on the streaming data; with Kinesis Firehose, you are simply ingesting it into S3, Redshift or Elasticsearch.

7
Q

You have been told to make use of CloudFormation templates for deploying applications on EC2 instances. These instances need to be preconfigured with the NGINX web server to host the application. How could you accomplish this with CloudFormation?

A

You can use the cfn-init helper script in CloudFormation.

When you launch stacks, you can install and configure software applications on EC2 instances by using the cfn-init helper script and the AWS::CloudFormation::Init resource. By using AWS::CloudFormation::Init, you can describe the configurations that you want, rather than scripting procedural steps.

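A minimal template sketch, assuming a placeholder AMI ID: the AWS::CloudFormation::Init metadata declares the NGINX package and service, and UserData runs cfn-init at boot to apply it.

Resources:
  WebServer:
    Type: AWS::EC2::Instance
    Metadata:
      AWS::CloudFormation::Init:
        config:
          packages:
            yum:
              nginx: []              # declare the package; cfn-init installs it
          services:
            sysvinit:
              nginx:
                enabled: true        # start on boot
                ensureRunning: true  # keep the service running
    Properties:
      ImageId: ami-0123456789abcdef0     # placeholder AMI
      InstanceType: t3.micro
      UserData:                          # run cfn-init to apply the metadata above
        Fn::Base64: !Sub |
          #!/bin/bash -xe
          /opt/aws/bin/cfn-init -v --stack ${AWS::StackName} --resource WebServer --region ${AWS::Region}
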
8
Q

You are working on building microservices using Amazon ECS. The services will be deployed on an EC2 instance running the ECS container agent. After the EC2 instance launched successfully, the ECS container agent registered the instance in a cluster. What would the status of the container instance and its corresponding agent connection be when the ECS container instance is stopped?

A

The container instance status remains ACTIVE, but the agent connection status changes to FALSE.

9
Q

What is an Amazon ECS container instance?

A

An ECS container instance is an EC2 instance that is running the ECS container agent and has been registered into a cluster.

10
Q

What is CORS?

A

Cross-origin resource sharing (CORS) is a browser security feature that restricts cross-origin HTTP requests initiated from scripts running in your browser. If your REST API’s resources receive non-simple cross-origin HTTP requests, you need to enable CORS support.

11
Q

For simple cross-origin POST requests, what does the response from your resource need to include?

A

It needs to include the header Access-Control-Allow-Origin, where the value of the header key is set to ‘*’ or is set to the origins allowed to access that resource.

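An illustrative exchange (hostnames and paths are placeholders):

POST /orders HTTP/1.1
Host: api.example.com
Origin: https://www.example.com

HTTP/1.1 200 OK
Access-Control-Allow-Origin: https://www.example.com
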
12
Q

To support CORS for non-simple HTTP requests, what would a REST API resource need to implement?

A

When a browser receives a non-simple HTTP request, the CORS protocol requires the browser to send a preflight request to the server and wait for approval (or a request for credentials) from the server before sending the actual request. The preflight request appears to your API as an HTTP request that:

Includes an Origin header.
Uses the OPTIONS method.
Includes the following headers: Access-Control-Request-Method, Access-Control-Request-Headers.

To support CORS, therefore, a REST API resource needs to implement an OPTIONS method that can respond to the OPTIONS preflight request with at least the following response headers:

Access-Control-Allow-Methods
Access-Control-Allow-Headers
Access-Control-Allow-Origin

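An illustrative preflight exchange (hostnames, paths and headers are placeholders):

OPTIONS /orders HTTP/1.1
Host: api.example.com
Origin: https://www.example.com
Access-Control-Request-Method: PUT
Access-Control-Request-Headers: Content-Type, X-Api-Key

HTTP/1.1 200 OK
Access-Control-Allow-Origin: https://www.example.com
Access-Control-Allow-Methods: PUT, POST, OPTIONS
Access-Control-Allow-Headers: Content-Type, X-Api-Key
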
13
Q

What happens when DynamoDB with a DAX cluster receives a strongly consistent read request from an application?

A

For strongly consistent read requests from an application, the DAX cluster passes all requests through to DynamoDB and does not cache the results.

14
Q

What is the DAX item cache?

A

DAX maintains an item cache to store the results from GetItem and BatchGetItem operations. The items in the cache represent eventually consistent data from DynamoDB, and are stored by their primary key values.

15
Q

What is the DAX query cache?

A

DAX maintains a query cache to store the results from Query and Scan operations. The items in this cache represent result sets from queries and scans on DynamoDB tables. These result sets are stored by their parameter values.

16
Q

You are developing an application that is working with a DynamoDB table. You need to create a query that has search criteria. What must be done in order to work with search queries?

A

To specify the search criteria, you use a key condition expression (a string that determines the items to be read from the table or index). You must specify the partition key name and value as an equality condition. You can optionally provide a second condition for the sort key.

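A boto3 sketch of such a query, with placeholder table and attribute names: an equality condition on the partition key plus an optional sort-key condition.

import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("Orders")  # placeholder table

# Partition key equality is mandatory; the sort-key condition is optional.
resp = table.query(
    KeyConditionExpression=(
        Key("CustomerId").eq("C-1001")
        & Key("OrderDate").begins_with("2023-07")
    )
)
items = resp["Items"]
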
17
Q

You are working on an application which saves strings in a DynamoDB table. For strings larger than 400 KB, you are getting an item size exceeded error. What is the recommended option for storing larger strings?

A

You can use S3 to store large attribute values that cannot fit in a DynamoDB item. Store them as objects in S3, and then store each object identifier in your DynamoDB item.

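A boto3 sketch of that pattern, with placeholder bucket, table and attribute names:

import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("Documents")  # placeholder table

def put_large_string(doc_id, body):
    key = "documents/" + doc_id + ".txt"
    # Store the oversized string itself in S3...
    s3.put_object(Bucket="my-overflow-bucket", Key=key, Body=body.encode())
    # ...and keep only the object identifier in the small DynamoDB item.
    table.put_item(Item={"DocId": doc_id, "S3Key": key})
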
18
Q

You are developing an application that is going to make use of Docker containers. Traffic needs to be routed based on demand to the application. Dynamic host port mapping would be used for the Docker containers. Which of the following would you use to distribute traffic to the Docker containers?

A. AWS Application Load Balancer
B. AWS Network Load Balancer
C. AWS Route 53
D. AWS Classic Load Balancer

A

A. AWS Application Load Balancer.

Application Load Balancers allow containers to use dynamic host port mapping (so that multiple tasks from the same service are allowed per container instance). They also support path-based routing and priority rules (so that multiple services can use the same listener port on a single Application Load Balancer).

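Dynamic host port mapping itself is requested in the ECS task definition by setting hostPort to 0 (bridge networking on the EC2 launch type); a boto3 sketch with placeholder names:

import boto3

ecs = boto3.client("ecs")
ecs.register_task_definition(
    family="web",  # placeholder
    containerDefinitions=[{
        "name": "web",
        "image": "nginx:latest",
        "memory": 256,
        # hostPort 0 lets the agent pick an ephemeral host port per task;
        # the ALB target group tracks each task's port automatically.
        "portMappings": [{"containerPort": 80, "hostPort": 0}],
    }],
)
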
19
Q

What is an application load balancer?

A

An Application Load Balancer operates at the request level (layer 7), routing traffic to targets - EC2 instances, containers, IP addresses and Lambda functions - based on the content of the request. It is ideal for advanced load balancing of HTTP and HTTPS traffic.

20
Q

What is a network load balancer?

A

Network Load Balancer operates at the connection level (layer 4), routing connections to targets - EC2 instances, microservices and containers - within Amazon VPC based on IP protocol data. Ideal for load balancing of both TCP and UDP traffic, Network Load Balancer is capable of handling millions of requests per second while maintaining ultra-low latencies. Network Load Balancer is optimized to handle sudden and volatile traffic patterns while using a single static IP address per AZ.

21
Q

What’s the difference between an application load balancer and a network load balancer?

A

The first difference is that the ALB works at the application layer (layer 7 of the OSI model) while the NLB works at layers 3 & 4 of the OSI model. The NLB just forwards requests, whereas the ALB examines the contents of the HTTP request header to determine where to route the request. So, the ALB performs content-based routing.

Another difference is that NLB cannot assure the availability of the application. This is because it bases its decisions solely on network and TCP-layer variables and has no awareness of the application at all. Generally, a network load balancer will determine “availability” based on the ability of a server to respond to ICMP ping, or to correctly complete the three-way TCP handshake. An ALB goes much deeper, and is capable of determining availability based on not only a successful HTTP GET of a particular page but also the verification that the content is as expected based on the input parameters.

Application Load Balancer is ideally used when you have the requirement for path-based routing.

22
Q

What is service load balancing?

A

Your Amazon ECS service can optionally be configured to use Elastic Load Balancing to distribute traffic evenly across the tasks in your service.

ECS services support the Application Load Balancer, Network Load Balancer and Classic Load Balancer types. Application Load Balancers are used to route HTTP/HTTPS (or layer 7) traffic. Network Load Balancers and Classic Load Balancers are used to route TCP (or layer 4) traffic.

It’s recommended to use Application Load Balancers for your ECS services, unless your service requires a feature that is only available with Network Load Balancers or Classic Load Balancers.

23
Q

You are working on an application which provides an online car booking service using DynamoDB. This is a read-heavy application which reads car and driver location details and provides the latest position to prospective car booking customers. Which of the following can be used to have consistent data writes and avoid unpredictable spikes in DynamoDB requests during peak hours?

A. Write Around Cache using DAX
B. Write Through Cache using DAX
C. Use side cache using Redis along with DynamoDB.
D. Write Through cache using Redis along with DynamoDB.

A

B. Write-through cache using DAX.

DAX is intended for applications that require high-performance reads. As a write-through cache, DAX allows you to issue writes directly, so that your writes are immediately reflected in the item cache. You do not need to manage cache invalidation logic, because DAX handles it for you.

24
Q

How do you use DynamoDB encryption at rest?

A

You have to specify it at the time of table creation. When creating a new table, you can choose one of the following customer master keys (CMKs) to encrypt your table:

AWS owned CMK. This is the default encryption key. The key is owned by DynamoDB (no additional charge).
AWS managed CMK. The key is stored in your account and is managed by AWS KMS (KMS charges apply).
Customer managed CMK. The key is stored in your account and is created, owned and managed by you. You have full control over the CMK (KMS charges apply).

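A boto3 sketch of choosing the AWS managed CMK at creation time (table and attribute names are placeholders; omitting SSESpecification leaves the default AWS owned CMK):

import boto3

dynamodb = boto3.client("dynamodb")
dynamodb.create_table(
    TableName="Orders",  # placeholder
    AttributeDefinitions=[{"AttributeName": "OrderId", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "OrderId", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
    # SSEType "KMS" selects the AWS managed CMK; add KMSMasterKeyId
    # to use a customer managed CMK instead.
    SSESpecification={"Enabled": True, "SSEType": "KMS"},
)
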
25
Q

You’ve just started developing an application. This application will interact with S3 and DynamoDB. How would you, as the developer, ensure that your SDK can interact with the AWS services on the cloud?

A

Create an IAM user, generate the access keys, and then use the access keys from within your workstation.

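For example, with boto3 the generated access keys are picked up from the standard credential chain (environment variables or ~/.aws/credentials); the profile name below is a placeholder:

import boto3

# Uses the access key ID and secret access key stored under the "dev" profile.
session = boto3.Session(profile_name="dev")
s3 = session.client("s3")
dynamodb = session.resource("dynamodb")
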
26
Q

You are working as a team lead for your company. You have been told to manage the blue/green deployment methodology for one of the applications. Which of the following are some of the approaches for implementing this methodology? Choose 2.

A. Use Auto Scaling groups to scale on demand for both deployments.
B. Use Route 53 with weighted routing policies.
C. Use Route 53 with latency routing policies.
D. Use Elastic Beanstalk with the swap URL feature.

A

B & D. Use Route 53 with weighted routing policies; use Elastic Beanstalk with the swap URL feature.

Weighted routing lets you associate multiple resources with a single domain name (example.com) or subdomain name (acme.example.com) and choose how much traffic is routed to each resource. This can be useful for a variety of purposes, including load balancing and testing new versions of software.

Because AWS Elastic Beanstalk performs an in-place update when you update your application versions, your application can become unavailable to users for a short period of time. You can avoid this downtime by performing a blue/green deployment, where you deploy the new version to a separate environment, and then swap CNAMEs of the two environments to redirect traffic to the new version instantly.
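
For the Elastic Beanstalk approach, the cutover is a single CNAME swap; a boto3 sketch with placeholder environment names:

import boto3

eb = boto3.client("elasticbeanstalk")

# Redirect traffic from the blue environment to the green one instantly.
eb.swap_environment_cnames(
    SourceEnvironmentName="my-app-blue",        # placeholder
    DestinationEnvironmentName="my-app-green",  # placeholder
)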