***NOT READY FOR STUDY*** AWS DEV-A Practice Exam 3 Flashcards
Time to Live (TTL) for Amazon DynamoDB lets you ________ when items in a table ________ so that they can be automatically deleted from the database.
define ; expire
When Time to Live (TTL) is enabled on a table in Amazon DynamoDB, a background job _____ __ _______ attribute of items to determine whether they are expired.
checks the TTL
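The TTL attribute must hold the expiration time as a Unix epoch timestamp in seconds. A minimal sketch of computing one (the attribute name `expireAt` is an assumed example, not required by DynamoDB):

```python
import time

def ttl_epoch(seconds_from_now):
    """Return a Unix epoch timestamp (in seconds) suitable as a DynamoDB TTL value."""
    return int(time.time()) + seconds_from_now

# Item with a TTL attribute; DynamoDB deletes it some time after expiry.
item = {
    "pk": {"S": "session#123"},
    "expireAt": {"N": str(ttl_epoch(3600))},  # expire roughly one hour from now
}
```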
CodeBuild ________ your source code, ____ unit tests, and ________ artifacts that are ready to deploy.
compiles ; runs ; produces
CodeDeploy is a deployment service that automates application deployments to Amazon EC2 instances, on-premises instances, serverless Lambda functions, or Amazon ECS services
A Developer needs to create an instance profile for an Amazon EC2 instance using the AWS CLI. How can this be achieved? (Select THREE.)
aws iam create-instance-profile --instance-profile-name EXAMPLEPROFILENAME
aws iam add-role-to-instance-profile --instance-profile-name EXAMPLEPROFILENAME --role-name EXAMPLEROLENAME
aws ec2 associate-iam-instance-profile --iam-instance-profile Name=EXAMPLEPROFILENAME --instance-id i-012345678910abcde
Lambda: In synchronous invocations, the caller
waits for the function to complete execution and the function can return a value.
Lambda: In asynchronous invocations, Lambda
places the event on an internal queue, which is then processed by the Lambda function; the caller does not wait for the result.
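The two invocation models above can be contrasted with a toy local simulation (this is a conceptual sketch, not the real Lambda service):

```python
from queue import Queue

def handler(event):
    return {"ok": event["n"] * 2}

# Synchronous: the caller blocks and receives the function's return value.
result = handler({"n": 21})          # caller waits; result == {"ok": 42}

# Asynchronous: the event is queued and processed later; the caller gets
# no return value, only an acknowledgment that the event was accepted.
event_queue = Queue()
event_queue.put({"n": 21})           # the "invoke" returns immediately

# ...later, the service drains the queue and runs the function:
while not event_queue.empty():
    handler(event_queue.get())       # return value is discarded
```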
A Lambda authorizer (formerly known as a custom authorizer) is an API Gateway feature that uses a Lambda function to control access to your API.
A Lambda authorizer is useful if you want to implement a custom authorization scheme that uses bearer token authentication, or that uses request parameters to determine the caller's identity.
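A TOKEN-type authorizer handler returns an IAM policy document allowing or denying the call. A minimal sketch (the token value and principal ID are placeholder assumptions):

```python
def authorizer_handler(event, context=None):
    """Minimal TOKEN-type Lambda authorizer: allow only if the bearer token matches."""
    token = event.get("authorizationToken", "")
    effect = "Allow" if token == "Bearer secret-token" else "Deny"
    return {
        "principalId": "user|demo",          # placeholder caller identity
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event.get("methodArn", "*"),
            }],
        },
    }
```

API Gateway evaluates the returned policy to decide whether to forward the request to the backend.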
Data key caching can improve performance, reduce cost, and help you stay within service limits as your application scales.
How does Data key caching keep you from exceeding KMS service limits?
When you encrypt or decrypt data, the AWS Encryption SDK looks for a matching data key in the cache. If it finds a match, it uses the cached data key rather than generating a new one.
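The lookup described above can be sketched as a toy cache (this stands in for the SDK's `CryptoMaterialsCache`; the real cache also enforces age and usage limits):

```python
import os

class DataKeyCache:
    """Toy sketch of data key caching: reuse a cached key instead of
    making a new GenerateDataKey request to KMS on every operation."""

    def __init__(self):
        self._cache = {}
        self.kms_calls = 0  # stands in for billable KMS requests

    def get_data_key(self, encryption_context):
        cache_key = tuple(sorted(encryption_context.items()))
        if cache_key not in self._cache:        # cache miss: "call KMS"
            self.kms_calls += 1
            self._cache[cache_key] = os.urandom(32)
        return self._cache[cache_key]           # cache hit: no KMS request
```

Fewer KMS requests is exactly how caching keeps you under the KMS request-rate limits.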
Which is more secure, an AWS or Customer Managed IAM policy?
The customer-managed policy is more secure, since it can be locked down with fine granularity, scoped to only the specific resources and actions required. AWS-managed policies are generic and typically grant more privileges than necessary.
You should initialize SDK clients and database connections outside of the function handler, and cache static assets locally in the /tmp directory.
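Module scope runs once per execution environment (at cold start), so anything initialized there is reused across warm invocations. A sketch with a counter standing in for an expensive client (e.g. a boto3 client):

```python
# Module scope: runs once per execution environment (cold start).
INIT_COUNT = 0

def make_client():
    global INIT_COUNT
    INIT_COUNT += 1
    return object()          # stand-in for e.g. boto3.client("dynamodb")

client = make_client()       # initialized once, outside the handler

def lambda_handler(event, context=None):
    # Reuses the module-level client; no per-invocation setup cost.
    return {"client_id": id(client)}
```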
You can point an alias to multiple versions of your function code and then assign a weighting to direct certain amounts of traffic to each version.
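The weighted routing an alias performs can be sketched locally (this simulates the traffic split; the real feature is configured on the alias itself, e.g. 90% to version 1 and 10% to version 2):

```python
import random

def route(weights, rng):
    """Pick a function version according to alias weights,
    e.g. {"1": 0.9, "2": 0.1} sends ~10% of traffic to version 2."""
    r = rng.random()
    cumulative = 0.0
    for version, weight in weights.items():
        cumulative += weight
        if r < cumulative:
            return version
    return version  # fall through to the last version

rng = random.Random(0)       # seeded for a repeatable demo
counts = {"1": 0, "2": 0}
for _ in range(1000):
    counts[route({"1": 0.9, "2": 0.1}, rng)] += 1
# counts["2"] lands near 100 of the 1000 simulated requests
```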
The write-through strategy adds data or updates data in the cache whenever data is written to the database.
The advantages of write-through as a writing policy are:
Data in the cache is never stale, because the cache is updated every time data is written to the database.
The tradeoff is a write penalty instead of a read penalty: every write involves two trips, one to the cache and one to the database.
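The strategy above can be sketched with two dicts standing in for the cache and the database:

```python
db, cache = {}, {}

def write_through(key, value):
    """Write-through: every database write also updates the cache,
    so reads never see stale data (at the cost of two writes)."""
    db[key] = value      # write to the backing store...
    cache[key] = value   # ...and to the cache, in the same operation

def read(key):
    return cache[key]    # cache is always in sync with the database
```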
Lazy loading is a caching strategy that loads data into the cache only when necessary.
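A minimal lazy-loading (cache-aside) sketch, again with dicts standing in for the cache and the database:

```python
db = {"user:1": "alice"}
cache = {}

def get(key):
    """Lazy loading: populate the cache only when a read misses."""
    if key in cache:
        return cache[key]        # cache hit
    value = db[key]              # cache miss: read from the database...
    cache[key] = value           # ...and cache it for next time
    return value
```

Only requested data is ever cached, but the first read of each key pays a miss penalty, and cached data can go stale if the database changes underneath it.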
Concurrency is the number of requests that your function is serving at any given time
If the function is invoked again while a request is still being processed, another instance is allocated, which increases the function’s concurrency.
When requests come in faster than your function can scale, or when your function is at maximum concurrency, additional requests fail with a throttling error (429 status code).
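The scaling-and-throttling behavior can be sketched as a counter with a hard cap (a conceptual model, not the real Lambda scaling logic):

```python
class Throttled(Exception):
    """Stands in for Lambda's 429 TooManyRequestsException."""

class FunctionPool:
    def __init__(self, max_concurrency):
        self.max_concurrency = max_concurrency
        self.in_flight = 0

    def invoke(self):
        if self.in_flight >= self.max_concurrency:
            raise Throttled("429: concurrency limit reached")
        self.in_flight += 1   # another instance is serving a request

    def complete(self):
        self.in_flight -= 1   # instance freed for the next request
```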
Concurrency is subject to Regional quotas shared by all functions in a Region. What is the burst concurrency quota in
US West (Oregon), US East (N. Virginia), and Europe (Ireland)?
3000
To calculate the concurrency requirements, multiply the invocation ___________ per second (50) by the average ___________ ______ in seconds.
To calculate the concurrency requirements, multiply the invocation requests per second (50) by the average execution time in seconds.
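Worked through with the numbers from the card (assuming a 100 ms average execution time):

```python
requests_per_second = 50
average_duration_seconds = 0.1   # assumed 100 ms average execution time

# Required concurrency = request rate x average duration.
required_concurrency = requests_per_second * average_duration_seconds
# required_concurrency is 5.0 concurrent executions
```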