Module 11: Caching Content Flashcards
What is caching?
- A high-speed data storage layer
- A way to store passwords
- A global network for content distribution
- An in-memory database
- A high-speed data storage layer
(The answer is not “An in-memory database”.
A cache contains only a partial copy of the data and is not considered a database.)
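To make the idea concrete, here is a minimal, illustrative sketch (not tied to any AWS service) of a cache lookup: the application checks a fast in-memory layer first and falls back to the slower source of truth only on a miss. The load_from_database function and the key format are hypothetical.

```python
import time

# Hypothetical slow source of truth (stands in for a database or API call).
def load_from_database(key):
    time.sleep(0.1)  # simulate a slow query
    return f"value-for-{key}"

cache = {}  # the high-speed data storage layer

def get(key):
    # Serve from the cache when possible; otherwise fetch and populate it.
    if key in cache:
        return cache[key]            # cache hit: no round trip to the database
    value = load_from_database(key)  # cache miss: go to the source of truth
    cache[key] = value
    return value

print(get("user:42"))  # miss: ~100 ms
print(get("user:42"))  # hit: microseconds
```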
Which types of data should you cache?
- Data that can be retrieved quickly with simple queries
- Dynamically generated web content
- Static data that is frequently accessed
- Specialized data that is needed by a subset of users
- Static data that is frequently accessed
(Good candidates for caching include data that does not change often and that users access frequently.)
What is a benefit of caching?
- Reduced response latency
- Load balancing the application
- Increased application reliability
- Decreased costs
- Reduced response latency
(Caching can reduce or eliminate trips through slow resources between the requester and the content. It can reduce network hops, eliminate the need to communicate over slow links or through slow devices, or reduce load on the origin.)
What does Amazon CloudFront enable?
- Bidirectional caching between users and an origin host
- Multi-tiered and regional caching of content
- Transactional processing with an in-memory database
- Automatic creation of a time-to-live value
- Multi-tiered and regional caching of content
(CloudFront provides a multi-tiered cache so that you can have multiple rules for when content expires. It also provides Regional caching, which reduces latency by reducing the number of network hops.)
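As a rough illustration of the expiration rules mentioned above, the hedged boto3 sketch below creates a CloudFront cache policy with explicit TTL values. The policy name and TTLs are made up for the example; check the CreateCachePolicy API reference for the required fields in your boto3 version.

```python
import boto3

# CloudFront is a global service; its API is served from us-east-1.
cloudfront = boto3.client("cloudfront", region_name="us-east-1")

# Example policy: cache objects for one day by default, up to one week.
response = cloudfront.create_cache_policy(
    CachePolicyConfig={
        "Name": "example-one-day-ttl",            # placeholder name
        "Comment": "Example policy with explicit TTLs",
        "MinTTL": 60,                              # seconds
        "DefaultTTL": 86400,                       # one day
        "MaxTTL": 604800,                          # one week
        "ParametersInCacheKeyAndForwardedToOrigin": {
            "EnableAcceptEncodingGzip": True,
            "EnableAcceptEncodingBrotli": True,
            "HeadersConfig": {"HeaderBehavior": "none"},
            "CookiesConfig": {"CookieBehavior": "none"},
            "QueryStringsConfig": {"QueryStringBehavior": "none"},
        },
    }
)
print(response["CachePolicy"]["Id"])
```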
How does Amazon CloudFront use edge locations?
- It caches all content from the origin at edge locations, and delivers the content to clients through the fastest edge location.
- It caches local content at edge locations. It delivers the cached content to clients through the edge location that requires the fewest network hops to reach those clients.
- It caches frequently accessed content at edge locations. It delivers the cached content to clients through the edge location with the lowest latency to those clients.
- It caches Regional data at Regional edge locations, and delivers the content to clients through their Regional edge locations.
- It caches frequently accessed content at edge locations. It delivers the cached content to clients through the edge location with the lowest latency to those clients.
(The answer is not, “It caches local content at edge locations. It delivers the cached content to clients through the edge location that requires the fewest network hops to reach those clients.”
Although fewer network hops often result in lower latency, that is not always the case. Latency, not the number of network hops, determines the nearest edge location.)
Where is application session data cached when using sticky sessions?
- Web server
- Web browser
- Elastic Load Balancing load balancer
- Amazon CloudFront
- Web server
(Sticky sessions is a feature of Elastic Load Balancing. It directs repeated requests from the same client to the same server, which enables the server to cache session data.)
(The answer is not “Web browser”.
A web browser can maintain session information, but it does so by using mechanisms such as HTTP cookies. Sticky sessions enable an application to avoid managing its own browser-based cookies.)
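For reference, the sketch below shows one way to turn on sticky sessions for an Application Load Balancer target group with boto3. The Region, target group ARN, and cookie duration are placeholders.

```python
import boto3

elbv2 = boto3.client("elbv2", region_name="us-east-1")  # Region is an example

# Placeholder ARN: substitute your own target group.
target_group_arn = (
    "arn:aws:elasticloadbalancing:us-east-1:123456789012:"
    "targetgroup/example/0123456789abcdef"
)

# Enable duration-based stickiness so repeated requests from the same client
# are routed to the same target, letting that server cache session data.
elbv2.modify_target_group_attributes(
    TargetGroupArn=target_group_arn,
    Attributes=[
        {"Key": "stickiness.enabled", "Value": "true"},
        {"Key": "stickiness.type", "Value": "lb_cookie"},
        {"Key": "stickiness.lb_cookie.duration_seconds", "Value": "86400"},  # 1 day
    ],
)
```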
Which statement best describes an efficient way to deliver on-demand streaming content by using Amazon CloudFront?
- CloudFront does not work with streaming content.
- A best practice is to create separate origin servers for each Region where you serve streaming content.
- A best practice is to create distributions for each Region where you serve content.
- A best practice is to create video segments and store them in an Amazon S3 bucket. Then, use CloudFront to cache the segments.
- A best practice is to create video segments and store them in an Amazon S3 bucket. Then, use CloudFront to cache the segments.
(First create video segments with an encoder and store them in Amazon S3, which is the origin for CloudFront. Then, create a distribution for the video content, which CloudFront can cache. This method is the most operationally efficient.)
(The answer is not “A best practice is to create separate origin servers for each Region where you serve streaming content.”
This architecture would be difficult to maintain and is not necessary when you use CloudFront.)
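As a hedged sketch of this approach, the snippet below uploads encoder-produced HLS segments and a manifest to an S3 bucket that would serve as the CloudFront origin. The bucket name, local directory, key prefix, and Cache-Control values are placeholders.

```python
import boto3
from pathlib import Path

s3 = boto3.client("s3")
bucket = "example-video-origin-bucket"  # placeholder bucket name

# Upload the HLS manifest and its video segments with content types that
# players expect; CloudFront then caches these objects at edge locations.
def upload_hls_output(output_dir, prefix="videos/demo/"):
    for path in Path(output_dir).iterdir():
        if path.suffix == ".m3u8":
            content_type = "application/vnd.apple.mpegurl"
            cache_control = "max-age=60"        # manifests change as segments are added
        elif path.suffix == ".ts":
            content_type = "video/MP2T"
            cache_control = "max-age=31536000"  # segments are immutable once written
        else:
            continue
        s3.upload_file(
            str(path),
            bucket,
            prefix + path.name,
            ExtraArgs={"ContentType": content_type, "CacheControl": cache_control},
        )

upload_hls_output("./hls_output")  # directory produced by the encoder (placeholder)
```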
What is Amazon DynamoDB Accelerator (DAX)?
- A fully managed, highly available in-memory cache for DynamoDB
- A feature of DynamoDB that automatically adjusts read/write capacity to handle load
- A fully managed, highly available cache that is backed by DynamoDB
- A feature of DynamoDB that enables fast lookup of items by using secondary keys
- A fully managed, highly available in-memory cache for DynamoDB
(DAX delivers a performance improvement of up to 10 times, even at millions of requests per second. It does not require cache invalidation management, data population, or cluster management.)
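The minimal sketch below assumes the amazon-dax-client Python package, whose client mirrors the low-level boto3 DynamoDB interface. The constructor argument, cluster endpoint, table name, and key are all placeholders; check the package documentation for the exact signature in your version.

```python
# Requires: pip install amazon-dax-client boto3
from amazondax import AmazonDaxClient

# Placeholder DAX cluster endpoint; reads are served from the in-memory cache
# and fall through to DynamoDB on a miss.
dax = AmazonDaxClient(
    endpoint_url="dax://my-cluster.abc123.dax-clusters.us-east-1.amazonaws.com"
)

# Same call shape as the boto3 DynamoDB client's get_item.
response = dax.get_item(
    TableName="ExampleTable",          # placeholder table
    Key={"pk": {"S": "user#42"}},      # placeholder key
)
print(response.get("Item"))
```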
How can an application use Amazon ElastiCache to improve database read performance? (Select TWO.)
- Read data from the database first and write the most frequently read data to ElastiCache when a cache miss occurs.
- Direct all read requests to the database and configure it to read from ElastiCache when a cache miss occurs.
- Read data from ElastiCache first and write to ElastiCache when a cache miss occurs.
- Write data to ElastiCache whenever the application writes to the database.
- Replicate the database in ElastiCache, and direct all reads to ElastiCache and all writes to the database.
- Read data from ElastiCache first and write to ElastiCache when a cache miss occurs.
- Write data to ElastiCache whenever the application writes to the database.
(Writing data to the cache only when a cache miss occurs is called lazy loading. Writing data to the cache every time that the application writes data to the database is called write-through.)
(The answer is not a combination of:
“Read data from the database first and write the most frequently read data to ElastiCache when a cache miss occurs” and
“Direct all read requests to the database and configure it to read from ElastiCache when a cache miss occurs.”
An application that always reads from the database first is not using the cache. The application should read from the cache first and fetch from the database only when a cache miss occurs. Offloading all read requests to a copy of the database is the job of a read replica.)
Which role does Amazon CloudFront play in protecting against distributed denial of service (DDoS) attacks?
- Routes traffic through edge locations
- Controls traffic by the source IP address of requests
- Restricts traffic by geography to help block attacks that originate from specific countries
- Performs deep packet inspection to detect attacks
- Routes traffic through edge locations
(The answer is not “Controls traffic by the source IP address of requests”.
That action is a function of AWS WAF, which can be used together with CloudFront and Amazon Route 53 to help stop DDoS attacks.)
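For context on the AWS WAF mechanism mentioned above, the hedged sketch below creates an IP set scoped to CloudFront that a web ACL rule could then use to block requests from listed source addresses. The name and address range are placeholders.

```python
import boto3

# CLOUDFRONT-scoped WAF resources must be created in us-east-1.
wafv2 = boto3.client("wafv2", region_name="us-east-1")

response = wafv2.create_ip_set(
    Name="example-blocked-sources",        # placeholder name
    Scope="CLOUDFRONT",
    Description="Source IPs to block in a web ACL rule",
    IPAddressVersion="IPV4",
    Addresses=["203.0.113.0/24"],          # placeholder CIDR range
)
print(response["Summary"]["ARN"])
```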