ElastiCache Flashcards
What are some key differences between Redis and Memcached?
Memcached: key/value pairs. Multithreaded. Good for simpler cache use cases.
Redis: complex data types and structures. Can be persisted to disk, and can sort and rank data.
What are the 2 engines used by ElastiCache?
Redis and Memcached
Can a write-through cache be stale?
No. With a write-through cache, as data is written to the database it is also written to the cache, so the cache is always current
Is ElastiCache multi-AZ for both Redis and Memcached?
Both engines can place nodes across multiple AZs, but only Redis supports Multi-AZ with automatic failover
I need a multithreaded in-memory cache - would I use Memcached or Redis?
Memcached is multithreaded
Which caching engine supports complex data types, key store persistence and read replicas for read-intensive applications?
Redis
Would we expect reads in an application using a write-through cache to be faster or slower than those for lazy loading?
Faster, as the cache is kept up to date
What is a DISADVANTAGE of using lazy loading (aka cache aside/lazy population)?
Data can become stale when using a lazy loading strategy. Because lazy loading is read-based, if it finds a hit in the cache it will use it - however, if the underlying data in the database has changed, the cached data will be out of date.
Also, there is a read penalty incurred with lazy loading.
For lazy loading, how many round trips occur, and to where, when there is a cache miss?
Three. The first round trip is to the cache, which returns the miss. The second is to the database to select the data. The third is back to the cache to populate it.
Describe this code and the caching strategy it reflects:
def get_user(user_id):
    record = cache.get(user_id)
    if record is None:
        # Cache miss: read from the database, then populate the cache
        record = db.query("select * from users where Id=?", user_id)
        cache.set(user_id, record)
        return record
    else:
        # Cache hit: return straight from the cache
        return record

user = get_user(12)
This represents a lazy loading cache strategy. Here, we attempt to retrieve the user's record from the cache using user_id as the key. If it is a miss (record is None), we execute a select against the db, write the result back to the cache, and then return it from the function
Does Memcached allow for data persistence?
No, it is in-memory only. If the node goes down, the data is lost
What scaling actions will you need to take if you are seeing a large number of cache evictions due to memory constraints?
It is likely you will need to scale up (bigger nodes) or scale out (more nodes).
Who is responsible for a cache invalidation strategy?
As the developer, you are.
When starting a Memcached ElastiCache cluster, will the cache be empty? What about for Redis?
Memcached will always start empty. Redis can be restored from a backup
Which ElastiCache engine is multithreaded?
Memcached
Both ElastiCache and DynamoDB can be used to store session data for a web application, and both use key/value pairs. Given this, why would you use one over the other?
You would use ElastiCache if you specifically needed an in-memory datastore.
You would use DynamoDB if you needed a SERVERLESS solution that can scale automatically.
Which ElastiCache engine supports read replicas and which supports sharding?
Redis supports read replicas, Memcached supports sharding
Name 2 use cases for ElastiCache
DB offload (for reads), Session management (for stateless applications)
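For the session-management use case, each stateless web server reads and writes sessions by key in the shared cache. A minimal sketch of the idea, assuming the redis-py client; the endpoint hostname, key names and the 30-minute TTL are illustrative, not specific values from these cards:

import json
import redis

# Hypothetical ElastiCache for Redis endpoint
cache = redis.Redis(host="my-cache.example.cache.amazonaws.com", port=6379)

def save_session(session_id, data, ttl_seconds=1800):
    # Store the session with a TTL so abandoned sessions expire on their own
    cache.setex(f"session:{session_id}", ttl_seconds, json.dumps(data))

def load_session(session_id):
    raw = cache.get(f"session:{session_id}")
    return json.loads(raw) if raw else None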
What are TWO disadvantages of a write through cache?
- While data cannot be stale, it can be missing until it is next added or updated (nothing reaches the cache except via a write). This can be mitigated by using a write-through cache in conjunction with lazy loading
- Cache churn - a lot of data will be written to the cache, but it may never be read. This can be a problem if you have a small cache (see the TTL sketch below)
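One common way to limit churn is to give write-through entries a TTL, so items that are written but never read expire instead of accumulating. A minimal sketch, assuming the redis-py client; the endpoint hostname, the 300-second TTL and the update_user_in_db helper are placeholders in the same spirit as the db object in the code cards below:

import json
import redis

# Hypothetical ElastiCache for Redis endpoint
cache = redis.Redis(host="my-cache.example.cache.amazonaws.com", port=6379)

def save_user(user_id, values):
    record = update_user_in_db(user_id, values)  # placeholder for the database write
    # Write through with a TTL so rarely-read entries eventually fall out of the cache
    cache.setex(f"user:{user_id}", 300, json.dumps(record))
    return record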
How many Memcached nodes can you have per cluster? How many for Redis? Which supports replication groups?
20 for Memcached, 1 for Redis. Redis clusters can be grouped into replication groups
What are the 2 key caching strategies you can use?
Lazy Loading and Write Through
There are 3 ways to evict an item from cache. What are they?
- Explicitly delete the item
- Eviction due to high memory utilisation
- TTL (time to live) expiry
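The first and third of these are driven by your application code; eviction under memory pressure is governed by the engine's maxmemory policy rather than anything you call. A minimal sketch of the application-driven cases, assuming the redis-py client and a hypothetical endpoint and key:

import redis

# Hypothetical ElastiCache for Redis endpoint
cache = redis.Redis(host="my-cache.example.cache.amazonaws.com", port=6379)

# 1. Explicitly delete an item (e.g. after the underlying record changes)
cache.delete("user:12")

# 3. Set a TTL so the item expires on its own after 10 minutes
cache.set("user:12", "cached-value", ex=600)
# ...or add/refresh a TTL on an existing key
cache.expire("user:12", 600)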
What sort of penalty does lazy loading incur - read or write?
Read, due to the 3 round trips needed on a cache miss
Name two ADVANTAGES of lazy loading
- You only have required data in the cache, so it isn't filled with unused data. This is because the cache is populated on a MISS
- Node failures are not fatal. If a node goes down, there will be increased latency while the nodes are "warmed" (i.e. caches are repopulated), but the application keeps working
Describe this code and the caching strategy it reflects:
def save_user(user_id, values):
    # Write to the database first...
    record = db.query("update users ... where id=?", user_id, values)
    # ...then write the same record through to the cache
    cache.set(user_id, record)
    return record

user = save_user(17, {"name": "Scott Tiger"})
This reflects a write-through caching strategy: as soon as a record is written to the database, it is also written to the cache
You need to implement a caching solution within your application that allows for backup and restore. Would you use Memcached or Redis?
Redis supports backup and restore
If I wanted a caching engine which allowed for pub/sub capability - which would I use, Memcached or Redis?
Redis.
Which caching engine allows for horizontal scalability and why?
Memcached, as it is multithreaded and allows up to 20 nodes per cluster.
I need a caching engine that allows for multi-AZ operation for failover. Which would I use?
Redis.