Caching Layers: Flashcards

1
Q

What is caching in the context of databases?

A

Caching stores frequently accessed data in a fast, in-memory store (like Redis or Memcached) to reduce database load and improve performance by avoiding repeated database queries.

2
Q

Why would you use caching in a system?

A

Caching reduces the time to fetch data, decreases database load, and enhances system performance, especially for frequently accessed or computationally expensive data.

3
Q

What is the difference between Redis and Memcached?

A

Redis: Supports complex data structures (lists, sets, sorted sets, hashes), optional persistence to disk, replication, and pub/sub.

Memcached: A simpler in-memory key-value store with no built-in persistence, designed for straightforward, high-throughput caching.

4
Q

When is Redis preferred for caching?

A

Redis is preferred when you need more advanced data structures, persistence, or features like pub/sub and replication, in addition to basic caching.

5
Q

When is Memcached preferred for caching?

A

Memcached is better for simple, high-performance caching of small key-value pairs when persistence and complex data structures are not needed.

6
Q

What is write-through caching?

A

In write-through caching, data is written to both the cache and the database simultaneously. It ensures the cache is always up-to-date but adds latency to writes.

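The write-through flow can be sketched with plain dictionaries standing in for the cache and the database (illustrative stand-ins, not a real Redis or SQL client):

```python
# Write-through sketch: every write updates the backing store and the
# cache in the same operation, so the two never diverge.

cache = {}     # stand-in for an in-memory cache (e.g. Redis)
database = {}  # stand-in for the database

def write_through(key, value):
    database[key] = value  # synchronous database write (adds latency)
    cache[key] = value     # cache updated in the same step

def read(key):
    # Reads can always be served from the cache, which is never stale.
    return cache.get(key)

write_through("user:1", {"name": "Ada"})
```

The extra write latency comes from the synchronous `database[key] = value` step, which in a real system is a network round-trip.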
7
Q

What is write-behind caching?

A

In write-behind (write-back) caching, data is written to the cache first and asynchronously written to the database later, improving write performance but increasing the risk of data loss in case of cache failure.

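A minimal write-behind sketch, again using dictionaries as stand-ins and a queue for the deferred database writes (a real system would flush from a background thread or timer):

```python
import queue

cache = {}                   # stand-in for the in-memory cache
database = {}                # stand-in for the database
write_queue = queue.Queue()  # pending database writes

def write_behind(key, value):
    cache[key] = value             # fast: only the cache is touched
    write_queue.put((key, value))  # database write deferred

def flush():
    # Drains pending writes to the database. If the process dies before
    # this runs, the queued writes are lost -- the data-loss risk.
    while not write_queue.empty():
        key, value = write_queue.get()
        database[key] = value

write_behind("user:1", {"name": "Ada"})
flush()  # in practice triggered periodically, not inline
```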
8
Q

What are cache eviction policies?

A

Cache eviction policies determine which data is removed from the cache when it reaches its capacity. Common strategies include LRU (Least Recently Used) and LFU (Least Frequently Used).

9
Q

What is the Least Recently Used (LRU) eviction policy?

A

LRU eviction removes the data that has not been accessed for the longest time when the cache reaches its capacity. It assumes that data not recently used is less likely to be used soon.

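LRU is commonly built on an ordered structure where a key is moved to the "most recent" end on every access; a compact sketch using `collections.OrderedDict`:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least recently used key at capacity."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # order tracks recency of use

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # "a" becomes most recently used
cache.put("c", 3)  # evicts "b", the least recently used key
```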
10
Q

What is the Least Frequently Used (LFU) eviction policy?

A

LFU eviction removes the data that has been accessed the fewest number of times. It assumes that less frequently used data is less important.

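An LFU sketch that tracks an access count per key; production implementations use frequency buckets for O(1) eviction, but a linear scan keeps the idea visible:

```python
from collections import Counter

class LFUCache:
    """Minimal LFU cache: evicts the key with the fewest accesses."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}
        self.counts = Counter()  # access count per cached key

    def get(self, key):
        if key not in self.data:
            return None
        self.counts[key] += 1
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            # Evict the least frequently used key (linear scan for clarity).
            victim = min(self.counts, key=self.counts.get)
            del self.data[victim]
            del self.counts[victim]
        self.data[key] = value
        self.counts[key] += 1

cache = LFUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # "a" now has a higher access count than "b"
cache.put("c", 3)  # evicts "b", the least frequently used key
```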
11
Q

What is the Cache Aside strategy?

A

In Cache Aside, the application checks the cache first. If the data is not found (cache miss), the application loads it from the database, stores it in the cache, and then serves the request.

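The cache-aside lookup reads as a few lines of application code; here `slow_db_fetch` is a hypothetical stand-in for an expensive database query, and a counter shows that the second read never reaches it:

```python
cache = {}
db_calls = 0  # counts how often the "database" is actually hit

def slow_db_fetch(key):
    global db_calls
    db_calls += 1
    return f"value-for-{key}"  # pretend this was an expensive query

def get(key):
    if key in cache:            # cache hit: serve from cache
        return cache[key]
    value = slow_db_fetch(key)  # cache miss: load from the database
    cache[key] = value          # populate the cache for next time
    return value

first = get("user:1")   # miss: hits the database
second = get("user:1")  # hit: served from the cache
```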
12
Q

What are the pros and cons of write-through caching?

A

Pros:
- Always consistent between cache and database.
- No stale data in the cache.

Cons:
- Higher write latency due to simultaneous database and cache writes.
- Does not reduce database write load, since every write still reaches the database.

13
Q

What are the pros and cons of write-behind caching?

A

Pros:
- Lower write latency, since writes return as soon as the cache is updated.
- Database writes can be batched or coalesced, reducing load.

Cons:
- Risk of data loss if the cache fails before the data is written to the database.
- More complex to manage.

14
Q

What are the pros and cons of the LRU eviction policy?

A

Pros:
- Simple to implement.
- Works well for workloads where recently accessed data is likely to be accessed again.

Cons:
- May evict frequently used but older data.
- Can be inefficient for workloads with varying access patterns.

15
Q

What are the pros and cons of the LFU eviction policy?

A

Pros:
- Keeps the most popular data in the cache.
- Efficient for workloads where frequently accessed data remains important over time.

Cons:
- More complex to implement.
- May keep old data that is no longer relevant but was frequently accessed in the past.

16
Q

What is cache invalidation?

A

Cache invalidation is the process of removing or updating stale data in the cache to ensure that outdated information is not served to clients. This can be triggered manually or automatically based on expiration times.

17
Q

What is a Time-to-Live (TTL) in caching?

A

TTL is a cache expiration setting that automatically removes cached data after a specified period, ensuring that stale data is not served to users.
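A TTL sketch using lazy expiration (the entry is evicted when a read finds it expired), with a dictionary standing in for the cache; the key name and timings are illustrative:

```python
import time

cache = {}  # key -> (value, expiry timestamp)

def set_with_ttl(key, value, ttl_seconds):
    cache[key] = (value, time.monotonic() + ttl_seconds)

def get(key):
    entry = cache.get(key)
    if entry is None:
        return None
    value, expires_at = entry
    if time.monotonic() >= expires_at:
        del cache[key]  # lazy expiration: evict on read
        return None
    return value

set_with_ttl("session:42", "active", ttl_seconds=0.05)
before = get("session:42")  # still fresh
time.sleep(0.1)
after = get("session:42")   # expired and evicted
```

Real caches like Redis combine this lazy check with periodic background sweeps so that untouched keys also expire.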

18
Q

What is the difference between a cache hit and a cache miss?

A
  • Cache Hit: Data is found in the cache, and the request is served from the cache.
  • Cache Miss: Data is not found in the cache, and the request is served from the database.
19
Q

What is the difference between a hot and cold cache?

A
  • Hot Cache: The cache has been populated with frequently requested data, leading to high cache hit rates.
  • Cold Cache: The cache is empty or has little useful data, leading to more cache misses.
20
Q

What is a cache stampede, and how can it be prevented?

A

A cache stampede occurs when multiple requests try to retrieve the same data after a cache miss, overwhelming the database. It can be mitigated by locking the cache during data fetching or using techniques like request coalescing to prevent duplicate fetches.
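The lock-based mitigation can be sketched with a double-checked lookup: only the first caller fetches from the database, and waiters re-check the cache after acquiring the lock (`slow_db_fetch` is a hypothetical stand-in for the expensive query):

```python
import threading

cache = {}
db_calls = 0
lock = threading.Lock()

def slow_db_fetch(key):
    global db_calls
    db_calls += 1
    return f"value-for-{key}"

def get(key):
    value = cache.get(key)
    if value is not None:
        return value
    with lock:
        # Re-check after acquiring the lock: another thread may have
        # already populated the cache while we were waiting.
        value = cache.get(key)
        if value is None:
            value = slow_db_fetch(key)
            cache[key] = value
    return value

# Ten concurrent requests for the same missing key -> one database fetch.
threads = [threading.Thread(target=get, args=("user:1",)) for _ in range(10)]
for t in threads: t.start()
for t in threads: t.join()
```

A single global lock serializes all misses; real systems typically use a per-key lock or a probabilistic early refresh instead.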