ElastiCache & Caching Strategies
Difficulty: medium
Overview
Amazon ElastiCache is a fully managed in-memory caching service (Redis and Memcached).
Redis vs Memcached:
| Feature | Redis | Memcached |
|---|---|---|
| Data structures | Strings, lists, sets, sorted sets, hashes | Strings only |
| Persistence | Yes (RDB, AOF) | No |
| Replication / Multi-AZ | Yes | No |
| Pub/Sub | Yes | No |
| Multi-threaded | No (command execution is single-threaded) | Yes |
Caching Strategies:
- Lazy Loading (cache-aside): Check the cache first; on a miss, fetch from the DB and populate the cache. Risks: stale data, and a cache-miss penalty (extra round trip on every miss).
- Write-Through: Update the cache on every DB write. Cache is always current. Risks: write penalty, and cache churn (caching data that is never read).
- TTL: Expire cache entries after a set period; typically combined with Lazy Loading to bound staleness.
- Write-Behind (write-back): Write to the cache, then asynchronously write to the DB. Lowest write latency. Risk: data loss if the cache node fails before the async flush.
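The Lazy Loading + TTL combination above can be sketched as follows. This is a minimal, self-contained illustration: `FakeCache` is an in-memory stand-in for a Redis client, and `query_database`, the key scheme, and the 300-second TTL are assumptions for the example, not part of the notes.

```python
import time

class FakeCache:
    """In-memory stand-in for a Redis client (GET / SET with TTL)."""
    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.time() >= expires_at:  # past its TTL: treat as a miss
            del self._store[key]
            return None
        return value

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, time.time() + ttl_seconds)

def query_database(user_id):
    # Placeholder for the real database query (hypothetical).
    return {"id": user_id, "name": f"user-{user_id}"}

cache = FakeCache()

def get_user(user_id, ttl=300):
    """Lazy loading: check the cache first; on a miss, read the DB and populate."""
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return cached                     # cache hit
    value = query_database(user_id)       # cache miss: fall back to the DB
    cache.set(key, value, ttl)            # populate so later reads are hits
    return value
```

With a real ElastiCache endpoint, `FakeCache` would be replaced by a Redis client and `cache.set(..., ttl)` by `SETEX`; the control flow is identical.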
Cache Eviction (Redis): allkeys-lru (most common), volatile-lru, allkeys-lfu, noeviction.
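On ElastiCache you cannot edit redis.conf directly; `maxmemory-policy` is set through a cache parameter group. A sketch with the AWS CLI (the parameter group name `my-redis-params` is hypothetical):

```shell
# Set the eviction policy on a custom parameter group (assumed to exist).
aws elasticache modify-cache-parameter-group \
  --cache-parameter-group-name my-redis-params \
  --parameter-name-values "ParameterName=maxmemory-policy,ParameterValue=allkeys-lru"
```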
Redis Cluster Mode:
- Disabled: One shard, primary + up to 5 replicas. Multi-AZ auto-failover.
- Enabled: Data partitioned across multiple shards. Up to 500 nodes. Scale reads and writes.
Session Store Pattern: Store sessions in Redis. Stateless EC2 instances. Sessions shared across fleet.
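A minimal sketch of the session store pattern, again using an in-memory dict in place of Redis so the example stays self-contained. All names (`create_session`, `load_session`, the 30-minute TTL) are illustrative; in production, every stateless instance would call the shared ElastiCache endpoint (e.g. `SETEX` / `GET`) so sessions survive any single instance terminating.

```python
import time
import uuid

# Stand-in for the shared Redis store; any instance pointed at the same
# ElastiCache endpoint would see the same sessions.
_sessions = {}  # session_id -> (data, expires_at)

SESSION_TTL = 1800  # 30 minutes, a typical session timeout (assumption)

def create_session(user_id):
    """Create a session and return its ID (sent to the client as a cookie)."""
    sid = str(uuid.uuid4())
    _sessions[sid] = ({"user_id": user_id}, time.time() + SESSION_TTL)
    return sid

def load_session(sid):
    """Look up a session by ID; expired or unknown IDs behave as a miss."""
    entry = _sessions.get(sid)
    if entry is None:
        return None
    data, expires_at = entry
    if time.time() >= expires_at:
        del _sessions[sid]
        return None
    return data
```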
Practice Linked Questions
Q1. A developer is choosing between ElastiCache for Redis and ElastiCache for Memcached for a session store that must persist data across node restarts. Which should be chosen and why?
Q2. A developer implements a caching layer using ElastiCache for Redis. When a cache miss occurs, the application queries the database and writes the result to the cache. What caching strategy is this?
Q3. A developer uses ElastiCache for Redis to cache database query results. After deploying a new version of the application that uses different query logic, stale data is returned. Which approach correctly invalidates affected cache entries?