Snapchat Coding Interview: Cache with TTL & Capacity
Topics:
Cache Design
LRU Cache
Hash Tables
Roles:
Software Engineer
Backend Engineer
Systems Engineer
Experience:
Entry Level
Mid Level
Senior
Question Description
You are asked to implement an in-memory cache that combines per-entry time-to-live (TTL) semantics with a fixed maximum capacity. The core responsibilities are to store key→value pairs along with an expiration timestamp, make expired entries behave as if absent, and (in the second stage) ensure inserting a new non-expired entry never leaves the cache above its max capacity.
Core behavior you must support:
- set(key, value, ttl_seconds): add or update an entry and record its expiration time (now + ttl_seconds).
- get(key): return the value only if the entry exists and its TTL hasn’t expired; otherwise return None and ensure expired entries don’t count as present.
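The two operations above can be sketched with a hash table mapping each key to a `(value, expires_at)` pair, using lazy expiry: an entry is only checked and removed when it is looked up. This is a minimal illustration, not the required solution; the injectable `clock` parameter is an assumption added here to make the behavior deterministically testable.

```python
import time


class TTLCache:
    """Minimal TTL cache: lazy expiry, no capacity limit yet."""

    def __init__(self, clock=time.monotonic):
        self._store = {}      # key -> (value, expires_at)
        self._clock = clock   # injectable clock, assumed here for testability

    def set(self, key, value, ttl_seconds):
        # Record the absolute expiration time at insert/update.
        self._store[key] = (value, self._clock() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if self._clock() >= expires_at:
            # Lazy cleanup: an expired entry behaves as if absent.
            del self._store[key]
            return None
        return value
```

With a fake clock, expiry can be exercised without sleeping, which is also a useful point to raise when the interviewer asks about testing strategy.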
How the interview typically flows:
- Implement correct TTL semantics: store expiration timestamps, remove or lazily ignore expired items, and ensure get/set follow expiration rules.
- Add a max_capacity constructor parameter and modify set so that before inserting you free space by treating expired entries as absent and, if still full, evict at least one existing non-expired entry (eviction policy may be implementation-defined).
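One way to sketch the second stage: before inserting, purge entries that have already expired, and only if the cache is still full evict an arbitrary live entry (the prompt leaves the eviction policy implementation-defined). The class and parameter names beyond `max_capacity` are illustrative assumptions.

```python
import time


class BoundedTTLCache:
    """TTL cache with a max_capacity; arbitrary eviction when still full."""

    def __init__(self, max_capacity, clock=time.monotonic):
        self.max_capacity = max_capacity
        self._store = {}      # key -> (value, expires_at)
        self._clock = clock   # injectable clock, assumed for testability

    def set(self, key, value, ttl_seconds):
        now = self._clock()
        # Step 1: free space by treating expired entries as absent.
        for k in [k for k, (_, exp) in self._store.items() if exp <= now]:
            del self._store[k]
        # Step 2: if still full (and this is a new key), evict one
        # arbitrary live entry; the policy is implementation-defined.
        if key not in self._store and len(self._store) >= self.max_capacity:
            self._store.pop(next(iter(self._store)))
        self._store[key] = (value, now + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None or self._clock() >= entry[1]:
            self._store.pop(key, None)
            return None
        return entry[0]
```

Note that updating an existing key never triggers eviction here, which is one of the edge cases worth stating out loud in the interview.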
Skill signals the interviewer will look for:
- Data-structure choices (hash table for O(1) lookup + metadata for expiration, optional min-heap or ordered set to track soonest expiry).
- Correct handling of lazy vs. eager expiry and efficient cleanup strategies.
- Eviction approach when enforcing capacity (simple arbitrary removal vs. LRU/LFU integration) and complexity analysis.
- Concurrency considerations, testing strategies, and edge cases (zero/negative TTL, updates to existing keys, clock skew).
Focus on correctness of TTL semantics first, then choose a clear, testable eviction approach for capacity.
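The "min-heap to track soonest expiry" signal mentioned above can be sketched as follows: push `(expires_at, key)` on every set, and pop expired entries eagerly before each read. Because a re-set key leaves a stale entry in the heap, each popped entry is validated against the live timestamp before deletion. This is one possible approach, not the expected answer.

```python
import heapq
import time


class HeapExpiryCache:
    """Eager expiry via a min-heap ordered by soonest expiration."""

    def __init__(self, clock=time.monotonic):
        self._store = {}   # key -> (value, expires_at)
        self._heap = []    # (expires_at, key); may contain stale entries
        self._clock = clock

    def set(self, key, value, ttl_seconds):
        expires_at = self._clock() + ttl_seconds
        self._store[key] = (value, expires_at)
        heapq.heappush(self._heap, (expires_at, key))

    def _purge(self):
        now = self._clock()
        while self._heap and self._heap[0][0] <= now:
            expires_at, key = heapq.heappop(self._heap)
            entry = self._store.get(key)
            # Skip stale heap entries: delete only if this timestamp
            # still matches the live entry (the key was not re-set).
            if entry is not None and entry[1] == expires_at:
                del self._store[key]

    def get(self, key):
        self._purge()
        entry = self._store.get(key)
        return entry[0] if entry is not None else None
```

Trade-off to mention: set becomes O(log n) and the heap can hold stale duplicates (extra space), in exchange for avoiding full-table scans during cleanup.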
Common Follow-up Questions
- How would you change the design to evict the least-recently-used (LRU) item instead of an arbitrary item when capacity is exceeded?
- Describe strategies to make this cache thread-safe for concurrent get/set calls; what locks or lock-free approaches would you use and why?
- How can you efficiently expire items without scanning the entire hash table (e.g., using a min-heap or timing wheel)? Discuss time and space trade-offs.
- If TTL values can be updated on get (touch semantics), how does that affect eviction and the data structures you choose?
- How would you extend this single-node cache to a distributed cache with consistent TTL semantics across nodes?
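For the first follow-up, Python's `collections.OrderedDict` gives O(1) recency tracking: move a key to the end on every access, and evict from the front when over capacity (preferring already-expired entries). A hedged sketch, with the same illustrative injectable-clock assumption as before:

```python
import time
from collections import OrderedDict


class LRUTTLCache:
    """TTL cache that evicts the least-recently-used live entry."""

    def __init__(self, max_capacity, clock=time.monotonic):
        self.max_capacity = max_capacity
        self._store = OrderedDict()  # key -> (value, expires_at), LRU order
        self._clock = clock

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        if self._clock() >= entry[1]:
            del self._store[key]
            return None
        self._store.move_to_end(key)  # mark as most recently used
        return entry[0]

    def set(self, key, value, ttl_seconds):
        now = self._clock()
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = (value, now + ttl_seconds)
        while len(self._store) > self.max_capacity:
            # Prefer dropping an expired entry; otherwise evict true LRU.
            expired = [k for k, (_, e) in self._store.items() if e <= now]
            if expired:
                del self._store[expired[0]]
            else:
                self._store.popitem(last=False)  # least recently used
```

The same structure generalizes to a hand-rolled hash map plus doubly linked list if the interviewer disallows `OrderedDict`.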
Related Questions
1. Implement an LRU cache with get/put and fixed capacity
2. Design an expiring key-value store that supports range eviction and TTL queries
3. Implement an LFU cache that also supports per-item TTL
4. Design a rate limiter using a sliding window or token bucket with expiration semantics