
LRU Cache Implementation - Oracle Coding Interview

Topics:
LRU Cache
Hash Table
Doubly Linked List
Roles:
Software Engineer
Backend Engineer
SWE
Experience:
Entry Level
Mid Level
Senior

Question Description

Implement a fixed-capacity Least Recently Used (LRU) cache and discuss Java-specific design considerations.

You will build an LRU cache class that exposes the standard get(key) and put(key, value) operations. get should return the stored value, or -1 if the key is absent, and must mark the accessed entry as recently used. put must insert or update a key, mark it as recently used, and, when the cache is at capacity, evict the least recently used entry before inserting a new key. Keys and values are integers for the coding stage, and capacity is a positive integer set at construction.

Interview flow: you typically start by describing a target API and invariants (O(1) access, correct LRU ordering), propose a data structure (hash table + doubly linked list), sketch node layout and operations, then implement get/put with careful pointer updates. After a correct implementation, expect follow-ups on complexity, corner cases (capacity 0, duplicate puts), and Java-specific variants.
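The hash table + doubly linked list design above can be sketched as follows. This is a minimal illustration, not the only correct layout: the sentinel-node trick and helper names (unlink, insertAfterHead, moveToFront) are one common convention, and capacity is assumed positive, as the prompt states.

```java
import java.util.HashMap;
import java.util.Map;

// LRU cache sketch: HashMap gives O(1) key -> node lookup; a doubly linked
// list with sentinel head/tail nodes keeps recency order with O(1) splices.
class LRUCache {
    private static class Node {
        int key, value;
        Node prev, next;
        Node(int key, int value) { this.key = key; this.value = value; }
    }

    private final int capacity;                       // assumed >= 1 per prompt
    private final Map<Integer, Node> map = new HashMap<>();
    private final Node head = new Node(0, 0);         // sentinel: most recent side
    private final Node tail = new Node(0, 0);         // sentinel: least recent side

    LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        Node n = map.get(key);
        if (n == null) return -1;
        moveToFront(n);                               // mark as most recently used
        return n.value;
    }

    public void put(int key, int value) {
        Node n = map.get(key);
        if (n != null) {                              // duplicate put: update + touch
            n.value = value;
            moveToFront(n);
            return;
        }
        if (map.size() == capacity) {                 // evict least recently used
            Node lru = tail.prev;
            unlink(lru);
            map.remove(lru.key);                      // node stores key for this removal
        }
        Node fresh = new Node(key, value);
        map.put(key, fresh);
        insertAfterHead(fresh);
    }

    private void unlink(Node n) {
        n.prev.next = n.next;
        n.next.prev = n.prev;
    }

    private void insertAfterHead(Node n) {
        n.next = head.next;
        n.prev = head;
        head.next.prev = n;
        head.next = n;
    }

    private void moveToFront(Node n) {
        unlink(n);
        insertAfterHead(n);
    }
}
```

Note that each node stores its own key so that eviction can remove the matching hash table entry without a reverse lookup; forgetting this is a frequent pointer-bookkeeping slip in interviews.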

Skill signals you should show: O(1) time reasoning, ability to maintain ordering with a doubly linked list, correct hash table usage for direct lookup, attention to edge cases and memory/cycle concerns. For Java follow-ups, be prepared to discuss collection choices (LinkedHashMap vs custom), thread-safety strategies (synchronization, ConcurrentHashMap, locks), GC and reference types, equals()/hashCode() implications, and how to add metrics or persistence without breaking core semantics.
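For the LinkedHashMap-vs-custom follow-up, a sketch of the standard-library route: LinkedHashMap's three-argument constructor takes an accessOrder flag, and overriding removeEldestEntry yields eviction on insert. The class name here is illustrative.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache via LinkedHashMap in access order: get() reorders entries,
// and removeEldestEntry() evicts once size exceeds capacity.
class LinkedHashLru<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LinkedHashLru(int capacity) {
        super(16, 0.75f, true);   // true = access order, not insertion order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // invoked by put(); true evicts the eldest
    }
}
```

Trade-offs worth naming: far less code and battle-tested ordering logic, but get() returns null rather than -1 (so the interview API needs a thin wrapper), and you give up control over node layout and fine-grained locking.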

Common Follow-up Questions

  • How would you adapt your LRU cache to be thread-safe in Java? Compare synchronized blocks, ReentrantLock, and using ConcurrentHashMap with additional synchronization.
  • Explain why the combined hash table + doubly linked list provides O(1) get and put. Provide a brief proof or argument about time and space complexity.
  • How do equals() and hashCode() implementations for arbitrary key objects affect your cache correctness? What invariants must callers satisfy?
  • If capacity were allowed to be zero or negative, how would your implementation behave and how would you handle those edge cases in production code?
  • How would you extend the cache to support metrics (hit/miss counts), TTL (time-to-live) for entries, or persistence without violating LRU semantics?
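For the thread-safety follow-up, one coarse-grained strategy is a single ReentrantLock guarding a non-thread-safe LRU map; the class below is an illustrative sketch of that approach, not a standard API. Finer-grained alternatives (lock striping, ConcurrentHashMap plus a separately synchronized recency structure) trade simplicity for throughput.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.locks.ReentrantLock;

// Thread-safe LRU sketch: every get/put takes one lock, so recency updates
// and evictions cannot interleave. Simple and correct, but serializes access.
class ConcurrentLru {
    private final ReentrantLock lock = new ReentrantLock();
    private final int capacity;
    private final LinkedHashMap<Integer, Integer> map;

    ConcurrentLru(int capacity) {
        this.capacity = capacity;
        this.map = new LinkedHashMap<Integer, Integer>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                return size() > ConcurrentLru.this.capacity;
            }
        };
    }

    public int get(int key) {
        lock.lock();
        try {
            Integer v = map.get(key);   // also updates access order
            return v == null ? -1 : v;
        } finally {
            lock.unlock();
        }
    }

    public void put(int key, int value) {
        lock.lock();
        try {
            map.put(key, value);        // may trigger eviction under the lock
        } finally {
            lock.unlock();
        }
    }
}
```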

Related Questions

1. Implement an LFU (Least Frequently Used) cache with O(1) operations
2. Use Java's LinkedHashMap to implement an LRU cache: trade-offs and example
3. Design a thread-safe in-memory cache that supports TTL and eviction
4. Design a distributed caching layer (e.g., memcached/Redis style) with LRU eviction
5. Implement get/put with O(1) operations using hash map and linked list: common interview pitfalls
