LFU Cache
Expert Answer & Key Takeaways
# System & Data Structure Design
Design problems in DSA interviews test your ability to translate requirements into a functional, efficient, and maintainable class structure. Unlike standard algorithmic problems, the focus here is on State Management and API Design.
### Core Principles
1. Encapsulation: Keep data private and expose functionality through well-defined methods.
2. Trade-offs: Every design choice has a cost. Is it better to have fast reads and slow writes, or vice versa?
3. State Consistency: Ensure that your internal data structures (e.g., a Map and a List) stay in sync after every operation.
### Common Design Patterns
#### 1. HashMap + Doubly Linked List (DLL)
The "Gold Standard" for caching (LRU/LFU).
```text
[Head] <-> [Node A] <-> [Node B] <-> [Node C] <-> [Tail]
   ^           ^            ^            ^           ^
 (MRU)      (Data)       (Data)       (Data)       (LRU)
```
- HashMap: Provides O(1) lookups from keys to their corresponding nodes.
- DLL: Provides O(1) addition/removal of nodes at both ends, maintaining the order of access.
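To make the pattern concrete, here is a minimal Python sketch of an LRU cache built from a HashMap plus a sentinel-bounded DLL (the `Node` and `LRUCache` names are illustrative, not from the source):

```python
class Node:
    def __init__(self, key, val):
        self.key, self.val = key, val
        self.prev = self.next = None

class LRUCache:
    """HashMap gives O(1) lookup; the DLL gives O(1) reordering/eviction."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}                  # key -> Node
        self.head = Node(None, None)   # sentinel on the MRU side
        self.tail = Node(None, None)   # sentinel on the LRU side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _add_front(self, node):
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)
        self._add_front(node)          # mark as most recently used
        return node.val

    def put(self, key, value):
        if key in self.map:
            self._remove(self.map[key])
        self.map[key] = Node(key, value)
        self._add_front(self.map[key])
        if len(self.map) > self.capacity:
            lru = self.tail.prev       # evict from the LRU end
            self._remove(lru)
            del self.map[lru.key]
```

Both sentinels exist so that `_remove` and `_add_front` never need null checks at the ends of the list.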
#### 2. Amortized Analysis (Rebalancing)
Commonly used in Queue using Stacks or Dynamic Arrays.
- Instead of doing heavy work on every call, we batch it. Pushing to a stack is O(1), and "flipping" elements to another stack happens only when necessary, averaging O(1) amortized per operation.
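A minimal sketch of the classic Queue-using-Stacks example (class name is illustrative): each element is moved at most twice, so the occasional O(N) flip amortizes to O(1) per operation.

```python
class QueueWithStacks:
    """FIFO queue built from two LIFO stacks with amortized O(1) operations."""
    def __init__(self):
        self.inbox = []    # receives pushes
        self.outbox = []   # serves pops in reversed (FIFO) order

    def push(self, x):
        self.inbox.append(x)           # O(1): no rebalancing on write

    def pop(self):
        if not self.outbox:
            # Batch work: flip everything only when the outbox runs dry.
            while self.inbox:
                self.outbox.append(self.inbox.pop())
        return self.outbox.pop()
```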
#### 3. Ring Buffers (Circular Arrays)
Used for fixed-size memory management (e.g., Circular Queue, Hit Counter).
```text
[0]   [1]   [2]   [3]   [4]   [5]
 ^           ^                 ^
Head       (Data)            Tail
(Pops)                   (Next Push)
```
- Use `(index + 1) % capacity` to wrap around the array.
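A short sketch of a fixed-capacity circular queue using the wrap-around trick above (tracking `head` and `size` rather than two pointers, which avoids the full-vs-empty ambiguity; names are illustrative):

```python
class CircularQueue:
    """Fixed-size ring buffer: indices wrap via modulo arithmetic."""
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.head = 0          # index of the next element to pop
        self.size = 0
        self.capacity = capacity

    def enqueue(self, x):
        if self.size == self.capacity:
            return False
        tail = (self.head + self.size) % self.capacity  # wrap around
        self.buf[tail] = x
        self.size += 1
        return True

    def dequeue(self):
        if self.size == 0:
            return None
        x = self.buf[self.head]
        self.head = (self.head + 1) % self.capacity     # wrap around
        self.size -= 1
        return x
```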
#### 4. Concurrency & Thread Safety
For "Hard" design problems (e.g., Bounded Blocking Queue).
- Use Mutexes (Locks) to prevent data races.
- Use Condition Variables (`wait`/`notify`) to manage producer-consumer logic efficiently without busy-waiting.
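As one possible sketch of a Bounded Blocking Queue in Python (class name illustrative), using a shared `threading.Lock` with two `Condition` variables so producers and consumers block instead of spinning:

```python
import threading
from collections import deque

class BoundedBlockingQueue:
    """Producer-consumer queue: wait/notify instead of busy-waiting."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.queue = deque()
        lock = threading.Lock()
        self.not_full = threading.Condition(lock)   # producers wait here
        self.not_empty = threading.Condition(lock)  # consumers wait here

    def enqueue(self, x):
        with self.not_full:
            while len(self.queue) == self.capacity:
                self.not_full.wait()       # sleep until a slot frees up
            self.queue.append(x)
            self.not_empty.notify()        # wake one waiting consumer

    def dequeue(self):
        with self.not_empty:
            while not self.queue:
                self.not_empty.wait()      # sleep until an item arrives
            x = self.queue.popleft()
            self.not_full.notify()         # wake one waiting producer
            return x
```

The `while` (not `if`) around each `wait()` guards against spurious wakeups, a standard requirement when using condition variables.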
### How to Approach a Design Problem
1. Identify the API: What methods do you need to implement? (`get`, `put`, `push`, etc.)
2. Define the State: What variables represent the current state? (Size, Capacity, Pointers).
3. Choose the Data Structures: Select the combination that minimizes time complexity for the most frequent operations.
4. Dry Run: Trace the state changes through a sequence of operations based on your chosen structure.
# LFU Cache
Design a data structure that follows the constraints of a Least Frequently Used (LFU) cache.
### Requirement
- get(key): Return the value if the key exists (otherwise -1), and update its frequency.
- put(key, value): Insert/update the value. If at capacity, evict the least frequently used item. If there's a tie, evict the least recently used.
### Examples
```text
Input:
["LFUCache", "put", "put", "get", "put", "get", "get", "put", "get", "get", "get"]
[[2], [1, 1], [2, 2], [1], [3, 3], [2], [3], [4, 4], [1], [3], [4]]
Output:
[null, null, null, 1, null, -1, 3, null, -1, 3, 4]
```
### Approach 1 (Level I): Brute Force Scan
#### Intuition
Store entries in a List or Map. For every put at capacity, iterate through the entire collection to find the item with the minimum frequency and the oldest access time. This is O(N) per operation but extremely simple.
- ⏱ O(N) for get and put.
- 💾 O(Capacity).
#### Detailed Dry Run
Cache: {(A, f:2, t:1), (B, f:1, t:2)}. Put(C) evicts B because its frequency (1) is lower than A (2).
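A minimal sketch of this brute-force level (class name illustrative), using a logical `tick` counter in place of wall-clock time to break frequency ties by recency:

```python
class LFUCacheBrute:
    """O(N) LFU: scan every entry to find the eviction victim."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}   # key -> [value, freq, last_access_tick]
        self.tick = 0    # logical clock for the LRU tie-breaker

    def get(self, key):
        if key not in self.data:
            return -1
        self.tick += 1
        entry = self.data[key]
        entry[1] += 1            # bump frequency
        entry[2] = self.tick     # refresh recency
        return entry[0]

    def put(self, key, value):
        if self.capacity == 0:
            return
        self.tick += 1
        if key in self.data:
            entry = self.data[key]
            entry[0], entry[1], entry[2] = value, entry[1] + 1, self.tick
            return
        if len(self.data) == self.capacity:
            # O(N) scan: lowest frequency first, then oldest tick.
            victim = min(self.data,
                         key=lambda k: (self.data[k][1], self.data[k][2]))
            del self.data[victim]
        self.data[key] = [value, 1, self.tick]
```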
### Approach 2 (Level II): Priority Queue (O(log N))
#### Intuition
Use a PriorityQueue to store entries sorted by frequency, and then by access time as a tie-breaker. While push and pop are O(log N), it is simpler to implement than the DLL version.
- ⏱ O(log N) for get and put.
- 💾 O(Capacity).
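One way to sketch this level in Python (an assumption on my part, since the source does not say how the heap handles reordering) is lazy deletion: every access pushes a fresh `(freq, tick, key)` snapshot, and stale snapshots are skipped when they surface during eviction.

```python
import heapq

class LFUCacheHeap:
    """O(log N) LFU: min-heap ordered by (freq, tick), stale entries skipped lazily."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}     # key -> (value, freq, tick)  -- the live state
        self.heap = []     # (freq, tick, key) snapshots, possibly stale
        self.tick = 0

    def _touch(self, key, value, freq):
        self.tick += 1
        self.data[key] = (value, freq, self.tick)
        heapq.heappush(self.heap, (freq, self.tick, key))  # O(log N)

    def get(self, key):
        if key not in self.data:
            return -1
        value, freq, _ = self.data[key]
        self._touch(key, value, freq + 1)
        return value

    def put(self, key, value):
        if self.capacity == 0:
            return
        if key in self.data:
            _, freq, _ = self.data[key]
            self._touch(key, value, freq + 1)
            return
        if len(self.data) == self.capacity:
            while True:
                freq, tick, k = heapq.heappop(self.heap)
                # Only snapshots matching the live entry are valid victims.
                if k in self.data and self.data[k][1:] == (freq, tick):
                    del self.data[k]
                    break
        self._touch(key, value, 1)
```

The trade-off of lazy deletion is extra heap memory (one stale snapshot per access), in exchange for never having to search the heap.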
### Approach 3 (Level III): Map Frequency to Doubly Linked List
#### Intuition
Maintain a minFreq variable. Use one map for key -> node and another for freq -> DLL of nodes. When a key is accessed, move it from the count DLL to the count+1 DLL. If the count DLL becomes empty and count == minFreq, increment minFreq. Eviction happens at freqMap[minFreq].tail.prev.
- ⏱ O(1) for all operations.
- 💾 O(Capacity).
Course4All Technical Board
Verified Expert: Senior Software Engineers & Algorithmic Experts
Our DSA content is authored and reviewed by engineers from top tech firms to ensure optimal time and space complexity analysis.