Implementing an LRU Cache in C#: A Step-by-Step Guide

Çağlar Can SARIKAYA
3 min read · Aug 12, 2024


In modern software development, efficient data management is crucial, especially when dealing with resources like memory and processing power. One common technique used to optimize memory usage and improve performance is the implementation of an LRU (Least Recently Used) cache. This article will guide you through creating a simple yet effective LRU Cache in C#, demonstrating how you can manage memory efficiently in your applications.

What is an LRU Cache?

An LRU cache is a caching strategy that discards the least recently used items first. It is particularly useful when you have a limited amount of memory and need to prioritize the retention of more frequently accessed data. The cache keeps track of the order in which items are accessed and evicts the least recently accessed item when it reaches capacity. For example, in a cache of capacity 3 holding A, B, and C, where A has gone untouched the longest, inserting a fourth item D evicts A.

Building the LRU Cache in C#

To build an LRU Cache in C#, we’ll utilize a `Dictionary` for fast lookups and a `LinkedList` to maintain the order of usage. Here’s a breakdown of the core components:

1. Capacity: The maximum number of items the cache can hold.
2. Dictionary: A collection that maps each key to its `LinkedList` node, giving O(1) average-time lookups for get and set operations.
3. LinkedList: A doubly-linked list that stores the items in the order of their usage. The most recently used item is moved to the front, while the least recently used item is at the back.

The LRUCache Class

Let’s dive into the implementation of the `LRUCache` class:


```csharp
using System;
using System.Collections.Generic;

public class LRUCache<TKey, TValue>
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> _cacheMap;
    private readonly LinkedList<KeyValuePair<TKey, TValue>> _lruList;

    public LRUCache(int capacity)
    {
        if (capacity < 1)
            throw new ArgumentOutOfRangeException(nameof(capacity), "Capacity must be at least 1.");

        _capacity = capacity;
        _cacheMap = new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>();
        _lruList = new LinkedList<KeyValuePair<TKey, TValue>>();
    }

    public TValue Get(TKey key)
    {
        if (_cacheMap.TryGetValue(key, out LinkedListNode<KeyValuePair<TKey, TValue>> node))
        {
            // Move the accessed node to the front: it is now the most recently used
            _lruList.Remove(node);
            _lruList.AddFirst(node);
            return node.Value.Value;
        }
        throw new KeyNotFoundException("Key not found");
    }

    public void Set(TKey key, TValue value)
    {
        if (_cacheMap.TryGetValue(key, out LinkedListNode<KeyValuePair<TKey, TValue>> node))
        {
            // Key already exists: unlink its old node; a fresh one is added below
            _lruList.Remove(node);
        }
        else if (_cacheMap.Count >= _capacity)
        {
            // Cache is full: evict the least recently used item (the last node)
            var lru = _lruList.Last;
            _cacheMap.Remove(lru.Value.Key);
            _lruList.RemoveLast();
        }

        var listNode = new LinkedListNode<KeyValuePair<TKey, TValue>>(
            new KeyValuePair<TKey, TValue>(key, value));
        _lruList.AddFirst(listNode); // The newest entry always sits at the front
        _cacheMap[key] = listNode;
    }
}
```

How It Works

- Constructor: The constructor initializes the cache with a given capacity, a dictionary for quick lookups, and a linked list to maintain the usage order.
- Get Method: The `Get` method retrieves the value associated with a key. If the key exists, it moves the corresponding node to the front of the linked list, marking it as recently accessed. If the key does not exist, a `KeyNotFoundException` is thrown (a non-throwing alternative is sketched after this list).
- Set Method: The `Set` method adds or updates a key-value pair. If the key already exists, its node is replaced and moved to the front. If the cache is full, the least recently used item (the last node in the list) is evicted before the new item is added to the front.
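Throwing on every miss can be awkward when misses are expected to be common. As a sketch of one possible extension (this `TryGet` method is an assumption, not part of the class above), the standard .NET try-pattern fits naturally alongside `Get`:

```csharp
// A possible non-throwing companion to Get. This is a sketched extension,
// not part of the original class: it returns false on a miss instead of throwing.
public bool TryGet(TKey key, out TValue value)
{
    if (_cacheMap.TryGetValue(key, out LinkedListNode<KeyValuePair<TKey, TValue>> node))
    {
        _lruList.Remove(node);
        _lruList.AddFirst(node); // A hit still counts as a use
        value = node.Value.Value;
        return true;
    }

    value = default;
    return false;
}
```

Callers can then write `if (cache.TryGet(key, out var value)) { ... }` instead of wrapping every lookup in a try/catch.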

Why Use a LinkedList?

The choice of a `LinkedList` is crucial here. Because the dictionary stores `LinkedListNode` references directly, we can unlink a node and relink it at the front, or remove the last node, in O(1) time; without those node handles, `LinkedList<T>.Remove(T)` would itself have to scan the list. A regular list (`List<T>`) would require O(n) operations to shift elements around, which could lead to performance bottlenecks, especially with large caches.
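Here is a minimal standalone sketch of that node-handle pattern, independent of the cache class, just to illustrate the O(1) moves:

```csharp
using System;
using System.Collections.Generic;

var list = new LinkedList<string>();
LinkedListNode<string> node = list.AddLast("b"); // Keep a handle to the node
list.AddFirst("a");
list.AddLast("c"); // List is now: a, b, c

list.Remove(node);   // O(1): unlink via the stored node reference
list.AddFirst(node); // O(1): relink the same node at the front

Console.WriteLine(string.Join(", ", list)); // Outputs: b, a, c
```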

Usage Example

Here’s how you can use the `LRUCache` in a C# application:

```csharp
var lruCache = new LRUCache<int, string>(3);

lruCache.Set(1, "one");
lruCache.Set(2, "two");
lruCache.Set(3, "three");

// Accessing an existing item marks it as recently used
Console.WriteLine(lruCache.Get(2)); // Outputs: two

// Adding a fourth item exceeds the capacity of 3, so the least recently
// used item (1, "one") is evicted
lruCache.Set(4, "four");

try
{
    Console.WriteLine(lruCache.Get(1)); // Throws: (1, "one") was evicted
}
catch (KeyNotFoundException)
{
    Console.WriteLine("Key 1 not found, as expected.");
}
```
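The example above only ever adds new keys. Setting an existing key also refreshes its position, which is easy to verify with a short continuation of the same example:

```csharp
// The cache currently holds, front to back: 4, 2, 3.
// Overwriting key 3 replaces its value and marks it most recently used.
lruCache.Set(3, "THREE");
Console.WriteLine(lruCache.Get(3)); // Outputs: THREE

// Key 2 is now the least recently used, so adding a new key evicts it.
lruCache.Set(5, "five");
Console.WriteLine(lruCache.Get(4)); // Outputs: four (still cached)
```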

Conclusion

The LRU Cache is a powerful data structure when dealing with limited memory resources. By combining a `Dictionary` and a `LinkedList`, we can efficiently manage cache entries with O(1) average time complexity for both `Get` and `Set` operations. This implementation can be particularly useful in scenarios where data is frequently accessed but needs to be kept within a manageable size, such as in web caching, session management, or memory-sensitive applications.
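One caveat for scenarios like web caching and session management: the implementation above is not thread-safe, since concurrent calls would mutate the dictionary and linked list at the same time. A minimal sketch of one option is a coarse-grained lock around the cache (the `SynchronizedLRUCache` wrapper below is an illustration, not part of the original class):

```csharp
// A sketched thread-safe wrapper using a single lock. The name and shape
// are assumptions for illustration; finer-grained approaches also exist.
public class SynchronizedLRUCache<TKey, TValue>
{
    private readonly LRUCache<TKey, TValue> _inner;
    private readonly object _gate = new object();

    public SynchronizedLRUCache(int capacity)
    {
        _inner = new LRUCache<TKey, TValue>(capacity);
    }

    public TValue Get(TKey key)
    {
        lock (_gate) { return _inner.Get(key); }
    }

    public void Set(TKey key, TValue value)
    {
        lock (_gate) { _inner.Set(key, value); }
    }
}
```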

Understanding and implementing such data structures not only enhances your coding skills but also prepares you for solving complex problems in software development. The LRU Cache is a prime example of how efficient data management techniques can lead to significant performance improvements in real-world applications.

Feel free to integrate this LRU Cache into your projects and modify it to suit your specific needs!
