LRU Redux

JavaScript · React · Redux · Cache Library · LRU · State Management

GitHub Overview

SamSaffron/lru_redux

An efficient, optionally thread-safe LRU cache

  • Stars: 286
  • Watchers: 4
  • Forks: 20
  • Created: April 23, 2013
  • Language: Ruby
  • License: MIT License

Topics

None

Star History

[Star history chart for SamSaffron/lru_redux; data as of October 22, 2025]


Overview

LRU Redux is a library concept that applies LRU (Least Recently Used) caching strategies to Redux applications.

Details

LRU Redux represents a design concept and library ecosystem that integrates LRU (Least Recently Used) cache strategies into Redux state management. In Redux applications, it caches API responses, computation results, and component state, reducing unnecessary API calls and re-renders. Combined with the Reselect library's memoized selectors, it lowers the cost of computing derived data from the Redux store. The LRU algorithm bounds memory usage by keeping recently accessed entries and automatically evicting the least recently used ones.

Implemented as Redux middleware, it can cache transparently during action dispatch and supports TTL (Time To Live) expiration and cache invalidation strategies. In React applications, this helps avoid redundant component re-renders and duplicate API fetches, improving user experience. TypeScript support enables type-safe cache implementations, and integration with Redux DevTools makes cache state visible, so debugging and monitoring cache behavior stay transparent for developers.
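Before the Redux-specific examples below, the core LRU mechanism itself can be sketched in plain JavaScript using a Map's insertion order to track recency. `TinyLRU` is an illustrative name for this sketch, not part of any library; the examples that follow use the `lru-cache` npm package instead.

```javascript
// Minimal LRU sketch: a Map keeps insertion order, so the first key
// is always the least recently used entry.
class TinyLRU {
  constructor(max) {
    this.max = max
    this.map = new Map()
  }

  get(key) {
    if (!this.map.has(key)) return undefined
    // Re-insert the entry to mark it as most recently used
    const value = this.map.get(key)
    this.map.delete(key)
    this.map.set(key, value)
    return value
  }

  set(key, value) {
    if (this.map.has(key)) this.map.delete(key)
    this.map.set(key, value)
    // Evict the least recently used entry (first in insertion order)
    if (this.map.size > this.max) {
      this.map.delete(this.map.keys().next().value)
    }
  }
}

const cache = new TinyLRU(2)
cache.set('a', 1)
cache.set('b', 2)
cache.get('a')    // touch 'a', so 'b' becomes least recently used
cache.set('c', 3) // evicts 'b'
```

Libraries such as `lru-cache` add TTL expiration, size accounting, and eviction callbacks on top of this basic recency-ordering idea.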

Pros and Cons

Pros

  • Performance Improvement: Reduces unnecessary API calls and re-renders
  • Redux Integration: Natural integration with existing Redux architecture
  • Memory Efficiency: Automatic memory management via LRU algorithm
  • Developer Experience: Transparent caching functionality
  • Flexible Configuration: Customizable TTL, size limits, and invalidation strategies
  • TypeScript Support: Type-safe cache implementation
  • Debug Support: Cache state visualization with Redux DevTools

Cons

  • Complexity: Adds a cache layer on top of state management, increasing overall complexity
  • Memory Usage: Additional memory consumption from in-memory cache
  • Debug Difficulty: Cache-related bugs are hard to discover and fix
  • Consistency Challenges: Complex data consistency management
  • Learning Curve: Requires understanding both Redux and cache concepts
  • Over-Engineering: May be excessive for small-scale applications


Code Examples

Implementation as Redux Middleware

import { LRUCache } from 'lru-cache'
import { configureStore } from '@reduxjs/toolkit'

// LRU cache configuration
const cache = new LRUCache({
  max: 1000,
  ttl: 1000 * 60 * 5 // 5 minutes
})

// Cache middleware
const cacheMiddleware = (store) => (next) => (action) => {
  // Determine if action should be cached
  if (action.type.endsWith('/fulfilled') && action.meta?.requestId) {
    const cacheKey = `${action.type}-${JSON.stringify(action.meta.arg)}`
    cache.set(cacheKey, action.payload)
  }

  // Try to serve pending thunk actions from cache
  if (action.type.endsWith('/pending') && action.meta?.arg !== undefined) {
    const cacheKey = `${action.type.replace('/pending', '/fulfilled')}-${JSON.stringify(action.meta.arg)}`
    const cachedData = cache.get(cacheKey)

    if (cachedData !== undefined) {
      // On cache hit, dispatch the fulfilled action directly
      return next({
        type: action.type.replace('/pending', '/fulfilled'),
        payload: cachedData,
        meta: { ...action.meta, cached: true }
      })
    }
  }

  return next(action)
}

// Redux store configuration
const store = configureStore({
  reducer: {
    api: apiSlice.reducer
  },
  middleware: (getDefaultMiddleware) =>
    getDefaultMiddleware().concat(cacheMiddleware)
})

Cache Management in Redux Slice

import { createSlice, createAsyncThunk } from '@reduxjs/toolkit'
import { LRUCache } from 'lru-cache'

// Global cache
const dataCache = new LRUCache({
  max: 500,
  ttl: 1000 * 60 * 10 // 10 minutes
})

// Async action with cache integration
export const fetchUserData = createAsyncThunk(
  'users/fetchById',
  async (userId, { rejectWithValue }) => {
    const cacheKey = `user-${userId}`
    
    // Check cache
    const cached = dataCache.get(cacheKey)
    if (cached) {
      return { ...cached, fromCache: true }
    }

    try {
      const response = await fetch(`/api/users/${userId}`)
      if (!response.ok) throw new Error(`HTTP ${response.status}`)
      const userData = await response.json()
      
      // Save to cache
      dataCache.set(cacheKey, userData)
      return userData
    } catch (error) {
      return rejectWithValue(error.message)
    }
  }
)

// Redux slice
const usersSlice = createSlice({
  name: 'users',
  initialState: {
    entities: {},
    loading: false,
    error: null,
    cacheStats: {
      hits: 0,
      misses: 0,
      size: 0
    }
  },
  reducers: {
    // NOTE: touching an external cache inside a reducer is a side effect
    // that breaks reducer purity; in production, prefer doing this in
    // middleware or a thunk
    clearUserCache: (state, action) => {
      if (action.payload) {
        dataCache.delete(`user-${action.payload}`)
      } else {
        dataCache.clear()
      }
    },
    updateCacheStats: (state) => {
      // Merge size info without discarding the hit/miss counters
      state.cacheStats.size = dataCache.size
      state.cacheStats.calculatedSize = dataCache.calculatedSize
    }
  },
  extraReducers: (builder) => {
    builder
      .addCase(fetchUserData.pending, (state) => {
        state.loading = true
      })
      .addCase(fetchUserData.fulfilled, (state, action) => {
        state.entities[action.meta.arg] = action.payload
        state.loading = false

        // Update cache statistics
        if (action.payload.fromCache) {
          state.cacheStats.hits++
        } else {
          state.cacheStats.misses++
        }
      })
      .addCase(fetchUserData.rejected, (state, action) => {
        state.loading = false
        state.error = action.payload
      })
  }
})

export const { clearUserCache, updateCacheStats } = usersSlice.actions
export default usersSlice.reducer

Combining Reselect with LRU Cache

import { createSelector } from 'reselect'
import { LRUCache } from 'lru-cache'

// Computation results cache
const computationCache = new LRUCache({
  max: 200,
  ttl: 1000 * 60 * 30 // 30 minutes
})

// Base selectors
const selectUsers = (state) => state.users.entities
const selectProducts = (state) => state.products.entities
const selectOrders = (state) => state.orders.entities

// Complex computation selector with cache
export const selectUserOrderSummary = createSelector(
  [selectUsers, selectProducts, selectOrders],
  (users, products, orders) => {
    // NOTE: keying on entity counts is a coarse heuristic; entries can
    // change without the counts changing, so a content-based key would
    // be safer in practice
    const cacheKey = `summary-${JSON.stringify({
      userCount: Object.keys(users).length,
      productCount: Object.keys(products).length,
      orderCount: Object.keys(orders).length
    })}`
    
    // Check cache
    const cached = computationCache.get(cacheKey)
    if (cached) {
      return cached
    }

    // Heavy computation
    const summary = Object.values(orders).reduce((acc, order) => {
      const user = users[order.userId]
      const orderTotal = order.items.reduce((total, item) => {
        const product = products[item.productId]
        return total + (product?.price || 0) * item.quantity
      }, 0)

      acc.totalRevenue += orderTotal
      acc.ordersByUser[user?.name || 'Unknown'] = 
        (acc.ordersByUser[user?.name || 'Unknown'] || 0) + 1

      return acc
    }, {
      totalRevenue: 0,
      ordersByUser: {},
      computedAt: Date.now()
    })

    // Save to cache
    computationCache.set(cacheKey, summary)
    return summary
  }
)

Usage in React Components

import React, { useEffect } from 'react'
import { useSelector, useDispatch } from 'react-redux'
import { fetchUserData, clearUserCache, updateCacheStats } from './userSlice'

// Cache statistics display component
const CacheStats = () => {
  const cacheStats = useSelector(state => state.users.cacheStats)
  const dispatch = useDispatch()

  useEffect(() => {
    const interval = setInterval(() => {
      dispatch(updateCacheStats())
    }, 5000) // Update every 5 seconds

    return () => clearInterval(interval)
  }, [dispatch])

  return (
    <div className="cache-stats">
      <h3>Cache Statistics</h3>
      <p>Hits: {cacheStats.hits}</p>
      <p>Misses: {cacheStats.misses}</p>
      <p>Cache Size: {cacheStats.size}</p>
      <button onClick={() => dispatch(clearUserCache())}>
        Clear Cache
      </button>
    </div>
  )
}

// User data display component
const UserProfile = ({ userId }) => {
  const dispatch = useDispatch()
  const user = useSelector(state => state.users.entities[userId])
  const loading = useSelector(state => state.users.loading)

  useEffect(() => {
    if (!user) {
      dispatch(fetchUserData(userId))
    }
  }, [dispatch, userId, user])

  if (loading) return <div>Loading...</div>
  if (!user) return <div>User not found</div>

  return (
    <div className="user-profile">
      <h2>{user.name}</h2>
      <p>Email: {user.email}</p>
      {user.fromCache && (
        <span className="cache-indicator">From Cache</span>
      )}
    </div>
  )
}

Integration with RTK Query

import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react'
import { LRUCache } from 'lru-cache'

// Custom cache implementation
const customCache = new LRUCache({
  max: 1000,
  ttl: 1000 * 60 * 15 // 15 minutes
})

// RTK Query API definition
export const apiSlice = createApi({
  reducerPath: 'api',
  baseQuery: fetchBaseQuery({
    baseUrl: '/api/',
    // Add custom cache logic (the `arg` field requires RTK >= 1.9)
    prepareHeaders: (headers, { endpoint, arg }) => {
      const cacheKey = `${endpoint}-${JSON.stringify(arg)}`
      const cached = customCache.get(cacheKey)
      
      if (cached) {
        headers.set('X-From-Cache', 'true')
      }
      
      return headers
    }
  }),
  tagTypes: ['User', 'Post'],
  endpoints: (builder) => ({
    getUser: builder.query({
      query: (id) => `users/${id}`,
      providesTags: ['User'],
      // Custom cache integration
      transformResponse: (response, meta, arg) => {
        const cacheKey = `getUser-${arg}`
        customCache.set(cacheKey, response)
        return response
      }
    }),
    getPosts: builder.query({
      query: () => 'posts',
      providesTags: ['Post']
      // Note: invalidatesTags belongs on mutation endpoints; a mutation
      // (e.g. addPost) would declare invalidatesTags: ['Post'] so that
      // this query refetches after it succeeds
    })
  })
})

export const { useGetUserQuery, useGetPostsQuery } = apiSlice

Advanced Cache Strategy

import { LRUCache } from 'lru-cache'

// Multi-level cache system
class MultiLevelCache {
  constructor() {
    // L1: Fast access (small capacity)
    this.l1Cache = new LRUCache({ max: 100, ttl: 1000 * 60 })
    
    // L2: Normal access (medium capacity)
    this.l2Cache = new LRUCache({ max: 1000, ttl: 1000 * 60 * 10 })
    
    // L3: Long-term storage (large capacity)
    this.l3Cache = new LRUCache({ max: 5000, ttl: 1000 * 60 * 60 })
  }

  get(key) {
    // Check L1 first (compare against undefined so falsy values still hit)
    let value = this.l1Cache.get(key)
    if (value !== undefined) {
      return { value, level: 'L1' }
    }

    // Check L2
    value = this.l2Cache.get(key)
    if (value !== undefined) {
      // Promote to L1
      this.l1Cache.set(key, value)
      return { value, level: 'L2' }
    }

    // Check L3
    value = this.l3Cache.get(key)
    if (value !== undefined) {
      // Promote to L2
      this.l2Cache.set(key, value)
      return { value, level: 'L3' }
    }

    return null
  }

  set(key, value, priority = 'normal') {
    switch (priority) {
      case 'high':
        this.l1Cache.set(key, value)
        this.l2Cache.set(key, value)
        break
      case 'normal':
        this.l2Cache.set(key, value)
        break
      case 'low':
        this.l3Cache.set(key, value)
        break
    }
  }

  getStats() {
    return {
      l1: { size: this.l1Cache.size },
      l2: { size: this.l2Cache.size },
      l3: { size: this.l3Cache.size }
    }
  }
}

// Use in Redux middleware
const multiLevelCache = new MultiLevelCache()

const advancedCacheMiddleware = (store) => (next) => (action) => {
  // Priority-based cache strategy
  if (action.type.includes('user/fetch') && action.meta?.arg?.userId) {
    const cached = multiLevelCache.get(action.meta.arg.userId)
    if (cached) {
      console.log(`Cache hit from ${cached.level}`)
      // Modify the action with cached data here if desired
    }
  }
  }

  const result = next(action)

  // Response priority determination and caching
  if (action.type.endsWith('/fulfilled') && action.meta?.arg !== undefined) {
    const priority = action.meta.critical ? 'high' : 'normal'
    multiLevelCache.set(action.meta.arg, action.payload, priority)
  }

  return result
}