
Caching Infrastructure

Overview

Redis-backed HTTP response caching to improve API performance and reduce database load.

Architecture

flowchart LR
A[Request] --> B[Cache Interceptor]
B --> C{Check Redis}
C -->|Hit| D[Return cached]
C -->|Miss| E[Fetch fresh]
E --> F[Store in Redis]
F --> D
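The flow above is a standard cache-aside pattern. A minimal sketch in TypeScript, with a plain Map standing in for Redis (the real interceptor goes through cache-manager; `cachedFetch` and its types are illustrative, not part of the codebase):

```typescript
// Minimal cache-aside sketch. A Map stands in for Redis; in the real
// interceptor, get/set go through cache-manager's Redis store.
type Fetcher<T> = () => Promise<T>;

const store = new Map<string, { value: unknown; expiresAt: number }>();

async function cachedFetch<T>(
  key: string,
  ttlMs: number,
  fetchFresh: Fetcher<T>,
): Promise<{ value: T; cache: 'HIT' | 'MISS' }> {
  const entry = store.get(key);
  if (entry && entry.expiresAt > Date.now()) {
    // Hit: return the cached value without touching the database
    return { value: entry.value as T, cache: 'HIT' };
  }
  // Miss: fetch fresh, store with an expiry timestamp, then return
  const value = await fetchFresh();
  store.set(key, { value, expiresAt: Date.now() + ttlMs });
  return { value, cache: 'MISS' };
}
```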

Stack

  • cache-manager v7 + @keyv/redis — Keyv-backed Redis store
  • Replaced the older cache-manager-redis-yet adapter in Feb 2026

Components

  • CacheModule: Global Redis cache configuration
  • HttpCacheInterceptor: Base interceptor for caching GET requests
  • ShortCacheInterceptor: 30-second TTL for frequently changing data
  • MediumCacheInterceptor: 1-minute TTL for semi-static data

Configuration

Environment variables in .env:

REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD= # Optional
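One possible wiring of these variables into the global CacheModule, assuming `@nestjs/cache-manager` v3 on cache-manager v7 with `createKeyv` from `@keyv/redis` (module name and structure here are illustrative, not the project's actual file):

```typescript
// Illustrative sketch: global cache module wired to Redis via Keyv.
// Assumes @nestjs/cache-manager and @keyv/redis are installed.
import { Module } from '@nestjs/common';
import { CacheModule } from '@nestjs/cache-manager';
import { createKeyv } from '@keyv/redis';

@Module({
  imports: [
    CacheModule.registerAsync({
      isGlobal: true,
      useFactory: async () => {
        const host = process.env.REDIS_HOST ?? 'localhost';
        const port = process.env.REDIS_PORT ?? '6379';
        const password = process.env.REDIS_PASSWORD;
        const auth = password ? `:${password}@` : '';
        return {
          stores: [createKeyv(`redis://${auth}${host}:${port}`)],
          ttl: 5 * 60 * 1000, // 5-minute default, matching the interceptor fallback below
        };
      },
    }),
  ],
})
export class AppCacheModule {}
```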

Usage

Basic Caching

@Controller('items')
export class ItemController {
  @Get()
  @UseInterceptors(MediumCacheInterceptor)
  findAll() {
    return this.itemService.findAll();
  }
}

Cache Control

Bypass cache with header:

curl -H "X-No-Cache: true" http://localhost:3001/api/v1/guilds

Check cache hit/miss:

# Response headers include:
X-Cache: HIT # Served from cache
X-Cache: MISS # Fetched from database
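The bypass/hit/miss decision can be expressed as a pure function. The header names match this doc (X-No-Cache, X-Cache), but the function itself is a sketch, not the actual HttpCacheInterceptor implementation:

```typescript
// Sketch of the interceptor's decision logic as a pure function.
// Header keys are assumed lowercased, as Express normalizes them.
type Decision = { bypass: boolean; xCache?: 'HIT' | 'MISS' };

function decide(
  headers: Record<string, string | undefined>,
  cachedValue: unknown,
): Decision {
  // X-No-Cache: true skips the lookup entirely and always fetches fresh
  if ((headers['x-no-cache'] ?? '').toLowerCase() === 'true') {
    return { bypass: true };
  }
  // Otherwise report HIT when a cached value exists, MISS when it doesn't
  return { bypass: false, xCache: cachedValue !== undefined ? 'HIT' : 'MISS' };
}
```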

Cache Invalidation

Currently, caches auto-expire based on TTL. For manual invalidation, inject the cache manager:

@Inject(CACHE_MANAGER) private cacheManager: Cache;

// Clear a specific key (note the user segment required by the key format)
await this.cacheManager.del('http:user-123:/guilds/123');

// Clear all caches (cache-manager v6+ replaced reset() with clear())
await this.cacheManager.clear();

Performance Impact

Before caching:

  • Guild roster query: ~150-300ms (complex joins)
  • Character list: ~50-100ms

After caching:

  • Cached responses: ~5-15ms (90-95% reduction)
  • Cache miss overhead: ~2-5ms

Cache Keys

Format: http:{userId}:{request.url}

The cache key includes the user ID to prevent cross-user data pollution. This ensures:

  • Guest users don't receive cached authenticated responses
  • Authenticated users don't receive cached guest responses
  • Each user's permissions are correctly reflected in cached data

Examples:

  • http:guest:/guilds
  • http:user-123:/guilds/456?page=1&pageSize=50
  • http:user-789:/characters?includeInactive=true

Implementation:

// Cache key generation in HttpCacheInterceptor
const userId = request.user?.id || 'guest';
const cacheKey = `http:${userId}:${request.url}`;
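The same key scheme as a standalone function, runnable without NestJS (`RequestLike` mirrors the Express request shape the interceptor sees; it is not a type from the codebase):

```typescript
// Standalone version of the key scheme above.
// 'guest' is the fallback for unauthenticated requests.
interface RequestLike {
  user?: { id: string };
  url: string; // path + query string, e.g. '/guilds/456?page=1'
}

function buildCacheKey(request: RequestLike): string {
  const userId = request.user?.id ?? 'guest';
  return `http:${userId}:${request.url}`;
}
```

Because the query string is part of `request.url`, different pages and filter combinations are cached separately.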

Best Practices

  1. Choose appropriate TTL:

    • 30s: Guild rosters (frequently synced)
    • 1min: Character lists, templates
    • 5min: Static data (default)
  2. Don't cache:

    • POST/PUT/PATCH/DELETE requests
    • User-specific sensitive data
    • Data with real-time freshness requirements
  3. Monitor cache effectiveness:

    • Check X-Cache headers in responses
    • Monitor Redis memory usage
    • Track cache hit/miss ratios
  4. Frontend cache management:

    • Clear React Query cache on login/logout
    • Use queryClient.clear() to force fresh data
    • Prevents showing stale permission data after auth state changes

Troubleshooting

Cache not working

  1. Check Redis is running: redis-cli ping
  2. Verify env vars are set
  3. Check X-Cache header (should be HIT/MISS)

Stale data

  • Reduce TTL for that endpoint
  • Use X-No-Cache header to force refresh
  • Implement cache invalidation on mutations
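Invalidation on mutations can be sketched as a match-and-delete over cached keys. A Map stands in for the store here; against cache-manager you would enumerate keys via the underlying Keyv store instead, and `invalidateMatching` is a hypothetical helper, not an existing API:

```typescript
// Sketch: after a mutation, drop every cached response whose key
// references the mutated resource. Matching on a path fragment (rather
// than a prefix) catches the same path cached under different user IDs.
const cache = new Map<string, unknown>();

function invalidateMatching(pathFragment: string): number {
  let removed = 0;
  for (const key of [...cache.keys()]) {
    if (key.includes(pathFragment)) {
      cache.delete(key);
      removed++;
    }
  }
  return removed;
}
```

For example, after updating guild 123, `invalidateMatching('/guilds/123')` would remove that guild's cached responses for every user.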

Redis connection errors

Error: Redis connection failed

Solution:

# Start Redis
brew services start redis

# Or via Docker
docker run -d -p 6379:6379 redis:alpine