Data Caching

The cache function allows you to cache the results of data fetching or computation. Compared with a full-page rendering cache, it provides more fine-grained, data-level control and applies to various scenarios such as Client-Side Rendering (CSR), Server-Side Rendering (SSR), and API services (BFF).

INFO

Version X.65.5 or above is required.

Basic Usage

NOTE

If you use the cache function in BFF, you should import the cache function from @modern-js/server-runtime/cache:

import { cache } from '@modern-js/server-runtime/cache'

import { cache } from '@modern-js/runtime/cache';
import { fetchUserData } from './api';

const getUser = cache(fetchUserData);

const loader = async () => {
  const user = await getUser(userId); // When the function's parameters change, the function is re-executed
  return {
    user,
  };
};

Parameters

  • fn: The data fetching or computation function to be cached
  • options (optional): Cache configuration
    • tag: Tag to identify the cache, which can be used to invalidate the cache
    • maxAge: Cache validity period (milliseconds)
    • revalidate: Time window for revalidating the cache (milliseconds), similar to HTTP Cache-Control's stale-while-revalidate functionality
    • getKey: Simplified cache key generation function based on function parameters
    • customKey: Custom cache key function
    • onCache: Callback invoked on every cache operation (hit, miss, or stale); see the onCache Parameter section below

The type of the options parameter is as follows:

interface CacheOptions {
  tag?: string | string[];
  maxAge?: number;
  revalidate?: number;
  getKey?: <Args extends any[]>(...args: Args) => string;
  customKey?: <Args extends any[]>(options: {
    params: Args;
    fn: (...args: Args) => any;
    generatedKey: string;
  }) => string | symbol;
  onCache?: (info: {
    status: 'hit' | 'miss' | 'stale';
    key: string | symbol;
    params: any[];
    result: any;
  }) => void;
}

Return Value

The cache function returns a new function with caching capabilities. Calling it multiple times with the same parameters will not re-execute the fn function while the cache remains valid.
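
For example, a minimal sketch (assuming fetchUserData accepts a user id; within the same cache lifecycle, repeated calls with identical parameters return the cached result):

import { cache } from '@modern-js/runtime/cache';
import { fetchUserData } from './api';

const getUser = cache(fetchUserData);

const a = await getUser('42'); // executes fetchUserData
const b = await getUser('42'); // same parameters: returns the cached result, fetchUserData is not executed again
const c = await getUser('7');  // different parameters: fetchUserData is executed again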

Usage Scope

Unlike React's cache function, which can only be used in Server Components, Modern.js's cache function can be used in any frontend or server-side code.
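
For instance, the same API also works in browser-side code. The sketch below caches a search request on the client; searchProducts and onSearch are hypothetical names used only for illustration:

import { cache, CacheTime } from '@modern-js/runtime/cache';
import { searchProducts } from './api'; // hypothetical client-side request helper

const searchWithCache = cache(searchProducts, {
  maxAge: CacheTime.MINUTE,
  getKey: keyword => keyword,
});

// Called from a component or event handler in the browser:
// repeated searches for the same keyword within one minute hit the cache.
export const onSearch = (keyword: string) => searchWithCache(keyword);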

Detailed Usage

Without options Parameter

When no options parameter is provided, the cache function is primarily useful in SSR projects: the cache lifecycle is limited to a single SSR rendering request. For example, when the same cachedFn is called in multiple data loaders, the cachedFn function won't be executed repeatedly. This allows data to be shared between different data loaders while avoiding duplicate requests. Modern.js re-executes the fn function on each new server request.

INFO

Without the options parameter, it can be considered a replacement for React's cache function and can be used in any server-side code (such as in data loaders of SSR projects), not limited to server components.

import { cache } from '@modern-js/runtime/cache';
import { fetchUserData } from './api';

const getUser = cache(fetchUserData);

const loader = async () => {
  const user = await getUser();
  return {
    user,
  };
};

With options Parameter

maxAge Parameter

After each computation, the framework records the time when the cache is written. When the function is called again, it checks if the cache has expired based on the maxAge parameter. If expired, the fn function is re-executed; otherwise, the cached data is returned.

import { cache, CacheTime } from '@modern-js/runtime/cache';

const getDashboardStats = cache(
  async () => {
    return await fetchComplexStatistics();
  },
  {
    maxAge: CacheTime.MINUTE * 2,  // Calling this function within 2 minutes will return cached data
  }
);

revalidate Parameter

The revalidate parameter sets a time window for revalidating the cache after it expires. It can be used together with the maxAge parameter, similar to HTTP Cache-Control's stale-while-revalidate mode.

In the following example, calls to getDashboardStats return cached data while the cache is still fresh (within 2 minutes of being written). Once the cache has expired but is still within the revalidation window (between 2 and 3 minutes), requests first return the stale data, then refresh it in the background and update the cache.

import { cache, CacheTime } from '@modern-js/runtime/cache';

const getDashboardStats = cache(
  async () => {
    return await fetchComplexStatistics();
  },
  {
    maxAge: CacheTime.MINUTE * 2,
    revalidate: CacheTime.MINUTE * 1,
  }
);

tag Parameter

The tag parameter identifies the cache with a tag, which can be a string or an array of strings. You can invalidate caches based on this tag, and multiple cache functions can use the same tag.

import { cache, revalidateTag } from '@modern-js/runtime/cache';

const getDashboardStats = cache(
  async () => {
    return await fetchDashboardStats();
  },
  {
    tag: 'dashboard',
  }
);

const getComplexStatistics = cache(
  async () => {
    return await fetchComplexStatistics();
  },
  {
    tag: 'dashboard',
  }
);

await revalidateTag('dashboard'); // Invalidates the cache for both getDashboardStats and getComplexStatistics

getKey Parameter

The getKey parameter simplifies cache key generation, especially useful when you only need to rely on part of the function parameters to differentiate caches. It's a function that receives the same parameters as the original function and returns a string.

Its return value becomes part of the final cache key; it is still combined with a unique function identifier, so the cache remains scoped to the function.

import { cache, CacheTime } from '@modern-js/runtime/cache';
import { fetchUserData } from './api';

const getUser = cache(
  async (userId, options) => {
    // Here options might contain many configurations, but we only want to cache based on userId
    return await fetchUserData(userId, options);
  },
  {
    maxAge: CacheTime.MINUTE * 5,
    // Only use the first parameter (userId) as the cache key
    getKey: (userId, options) => userId,
  }
);

// The following two calls will share the cache because getKey only uses userId
await getUser(123, { language: 'en' });
await getUser(123, { language: 'fr' }); // Cache hit, won't request again

// Different userId will use different cache
await getUser(456, { language: 'en' }); // Won't hit cache, will request again

You can also use Modern.js's generateKey function together with getKey to generate the cache key:

INFO

The generateKey function in Modern.js ensures that a consistent and unique key is generated even if object property orders change, guaranteeing stable caching.

import { cache, CacheTime, generateKey } from '@modern-js/runtime/cache';
import { fetchUserData } from './api';

const getUser = cache(
  async (userId, options) => {
    return await fetchUserData(userId, options);
  },
  {
    maxAge: CacheTime.MINUTE * 5,
    getKey: (userId, options) => generateKey(userId),
  }
);

Additionally, getKey can also return a numeric type as a cache key:

const getUserById = cache(
  fetchUserDataById,
  {
    maxAge: CacheTime.MINUTE * 5,
    // Directly use the numeric ID as the cache key
    getKey: (id) => id,
  }
);

await getUserById(42); // Uses 42 as the cache key

customKey parameter

The customKey parameter is used to fully customize the cache key. It is a function that receives an object with the following properties and returns a string (or symbol) to be used as the cache key:

  • params: Array of arguments passed to the cached function
  • fn: Reference to the original function being cached
  • generatedKey: Cache key automatically generated by the framework based on the input parameters

Its return value directly becomes the final cache key, overriding the default combination of the function identifier and the parameter-based key. This allows you to create globally unique keys and share cache across different functions.

INFO

Generally, the cache will be invalidated in the following scenarios:

  1. The function's input parameters change
  2. The maxAge condition is no longer satisfied
  3. The revalidateTag method has been called

By default, the framework generates a stable function ID based on the function's string representation and combines it with the generated parameter key. Use customKey when you need complete control over cache key generation; if you only need to customize how parameters are converted into cache keys, getKey is recommended.

customKey is particularly useful when you want to share the cache across different function instances, or when you need predictable cache keys for external cache management.

import { cache, CacheTime } from '@modern-js/runtime/cache';
import { fetchUserData } from './api';

// Different function references, but share the same cache via customKey
const getUserA = cache(
  fetchUserData,
  {
    maxAge: CacheTime.MINUTE * 5,
    customKey: ({ params }) => {
      // Return a stable string as the cache key
      return `user-${params[0]}`;
    },
  }
);

// Even if the function reference changes,
// as long as customKey returns the same value, the cache will be hit
const getUserB = cache(
  (...args) => fetchUserData(...args), // New function reference
  {
    maxAge: CacheTime.MINUTE * 5,
    customKey: ({ params }) => {
      // Return the same key as getUserA
      return `user-${params[0]}`;
    },
  }
);

// Now you can share cache across different function implementations
await getUserA(123); // Fetches data and caches with key "user-123"
await getUserB(123); // Cache hit, returns cached data

// You can utilize the generatedKey parameter to modify the default key
const getUserC = cache(
  fetchUserData,
  {
    customKey: ({ generatedKey }) => `prefix-${generatedKey}`,
  }
);

// For predictable cache keys that can be managed externally
const getUserD = cache(
  async (userId: string) => {
    return await fetchUserData(userId);
  },
  {
    maxAge: CacheTime.MINUTE * 5,
    customKey: ({ params }) => `app:user:${params[0]}`,
  }
);

onCache Parameter

The onCache parameter allows you to track cache statistics such as hit rate. It's a callback function that receives information about each cache operation, including the status, key, parameters, and result.

import { cache, CacheTime } from '@modern-js/runtime/cache';
import { fetchUserData } from './api';

// Track cache statistics
const stats = {
  total: 0,
  hits: 0,
  misses: 0,
  stales: 0,
  hitRate: () => stats.hits / stats.total
};

const getUser = cache(
  fetchUserData,
  {
    maxAge: CacheTime.MINUTE * 5,
    onCache({ status, key, params, result }) {
      // status can be 'hit', 'miss', or 'stale'
      stats.total++;

      if (status === 'hit') {
        stats.hits++;
      } else if (status === 'miss') {
        stats.misses++;
      } else if (status === 'stale') {
        stats.stales++;
      }

      console.log(`Cache ${status} for key: ${String(key)}`);
      console.log(`Current hit rate: ${stats.hitRate() * 100}%`);
    }
  }
);

// Usage example
await getUser(1); // Cache miss
await getUser(1); // Cache hit
await getUser(2); // Cache miss

The onCache callback receives an object with the following properties:

  • status: The cache operation status, which can be:
    • hit: Cache hit, returning cached content
    • miss: Cache miss, executing the function and caching the result
    • stale: Cache hit but data is stale, returning cached content while revalidating in the background
  • key: The cache key, which is either the result of customKey or the default generated key
  • params: The parameters passed to the cached function
  • result: The result data (either from cache or newly computed)

This callback is only invoked when the options parameter is provided. When using the cache function without options, the onCache callback is not called.

The onCache callback is useful for:

  • Monitoring cache performance
  • Calculating hit rates
  • Logging cache operations
  • Implementing custom metrics

Storage

Default Storage

Currently, both client and server caches are stored in memory. The default storage limit for all cached functions is 1GB. When this limit is reached, the least recently used entries are evicted (LRU).

INFO

Since the results cached by the cache function are usually not large, they are stored in memory by default.

You can specify the storage limit using the configureCache function:

import { configureCache, CacheSize } from '@modern-js/runtime/cache';

configureCache({
  maxSize: CacheSize.MB * 10, // 10MB
});

Custom Storage Container

In addition to the default memory storage, you can use custom storage containers such as Redis, file systems, databases, etc. This enables cache sharing across processes and servers.

Container Interface

Custom storage containers need to implement the Container interface:

interface Container {
  get: (key: string) => Promise<string | undefined | null>;
  set: (key: string, value: string, options?: { ttl?: number }) => Promise<any>;
  has: (key: string) => Promise<boolean>;
  delete: (key: string) => Promise<boolean>;
  clear: () => Promise<void>;
}

Basic Usage

import { configureCache } from '@modern-js/runtime/cache';

// Use custom storage container
configureCache({
  container: customContainer,
});
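
For illustration, here is a minimal in-memory container that satisfies the interface. It is only a sketch: the MemoryContainer class and its TTL handling are illustrative, not part of the framework.

import { configureCache, Container } from '@modern-js/runtime/cache';

// A hypothetical Map-backed container. TTL (in seconds) is honored by
// recording an expiry timestamp for each entry.
class MemoryContainer implements Container {
  private store = new Map<string, { value: string; expiresAt?: number }>();

  async get(key: string): Promise<string | undefined> {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (entry.expiresAt !== undefined && Date.now() > entry.expiresAt) {
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }

  async set(key: string, value: string, options?: { ttl?: number }): Promise<void> {
    this.store.set(key, {
      value,
      expiresAt: options?.ttl ? Date.now() + options.ttl * 1000 : undefined,
    });
  }

  async has(key: string): Promise<boolean> {
    return (await this.get(key)) !== undefined;
  }

  async delete(key: string): Promise<boolean> {
    return this.store.delete(key);
  }

  async clear(): Promise<void> {
    this.store.clear();
  }
}

configureCache({
  container: new MemoryContainer(),
});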

Using customKey to Ensure Cache Key Stability

NOTE

When using custom storage containers (such as Redis), it's recommended to configure customKey to ensure cache key stability. This ensures:

  1. Cross-process sharing: Different server instances can share the same cache
  2. Cache validity after application restart: Cache remains valid after restarting the application
  3. Cache persistence after code deployment: Cache for the same logic remains effective after code updates

The default cache key generation mechanism is based on function references, which may not be stable enough in distributed environments. It's recommended to use customKey to provide stable cache keys:

import { cache, CacheTime, configureCache } from '@modern-js/runtime/cache';
import { fetchUserData } from './api';

// Configure Redis container
configureCache({
  container: redisContainer,
});

// Recommended: Use customKey to ensure key stability
const getUser = cache(
  async (userId: string) => {
    return await fetchUserData(userId);
  },
  {
    maxAge: CacheTime.MINUTE * 5,
    // Use stable identifiers related to the cached function as cache keys
    customKey: () => `fetchUserData`,
  }
);

Redis Storage Example

Here's an example of using Redis as a storage backend:

import { Redis } from 'ioredis';
import { Container, configureCache } from '@modern-js/runtime/cache';

class RedisContainer implements Container {
  private client: Redis;

  constructor(client: Redis) {
    this.client = client;
  }

  async get(key: string): Promise<string | null> {
    return this.client.get(key);
  }

  async set(
    key: string,
    value: string,
    options?: { ttl?: number },
  ): Promise<'OK'> {
    if (options?.ttl) {
      return this.client.set(key, value, 'EX', options.ttl);
    }
    return this.client.set(key, value);
  }

  async has(key: string): Promise<boolean> {
    const result = await this.client.exists(key);
    return result === 1;
  }

  async delete(key: string): Promise<boolean> {
    const result = await this.client.del(key);
    return result > 0;
  }

  async clear(): Promise<void> {
    // Be cautious with this in production. It will clear the entire Redis database.
    // A more robust implementation might use a key prefix and delete keys matching that prefix.
    await this.client.flushdb();
  }
}

// Configure Redis storage
const redisClient = new Redis({
  host: 'localhost',
  port: 6379,
});

configureCache({
  container: new RedisContainer(redisClient),
});

Important Notes

  1. Serialization: All cached data will be serialized to strings for storage. The container only needs to handle string get/set operations.

  2. TTL Support: If your storage backend supports TTL (Time To Live), you can use the options.ttl parameter in the set method (in seconds).