
Cache with B2C Commerce SDK

Learning Objectives

After completing this unit, you'll be able to:

  • List types of cache supported by Commerce SDK.
  • Explain how the least recently used eviction policy works.
  • Describe how max-age works with public and private responses.
  • List three weaknesses of the in-memory cache in large-scale applications.
  • State which caching implementations Commerce SDK supports.

Why Caching?

B2C Commerce applications serve a lot of static content, some of which doesn’t change very often. It makes sense that caching this content can dramatically improve an application’s performance. That’s great news for Vijay Lahiry, Cloud Kicks developer. Adding caching to an application might not feel like a difficult task to him, but he knows how easy it is to miss little things that can have serious performance and security implications. He’s glad to know that Commerce SDK has caching built in.

The Commerce SDK supports two cache implementations: in-memory and Redis. He takes a look at the Implement a CDN and Caching unit in the Headless Implementation Strategies for Salesforce B2C Commerce module for details.

Both implementations check for a cached response prior to initiating an API call. This reduces the number of API calls made by the SDK to improve application performance.

Both implementations respect standard cache headers. The Cache-Control HTTP header, included in the response by the server, determines what responses are cached and how long they are valid. The client can set the Cache-Control header in requests to modify the behavior set by the server.
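
For example, if Vijay needs to bypass a cached copy for a particular call, he can send his own Cache-Control directive with the request. The snippet below is a hedged sketch rather than code from this unit: it uses the Shopper Products client that Vijay creates later in this unit and assumes the generated client methods accept a per-request headers option.

// A sketch (not from this unit): send a Cache-Control directive on one request.
// Assumes productClient is the Shopper Products client created later in this unit
// and that the client method accepts a per-request "headers" option.
const freshDetails = await productClient.getProduct({
  parameters: { id: "25591139M" },
  headers: { "Cache-Control": "no-cache" }, // revalidate with the server instead of reusing a cached copy
});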

In-Memory Cache

Vijay wants to use as many out-of-the-box capabilities as possible, so for most situations he relies on in-memory caching of responses, which is enabled by default. But this method isn’t one-size-fits-all. The cache lives in memory, which means it’s not distributed and is only accessible to the web application hosted on that particular machine. If caching doesn’t make sense for his application, Vijay can disable it altogether by setting cacheManager to null, as shown here.

config.cacheManager = null;

Redis Cache

A Redis cache is the Salesforce-recommended approach. To use a Redis cache, Vijay instantiates a CacheManagerRedis object with a Redis URL and adds it to his client config object, as shown here.

import { CacheManagerRedis } from "@commerce-apps/core";
// To import everything instead, use:
// import * as CommerceSdkCore from "@commerce-apps/core";
// const { CacheManagerRedis } = CommerceSdkCore;
// Or, in CommonJS:
// const { CacheManagerRedis } = require("@commerce-apps/core");
const cacheManager = new CacheManagerRedis({
  connection: "redis://localhost:6379",
});
config.cacheManager = cacheManager;
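
For context, here’s roughly how a cache manager slots into a complete client configuration. This is a sketch: every value is a placeholder, and the exact parameters depend on Vijay’s B2C Commerce instance.

// A sketch of a commerce-sdk client config with the Redis cache manager attached.
// Every value below is a placeholder.
const config = {
  parameters: {
    clientId: "<your-client-id>",
    organizationId: "<your-org-id>",
    shortCode: "<your-short-code>",
    siteId: "<your-site-id>",
  },
  cacheManager: new CacheManagerRedis({ connection: "redis://localhost:6379" }),
};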

Alternative Cache Implementations

If Vijay wants to use another type of cache, he can write his own CacheManager implementation and use it for some or all of his caching operations.

Some data isn’t appropriate for caching. For example, Commerce SDK doesn’t cache sensitive information such as authorization tokens.

What’s the Default Behavior?

Commerce SDK, by default, uses an in-memory cache with a limit of 10,000 items and a least recently used (LRU) eviction policy: when the cache reaches its limit, it throws away the response that was used least recently to make room for the new one. Caching behavior is determined by standard cache headers. The SDK doesn’t cache responses marked as private, and it strips authorization headers before writing a response to the cache.
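
To picture the eviction policy, here’s a toy LRU cache built on a JavaScript Map. It’s purely illustrative (it isn’t the SDK’s internal implementation), but it shows the rule the default cache follows: when the item limit is exceeded, the entry that was read or written longest ago is discarded first.

// Illustrative only: a tiny LRU cache showing the eviction rule,
// not the Commerce SDK's actual cache code.
class TinyLruCache {
  constructor(maxItems) {
    this.maxItems = maxItems;
    this.entries = new Map(); // a Map remembers insertion order
  }
  get(key) {
    if (!this.entries.has(key)) return undefined;
    const value = this.entries.get(key);
    this.entries.delete(key); // re-insert so this key becomes the most recently used
    this.entries.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.entries.has(key)) this.entries.delete(key);
    this.entries.set(key, value);
    if (this.entries.size > this.maxItems) {
      // Evict the least recently used entry: the oldest key in insertion order.
      const oldestKey = this.entries.keys().next().value;
      this.entries.delete(oldestKey);
    }
  }
}
const cache = new TinyLruCache(2);
cache.set("a", 1);
cache.set("b", 2);
cache.get("a");    // "a" is now the most recently used
cache.set("c", 3); // over the limit, so "b" (the least recently used) is evicted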

Here’s how things work when Vijay makes a call for product details using the Commerce SDK.

import { Product } from "commerce-sdk";
// Create a new ShopperProducts API client with the config from earlier
const productClient = new Product.ShopperProducts(config);
// Get product details
const details = await productClient.getProduct({
  parameters: {
    id: "25591139M"
  }
});

When Vijay inspects the Cache-Control header of the response, he sees this:

public, must-revalidate, max-age=60
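
One way to see that header for yourself is to ask the SDK for the raw HTTP response instead of the parsed body. The sketch below assumes the generated client methods accept an optional second rawResponse argument that returns the raw Response object when set to true; check the docs for your commerce-sdk version before relying on it.

// A sketch: request the raw Response so the Cache-Control header can be read.
// The rawResponse flag is an assumption here; confirm it for your SDK version.
const rawResponse = await productClient.getProduct(
  { parameters: { id: "25591139M" } },
  true // return the raw Response instead of the parsed body
);
console.log(rawResponse.headers.get("cache-control"));
// public, must-revalidate, max-age=60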

Responses are marked as either public or private.

  • Public responses contain no requester-specific information, so they can be stored in a shared cache.
  • Private responses shouldn’t be cached anywhere except the user’s browser.

The response in this example is public, so it’s written to the cache. That’s where max-age (60 seconds) comes into play. If the client makes the same request within 60 seconds, the SDK serves the cached response without calling the server. If the client makes the same request after 60 seconds have passed, the cached response is no longer used and the request continues to the server.
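
In other words, the cache applies a simple freshness check based on max-age. Here’s an illustrative (not SDK) version of that check.

// Illustrative only: the freshness check that max-age implies.
// A cached response is reused while its age is below max-age.
function isFresh(storedAtMs, maxAgeSeconds, nowMs = Date.now()) {
  return (nowMs - storedAtMs) / 1000 < maxAgeSeconds;
}
const storedAt = Date.now();
isFresh(storedAt, 60);             // true: the cached response is served
isFresh(storedAt - 61 * 1000, 60); // false: the request continues to the server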

Distributed Caching with Redis

Vijay realizes that this simple default cache implementation is, well… simple. It might be adequate for development purposes or a single-instance application, but for a full-blown ecommerce solution… not so much.

When he considers what he envisions for the Cloud Kicks storefront application, the default cache’s weaknesses begin to show. If he ran his application as a set of auto-scaling containers, each container would have to maintain its own separate cache and manage the memory shared between the cache and the application. What’s worse, the caches would disappear anytime a container restarted.

The Commerce SDK offers a better solution via Redis. The SDK comes ready to connect to Redis: Vijay simply supplies the Redis connection string, as shown earlier, along with any configuration options supported by Keyv. This lets him scale his application horizontally while every instance benefits from a shared cache.

Bring Your Own Caching Solution

The caching options Vijay explored are implementations of the Cache API that’s documented on the Mozilla developer site (see Resources). Commerce SDK accepts any implementation of that interface. You can import a TypeScript definition of the interface with this code.

import { ICacheManager } from "@commerce-apps/core/dist/base/cacheManager";
export class CacheManagerCustom<T> implements ICacheManager { ... }

Before Vijay creates an entire implementation from scratch, he breaks the cache manager into its two fundamental pieces: how to cache and where to cache.

The Commerce SDK cache manager implementation conveniently separates the two pieces via the pluggable Keyv storage interface. While Salesforce has only tested quick-lru and Redis with the Commerce SDK, Vijay can use other supported Keyv backends. He tries Memcached with the script he used earlier.

Here’s how he does it.

  1. Get Docker (see Resources).
  2. Set up Docker with Memcached.
    $ docker run -p11211:11211 --name memcached -d memcached
  3. Install the Keyv Memcache package (see Resources).
    $ npm install --save keyv-memcache
  4. Configure Memcached in the Commerce SDK script.
    import { CacheManagerKeyv } from "@commerce-apps/core";
    const KeyvMemcache = require("keyv-memcache");
    const memcache = new KeyvMemcache("user:pass@localhost:11211");
    const config = {
      cacheManager: new CacheManagerKeyv({ keyvStore: memcache }),
      // ...the rest of the client config
    };

When Vijay runs his script again, the output looks almost the same. But when he looks in memcached, he sees that the response was cached.

$ docker exec -it --user root memcached bash
...# apt-get update && apt-get install -y libmemcached-tools
...# memcdump --servers=localhost
keyv:keyv:request-cache-metadata:https://shortcode.api.commercecloud.salesforce.com/product/shopper-products/v1/organizations/orgid/products/25591139M?siteId=RefArch
keyv:keyv:request-cache:https://shortcode.api.commercecloud.salesforce.com/product/shopper-products/v1/organizations/orgid/products/25591139M?siteId=RefArch

Let’s Sum It Up

In this unit, you saw how simple it is for Vijay to scale the Commerce SDK’s caching features when implementing headless commerce via the Commerce APIs. In this module, you saw Vijay explore the Commerce SDK in detail and learned how to apply it in your own application. Now take the final test and earn your badge.

Resources
