Caching

Caching responses

Sometimes it's desirable to cache certain responses, especially when they involve expensive calculations or when frequent polling is expected. Litestar comes with a simple mechanism for caching:

from litestar import get

@get("/cached", cache=True)
async def my_cached_handler() -> str:
    return "cached"

By setting cache to True, the response from the handler will be cached for the default_expiration configured on the application's ResponseCacheConfig.

Note

If default_expiration is set to None, a route handler with cache set to True will keep its response in the cache indefinitely.
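
A minimal sketch of that scenario, assuming default_expiration is accepted as a keyword argument by ResponseCacheConfig, could look like this:

from litestar import Litestar, get
from litestar.config.response_cache import ResponseCacheConfig


@get("/cached", cache=True)
async def my_cached_handler() -> str:
    return "cached"


# With default_expiration=None (assumed keyword), responses from handlers
# that set cache=True are kept in the cache without an expiration time.
app = Litestar(
    [my_cached_handler],
    response_cache_config=ResponseCacheConfig(default_expiration=None),
)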

Alternatively, you can specify the number of seconds to cache the responses from a given handler like so:

Caching the response for 120 seconds by passing the number of seconds to the cache parameter.
from litestar import get

@get("/cached-seconds", cache=120)  # seconds
async def my_cached_handler_seconds() -> str:
    return "cached for 120 seconds"

If you want the response to be cached indefinitely, you can pass the CACHE_FOREVER sentinel instead:

Caching the response indefinitely by setting the cache parameter to CACHE_FOREVER.
from litestar import get
from litestar.config.response_cache import CACHE_FOREVER


@get("/cached-forever", cache=CACHE_FOREVER)
async def my_cached_handler_forever() -> str:
    return "cached forever"

Configuration

You can configure caching behaviour at the application level by passing an instance of ResponseCacheConfig to the Litestar instance.
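
For example, a minimal sketch, assuming ResponseCacheConfig accepts a default_expiration keyword argument (the app-wide cache lifetime in seconds):

from litestar import Litestar
from litestar.config.response_cache import ResponseCacheConfig

# Assumed keyword: cache responses from cache=True handlers for 30 seconds by default.
cache_config = ResponseCacheConfig(default_expiration=30)
app = Litestar([], response_cache_config=cache_config)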

Changing where data is stored

By default, caching will use the MemoryStore, but it can be configured with any Store, for example RedisStore:

Using Redis as the cache store.
import asyncio

from litestar import Litestar, get
from litestar.config.response_cache import ResponseCacheConfig
from litestar.stores.redis import RedisStore


@get(cache=10)
async def something() -> str:
    await asyncio.sleep(1)
    return "something"


redis_store = RedisStore.with_client(url="redis://localhost/", port=6379, db=0)
cache_config = ResponseCacheConfig(store="redis_backed_store")
app = Litestar(
    [something],
    stores={"redis_backed_store": redis_store},
    response_cache_config=cache_config,
)

Specifying a cache key builder

By default, Litestar uses the request's path plus its sorted query parameters as the cache key. This can be adjusted by providing a “key builder” function, either at the application or the route handler level.

Using a custom cache key builder.
from litestar import Litestar, Request
from litestar.config.response_cache import ResponseCacheConfig


def key_builder(request: Request) -> str:
    return request.url.path + request.headers.get("my-header", "")


app = Litestar([], response_cache_config=ResponseCacheConfig(key_builder=key_builder))

Using a custom cache key builder for a specific route handler.
from litestar import Litestar, Request, get


def key_builder(request: Request) -> str:
    return request.url.path + request.headers.get("my-header", "")


@get("/cached-path", cache=True, cache_key_builder=key_builder)
async def cached_handler() -> str:
    return "cached"


app = Litestar([cached_handler])