Caching on Vercel's Edge Network
Vercel's Edge Network caches your content at the edge in order to serve data to your users as fast as possible. Caching is available for all deployments and domains on your account, regardless of pricing plan. There are two ways to cache content:
- Static file caching is automatic for all deployments, requiring no manual configuration
- To cache dynamic content, including SSR content, you can use `Cache-Control` headers
You can cache responses on Vercel with `Cache-Control` headers defined in:
- Responses from Vercel Functions
- Route definitions in `vercel.json` or `next.config.js`

You can use any combination of the above options, but if you return `Cache-Control` headers in a Vercel Function, they will override the headers defined for the same route in `vercel.json` or `next.config.js`.
To cache the response of Functions on Vercel's Edge Network, you must include `Cache-Control` headers with any of the following directives:
- `s-maxage=N`
- `s-maxage=N, stale-while-revalidate=Z`

`proxy-revalidate` and `stale-if-error` are not currently supported.
The following example demonstrates a function that caches its response and revalidates it every second:
```ts
export async function GET() {
  return new Response('Cache Control example', {
    status: 200,
    headers: {
      'Cache-Control': 'public, s-maxage=1',
      'CDN-Cache-Control': 'public, s-maxage=60',
      'Vercel-CDN-Cache-Control': 'public, s-maxage=3600',
    },
  });
}
```
For direct control over caching on Vercel and downstream CDNs, you can use `CDN-Cache-Control` headers.
You can define route headers in `vercel.json` or `next.config.js` files. These headers will be overridden by headers defined in Function responses.

The following example demonstrates a `vercel.json` file that adds `Cache-Control` headers to a route:
```json
{
  "headers": [
    {
      "source": "/about.js",
      "headers": [
        {
          "key": "Cache-Control",
          "value": "s-maxage=1, stale-while-revalidate=59"
        }
      ]
    }
  ]
}
```
If you're building your app with Next.js, you should use `next.config.js` rather than `vercel.json`. The following example demonstrates a `next.config.js` file that adds `Cache-Control` headers to a route:
```js
/** @type {import('next').NextConfig} */
const nextConfig = {
  reactStrictMode: true,
  async headers() {
    return [
      {
        source: '/about',
        headers: [
          {
            key: 'Cache-Control',
            value: 's-maxage=1, stale-while-revalidate=59',
          },
        ],
      },
    ];
  },
};

module.exports = nextConfig;
```
See the Next.js docs to learn more about `next.config.js`.
Static files are automatically cached at the edge on Vercel's Edge Network for the lifetime of the deployment after the first request.
- If a static file is unchanged, the cached value can persist across deployments due to the hash used in the filename
- Cached optimized images persist between deployments

Static files can be cached with `Cache-Control` headers using either of the following directives:
- `max-age=N, public`
- `max-age=N, immutable`

Where `N` is the number of seconds the response should be cached. The response must also meet the caching criteria.
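For illustration, the directives above can be applied through a `vercel.json` headers rule. This is a sketch only: the `/assets/(.*)` path pattern is a hypothetical location for hashed build assets, and `31536000` seconds is one year.

```json
{
  "headers": [
    {
      "source": "/assets/(.*)",
      "headers": [
        {
          "key": "Cache-Control",
          "value": "public, max-age=31536000, immutable"
        }
      ]
    }
  ]
}
```

The `immutable` directive is appropriate here only because the filename hash changes whenever the content changes, so a cached copy can never go stale.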
You can cache dynamic content through Vercel Functions, including SSR, by adding `Cache-Control` headers to your response. When you specify `Cache-Control` headers in a function, responses will be cached in the region the function was requested from.

See our docs on Cache-Control headers to learn how to best use `Cache-Control` directives on Vercel's Edge Network.
Vercel supports two Targeted Cache-Control headers:
- `CDN-Cache-Control`, which allows you to control the Vercel Edge Cache or other CDN caches separately from the browser's cache. The browser will not be affected by this header
- `Vercel-CDN-Cache-Control`, which allows you to specifically control Vercel's Edge Cache. Neither other CDNs nor the browser will be affected by this header
By default, the `Cache-Control` and `CDN-Cache-Control` headers are returned to the browser. `Vercel-CDN-Cache-Control` headers are not returned to the browser or forwarded to other CDNs.
To learn how these headers work in detail, see our dedicated headers docs.
The following example demonstrates `Cache-Control` headers that instruct:
- Vercel's Edge Cache to have a TTL of `3600` seconds
- Downstream CDNs to have a TTL of `60` seconds
- Clients to have a TTL of `10` seconds
```ts
export async function GET() {
  return new Response('Cache Control example', {
    status: 200,
    headers: {
      'Cache-Control': 'max-age=10',
      'CDN-Cache-Control': 'max-age=60',
      'Vercel-CDN-Cache-Control': 'max-age=3600',
    },
  });
}
```
If you set `Cache-Control` without a `CDN-Cache-Control`, the Vercel Edge Network strips `s-maxage` and `stale-while-revalidate` from the response before sending it to the browser. To determine if the response was served from the cache, check the `x-vercel-cache` header in the response.
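As a sketch of the stripping behavior described above (this is an illustration, not Vercel's actual implementation), the edge-only directives can be filtered out of a `Cache-Control` value like so:

```typescript
// Illustration only: removes edge-only directives from a Cache-Control
// value, mirroring what the Edge Network does before the response
// reaches the browser when no CDN-Cache-Control header is set.
function stripEdgeDirectives(cacheControl: string): string {
  return cacheControl
    .split(',')
    .map((directive) => directive.trim())
    // Drop directives that only make sense to a shared cache
    .filter(
      (d) =>
        !d.startsWith('s-maxage=') && !d.startsWith('stale-while-revalidate'),
    )
    .join(', ');
}

console.log(
  stripEdgeDirectives('public, s-maxage=60, stale-while-revalidate=30, max-age=10'),
);
// → "public, max-age=10"
```

The browser therefore only ever sees directives it is meant to honor, such as `max-age` and `public`.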
The `Vary` response header instructs caches to use specific request headers as part of the cache key. This allows you to serve different cached responses to different users based on their request headers.
The `Vary` header only has an effect when used in combination with `Cache-Control` headers that enable caching (such as `s-maxage`). Without a caching directive, the `Vary` header has no effect.
When Vercel's Edge Network receives a request, it combines the cache key (described in the Cache Invalidation section) with the values of any request headers specified in the `Vary` header to create a unique cache entry for each distinct combination.
Vercel's Edge Network already includes the `Accept` and `Accept-Encoding` headers as part of the cache key by default. You do not need to explicitly include these headers in your `Vary` header.
The most common use case for the `Vary` header is content negotiation: serving different content based on:
- User location (e.g., `X-Vercel-IP-Country`)
- Device type (e.g., `User-Agent`)
- Language preferences (e.g., `Accept-Language`)
Example: Country-specific content
You can use the `Vary` header with Vercel's `X-Vercel-IP-Country` request header to cache different responses for users from different countries:
```ts
import { type NextRequest } from 'next/server';

export async function GET(request: NextRequest) {
  const country = request.headers.get('x-vercel-ip-country') || 'unknown';

  // Serve different content based on country
  let content;
  if (country === 'US') {
    content = { message: 'Hello from the United States!' };
  } else if (country === 'GB') {
    content = { message: 'Hello from the United Kingdom!' };
  } else {
    content = { message: `Hello from ${country}!` };
  }

  return Response.json(content, {
    status: 200,
    headers: {
      'Cache-Control': 's-maxage=3600',
      Vary: 'X-Vercel-IP-Country',
    },
  });
}
```
You can set the `Vary` header in the same ways you set other response headers:
In Vercel Functions
```ts
import { type NextRequest } from 'next/server';

export async function GET(request: NextRequest) {
  return Response.json(
    { data: 'This response varies by country' },
    {
      status: 200,
      headers: {
        Vary: 'X-Vercel-IP-Country',
        'Cache-Control': 's-maxage=3600',
      },
    },
  );
}
```
Using `vercel.json`
```json
{
  "headers": [
    {
      "source": "/api/data",
      "headers": [
        {
          "key": "Vary",
          "value": "X-Vercel-IP-Country"
        },
        {
          "key": "Cache-Control",
          "value": "s-maxage=3600"
        }
      ]
    }
  ]
}
```
Using `next.config.js`

If you're building your app with Next.js, use `next.config.js`:
```js
/** @type {import('next').NextConfig} */
const nextConfig = {
  async headers() {
    return [
      {
        source: '/api/data',
        headers: [
          {
            key: 'Vary',
            value: 'X-Vercel-IP-Country',
          },
          {
            key: 'Cache-Control',
            value: 's-maxage=3600',
          },
        ],
      },
    ];
  },
};

module.exports = nextConfig;
```
You can specify multiple headers in a single `Vary` value by separating them with commas:

```js
res.setHeader('Vary', 'X-Vercel-IP-Country, Accept-Language');
```
This will create separate cache entries for each unique combination of country and language preference.
- Use `Vary` headers selectively, as each additional header multiplies the number of cache entries; this doesn't directly impact your bill, but it can result in more cache misses than desired
- Only include headers that meaningfully impact content generation
- Consider combining multiple variations into a single header value when possible
The `Cache-Control` field is an HTTP header specifying caching rules for client (browser) requests and server responses. A cache must obey the requirements defined in the `Cache-Control` header.
For server responses to be successfully cached with Vercel's Edge Network, the following criteria must be met:
- Request uses the `GET` or `HEAD` method
- Request does not contain a `Range` header
- Request does not contain an `Authorization` header
- Response uses a `200`, `404`, `301`, `302`, `307` or `308` status code
- Response does not exceed `10MB` in content length
- Response does not contain the `set-cookie` header
- Response does not contain the `private`, `no-cache` or `no-store` directives in the `Cache-Control` header
- Response does not contain a `Vary: *` header, which is treated as equivalent to `Cache-Control: private`
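The criteria above can be expressed as a predicate. The following is a simplified sketch for illustration only: the `RequestInfo`/`ResponseInfo` shapes are hypothetical, header names are assumed lowercase, and real cacheability is decided by Vercel's Edge Network, not application code.

```typescript
// Hypothetical shapes for illustration; header keys assumed lowercase.
interface RequestInfo {
  method: string;
  headers: Record<string, string>;
}
interface ResponseInfo {
  status: number;
  contentLength: number; // in bytes
  headers: Record<string, string>;
}

// Returns true when the request/response pair meets the listed criteria.
function meetsCachingCriteria(req: RequestInfo, res: ResponseInfo): boolean {
  const cacheControl = (res.headers['cache-control'] ?? '').toLowerCase();
  return (
    ['GET', 'HEAD'].includes(req.method) &&
    !('range' in req.headers) &&
    !('authorization' in req.headers) &&
    [200, 404, 301, 302, 307, 308].includes(res.status) &&
    res.contentLength <= 10 * 1024 * 1024 && // 10MB limit
    !('set-cookie' in res.headers) &&
    !/\b(private|no-cache|no-store)\b/.test(cacheControl) &&
    res.headers['vary'] !== '*' // Vary: * is treated as private
  );
}
```

For example, a `GET` returning `200` with `s-maxage=60` passes, while any `POST` fails regardless of its headers.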
Vercel does not allow bypassing the cache for static files by design.
Every deployment has a unique key used for caching based on the deployment URL created at build time. This key ensures that users never see content from a previous deployment. It contains the following information:
- The request method (such as `GET`, `POST`, etc.)
- The request URL (query strings are ignored for static files)
- The host domain
- The unique deployment URL
- The scheme (whether it's `https` or `http`)
- The `accept` header (Image Optimization requests only)
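To make the components above concrete, here is a sketch of how they could combine into a single key. This is an illustration only: the `CacheKeyParts` shape and the `|`-joined format are hypothetical, and Vercel's actual key derivation is internal.

```typescript
// Hypothetical shape mirroring the listed cache key components.
interface CacheKeyParts {
  method: string;        // e.g. GET, POST
  url: string;           // query strings are ignored for static files
  host: string;
  deploymentUrl: string; // unique per deployment, set at build time
  scheme: 'https' | 'http';
  accept?: string;       // Image Optimization requests only
}

// Joins the components into one key; any differing component yields
// a different cache entry.
function cacheKey(parts: CacheKeyParts): string {
  return [
    parts.method,
    parts.scheme,
    parts.host,
    parts.deploymentUrl,
    parts.url,
    parts.accept ?? '',
  ].join('|');
}
```

Because the unique deployment URL is one of the components, every new deployment produces entirely new keys, which is why deploying effectively purges the cache.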
The cache is automatically purged upon a new deployment being created. If you ever need to invalidate Vercel's Edge Network cache, you can always re-deploy.
The `x-vercel-cache` header is included in HTTP responses to the client, and describes the state of the cache.

See our headers docs to learn more.
Vercel's Edge Network cache is segmented by region. The following caching limits apply to Vercel Function responses:
- Max cacheable response size:
  - Streaming functions: 20MB
  - Non-streaming functions: 10MB
- Max cache time: 1 year (`s-maxage`, `max-age`, `stale-while-revalidate`)

While you can set the maximum time for server-side caching, cache times are best-effort and not guaranteed. If an asset is requested often, it is more likely to live for the entire duration. If your asset is rarely requested (e.g. once a day), it may be evicted from the regional cache.
Vercel does not currently support using `proxy-revalidate` and `stale-if-error` for server-side caching.