Link to the code that reproduces this issue
https://github.com/evankirkiles/nextjs-broken-data-cache-demo
To Reproduce
Clone the repo
Remove the source ./get-sc-creds.sh; part of the build script
Deploy to Vercel
See the first time displayed. Call this Time A.
Click "Revalidate" several times and see new times for each revalidation displayed, after Time A.
Change currHash locally in page.tsx:5 to something different
Commit and push the changes so that the project re-deploys using the build cache
Once the deployment succeeds, see that the time displayed has regressed to Time A.
In the above pictures, see that the two times 3:35:18 PM and 3:35:09 PM, which came from manual revalidation of the time tag, are forgotten after the new deployment using the build cache. (The tabular data is saved in localStorage to persist across deployments).
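For context, the relevant parts of the demo boil down to something like the following two files. This is a minimal sketch rather than the exact repo code: the time API URL and response shape are assumptions, while the "time" tag and the currHash constant come from the steps above.

```tsx
// app/page.tsx (sketch) – a statically rendered page whose data is a cached,
// tagged fetch. Bumping currHash is the "code change" used to trigger a redeploy.
const currHash = "v1"; // change me and push to redeploy with the build cache

export default async function Page() {
  // Cached by default; the "time" tag is what the Revalidate button purges.
  const res = await fetch("https://worldtimeapi.org/api/ip", {
    next: { tags: ["time"] },
  });
  const { datetime } = await res.json();
  return (
    <main>
      <p>Build hash: {currHash}</p>
      <p>Fetched time: {datetime}</p>
    </main>
  );
}
```

```ts
// app/api/revalidate/route.ts (sketch) – what the "Revalidate" button calls.
import { revalidateTag } from "next/cache";

export async function POST() {
  revalidateTag("time"); // marks the tagged fetch stale in the Data Cache
  return Response.json({ revalidated: true, at: new Date().toISOString() });
}
```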
To Workaround
Clone the repo
Do NOT remove the source ./get-sc-creds.sh; part of the build script
Deploy to Vercel
Set the environment variables outlined in the repo README.md and re-deploy.
See the first time displayed. Call this Time A.
Click "Revalidate" several times and see new times for each revalidation displayed, after Time A.
Change currHash locally in page.tsx:5 to something different
Commit and push the changes so that the project re-deploys using the build cache
Once the deployment succeeds, see that the time displayed has not regressed to Time A; rather, it is the most recently fetched time from the manual revalidation.
Current vs. Expected behavior
Current Behavior:
Deployments using the build cache only use data fetched during the current and previous build processes, not data from the Data Cache. Essentially, manually-revalidated data (from revalidateTag, for example) is forgotten after a new deployment using the build cache.
Expected Behavior:
New deployments using the build cache should use the most up-to-date data from the Vercel Data Cache, not just from previous build processes.
Thoughts on the Issue
I suspect this is because the Build Cache snapshots and uploads the .next/cache folder, which includes .next/cache/fetch-cache (the local fetch cache). When the build cache is re-used, the .next/cache/fetch-cache that gets downloaded is the one from the time of the last build, so it ignores any requests that were manually revalidated during the deployment's lifetime. The Vercel Data Cache does not seem to be consulted at all. This means that deployments using the build cache can only ever read data requested during a previous build process (and thus saved to the snapshotted .next/cache/fetch-cache), not data that has been revalidated during the lifetime of the deployment.
Current workarounds are:
If using manually revalidated data, do not use the build cache. This ensures that requests are always up-to-date at build time, as data regression is far more harmful than longer build times and more requests.
Manually revalidate all requests after a new deployment. This is a terrible solution, though, as it's a pain to manually revalidate all tags and the old data is still visible while those requests are revalidated.
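A sketch of what that second workaround looks like in practice, assuming a hypothetical route handler and tag list (this is not code from the demo repo):

```ts
// app/api/revalidate-all/route.ts (hypothetical) – hit this once after each
// deployment to re-fetch everything; until then, stale build-time data is served.
import { revalidateTag } from "next/cache";

const ALL_TAGS = ["time"]; // every tag your app fetches with

export async function POST() {
  for (const tag of ALL_TAGS) revalidateTag(tag);
  return Response.json({ revalidated: ALL_TAGS });
}
```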
EDIT: The problem is caused by the environment variables SUSPENSE_CACHE_URL, SUSPENSE_CACHE_ENDPOINT, and SUSPENSE_CACHE_AUTH_TOKEN not being set in the deployment pipeline. Without these, the fetch cache is not used, and the build pipeline always resorts to the filesystem cache in .next/cache/fetch-cache from the downloaded Build Cache.
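One easy way to check this yourself is to log the variables during the build. A minimal sketch, assuming only that module-scope code in a server component runs while next build prerenders pages (the variable names come from the edit above):

```tsx
// app/layout.tsx (sketch) – the module-scope logs run during `next build`, so the
// Vercel build output shows whether the suspense-cache variables are present.
import type { ReactNode } from "react";

console.log("SUSPENSE_CACHE_URL:", process.env.SUSPENSE_CACHE_URL);
console.log("SUSPENSE_CACHE_ENDPOINT:", process.env.SUSPENSE_CACHE_ENDPOINT);
console.log("SUSPENSE_CACHE_AUTH_TOKEN set:", Boolean(process.env.SUSPENSE_CACHE_AUTH_TOKEN));

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en">
      <body>{children}</body>
    </html>
  );
}
```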
Proposed Solution
(Assuming that the above is what's going on:) don't save the .next/cache/fetch-cache folder in the Build Cache; instead, source data from Vercel's Data Cache, where manually-revalidated data is accessible. This separation between build artifacts and fetch data also makes more sense to me architecturally.
EDIT: Expose the SUSPENSE_CACHE_URL, SUSPENSE_CACHE_ENDPOINT, and SUSPENSE_CACHE_AUTH_TOKEN environment variables to the build pipeline in the same way that they are set for route handlers in the ctx._requestHeaders passed into the constructor for the CacheHandler.
Workaround
In our build script, we somehow need to get SUSPENSE_CACHE_URL, SUSPENSE_CACHE_ENDPOINT, and SUSPENSE_CACHE_AUTH_TOKEN. These are only easily acquirable on the user side from an existing FetchCache CacheHandler, and we can reach one from within a route handler via the global globalThis.__incrementalCache object. So we can populate these environment variables with three things (sketched below):
A protected route handler that reads the above credentials and endpoint and returns them in its response
A bash script which calls that route handler from a previous deployment and populates the build process's environment variables dynamically
A modified build script in package.json that sources the bash script before the build.
This works, and once you set it up you don't have to touch it again. But it would be much better if Vercel would do this for us.
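Concretely, the three pieces might look something like the sketches below. These are not the exact files from the demo repo: the /api/sc-creds path, the x-creds-secret header, the CREDS_ROUTE_SECRET and PREV_DEPLOYMENT_URL variables, and every property read off globalThis.__incrementalCache are assumptions to verify against your own deployment.

```ts
// app/api/sc-creds/route.ts (sketch of piece 1) – relies on the undocumented
// globalThis.__incrementalCache internals described above; the property names
// read from the cache handler are guesses and may differ between Next.js versions.
import { NextRequest } from "next/server";

export const dynamic = "force-dynamic";

export async function GET(req: NextRequest) {
  // Protect the endpoint with a shared secret (hypothetical env var).
  if (req.headers.get("x-creds-secret") !== process.env.CREDS_ROUTE_SECRET) {
    return new Response("Unauthorized", { status: 401 });
  }

  // Make a tagged fetch so the FetchCache CacheHandler has been constructed.
  await fetch("https://example.com/", { next: { tags: ["warmup"] } }).catch(() => {});

  const handler: any = (globalThis as any).__incrementalCache?.cacheHandler;

  // Map whatever the handler exposes onto the three variables the build needs;
  // inspect `handler` in your deployment logs and adjust these accesses.
  return Response.json({
    url: handler?.cacheEndpoint ?? null,
    endpoint: handler?.cacheEndpoint ?? null,
    token: handler?.headers?.Authorization ?? null,
  });
}
```

```bash
#!/usr/bin/env bash
# get-sc-creds.sh (sketch of piece 2) – meant to be sourced by the build script so
# the exported variables reach `next build`. Uses node (always present in the
# build image) rather than jq to parse the JSON response.
set -euo pipefail

creds="$(curl -sf -H "x-creds-secret: ${CREDS_ROUTE_SECRET}" \
  "${PREV_DEPLOYMENT_URL}/api/sc-creds")"

field() { node -e 'console.log(JSON.parse(process.argv[1])[process.argv[2]] ?? "")' "$creds" "$1"; }

export SUSPENSE_CACHE_URL="$(field url)"
export SUSPENSE_CACHE_ENDPOINT="$(field endpoint)"
export SUSPENSE_CACHE_AUTH_TOKEN="$(field token)"
```

Piece 3 is then just the build script in package.json, along the lines of "build": "source ./get-sc-creds.sh; next build", which is the source ./get-sc-creds.sh; prefix referenced in the reproduction steps.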
evankirkiles changed the title from "Vercel Build Cache overrides Vercel Data Cache, causing data regression" to "Vercel Build Cache prevents On-Demand Revalidation from persisting across deployments" on Oct 3, 2023
evankirkiles changed the title from "Vercel Build Cache prevents On-Demand Revalidation from persisting across deployments" to "Vercel deployment / build pipeline using filesystem cache handler, not Vercel Data Cache" on Oct 4, 2023
evankirkiles changed the title from "Vercel deployment / build pipeline using filesystem cache handler, not Vercel Data Cache" to "[WORKAROUND] Vercel deployment / build pipeline using filesystem cache handler, not Vercel Data Cache" on Oct 4, 2023
Figured out a workaround that enables the build pipeline to hook into the Data Cache by dynamically reading SUSPENSE_CACHE_ environment variables from a route handler, populating the build script process's environment with them, and then running yarn build. It's hacky, but it solves the data regression problem for now until Vercel gets around to enabling the Data Cache in the build pipeline:
For sample code, see: https://github.com/evankirkiles/nextjs-broken-data-cache-demo.
Verify canary release
Provide environment information
Operating System:
  Platform: darwin
  Arch: x64
  Version: Darwin Kernel Version 22.5.0: Mon Apr 24 20:51:50 PDT 2023; root:xnu-8796.121.2~5/RELEASE_X86_64
Binaries:
  Node: 20.3.1
  npm: 9.6.7
  Yarn: 1.22.19
  pnpm: N/A
Relevant Packages:
  next: 13.5.4-canary.11
  eslint-config-next: 13.5.4
  react: 18.2.0
  react-dom: 18.2.0
  typescript: 5.2.2
Next.js Config:
  output: N/A
Which area(s) are affected? (Select all that apply)
Data fetching (gS(S)P, getInitialProps)
Additional context
No response