I am trying to understand how caching works with GitLab CI jobs.
- How do I know for sure that the cache is actually being used by a job? The job log says "Successfully extracted cache", but the build time is slower than I would expect.
- Is a local cache the fastest way to use the cache? Faster than S3?
We are using a Docker-based runner with /cache as the cache directory.
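For reference, the relevant part of our runner config looks roughly like this (the section names follow the GitLab Runner config format, but the concrete values here are assumptions about our setup rather than a copy of the real file):

    # /etc/gitlab-runner/config.toml (sketch, values assumed)
    [[runners]]
      name = "docker-runner"      # hypothetical name
      executor = "docker"
      [runners.docker]
        image = "node:20-alpine"
        # Host path mounted into every job container; the runner's
        # local cache archives end up under this volume.
        volumes = ["/cache"]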
We have a simple Next.js project, and this is the current CI file:
deploy:
  image: node:20-alpine
  stage: deploy
  cache:
    - key:
        files:
          - yarn.lock
      paths:
        - .yarn-cache/
    - key: ${CI_COMMIT_REF_SLUG}
      paths:
        - .next/cache/
  script:
    - "pwd"
    - "ls -lah"
    - "ls -lah .yarn-cache ||:"
    - "ls -lah .next/cache ||:"
    - yarn install --frozen-lockfile --no-progress
    - yarn build
    - yarn wrangler pages deploy out --project-name "test-app" --branch $CI_COMMIT_BRANCH
  when: manual
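One thing I am not sure about: whether yarn ever writes anything into .yarn-cache unless it is explicitly told to. My understanding (an assumption, not something from our current config) is that the install step would need to point at that folder, something like:

    # sketch: pointing yarn (classic) at the cached folder; using
    # --cache-folder here is my assumption about how to wire this up
    script:
      - yarn install --frozen-lockfile --no-progress --cache-folder .yarn-cache
      - yarn build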
The job takes quite a while to build, and it seems like it should be a lot quicker. Listing the cache directories during the pipeline returns this:
$ ls -lah .yarn-cache ||:
ls: .yarn-cache: No such file or directory
$ ls -lah .next/cache ||:
ls: .next/cache: No such file or directory
Does this mean the runner can't find the cache inside the Docker container?
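In case it helps, my next step was going to be to poke around the mounted cache volume from inside the job, something like this (the exact layout of the cache archives under /cache is my guess):

    # sketch: extra script lines to check whether any local cache
    # archives actually exist in the mounted /cache volume
    - ls -lah /cache ||:
    - find /cache -name 'cache.zip' 2>/dev/null ||: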