yarn.lock file changes. This caches the
node_modules folder and, in general, runs pretty smoothly for all my pipelines.
Unfortunately, the cache can get deleted, and because
yarn.lock didn't change, the cache-creation job doesn't run to recreate it. As a result, all subsequent jobs that depend on the cache fail.
Can I somehow also trigger the cache-creation job when the current cache does not exist? (I found a manual workaround, but it is not my preferred solution. I also looked for an environment variable that indicates a cache miss, but didn't find one.)
```yaml
default:
  # https://docs.gitlab.com/ee/ci/caching/
  cache: &global_cache
    key:
      files:
        # only generate a new cache if this file changes
        # same cache over multiple pipelines
        - yarn.lock
    paths:
      - node_modules
    policy: pull

install_dependencies_in_cache:
  stage: .pre
  cache:
    # inherit all global cache settings
    <<: *global_cache
    # override the policy
    policy: pull-push
  script:
    - yarn install --frozen-lockfile
  rules:
    # only if dependencies changed, refill the cache
    - changes:
        - yarn.lock
    # TODO: trigger if cache does not exist - how?
    # ...
    # work-around: trigger manually if cache does not exist
    - when: manual
      allow_failure: true
```
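For context, a downstream job picks up the pull-only cache via the same anchor (the `test` job name here is just an example, not part of my actual pipeline):

```yaml
test:
  stage: test
  cache:
    # inherits the global settings, keeps policy: pull,
    # so node_modules is downloaded but never re-uploaded
    <<: *global_cache
  script:
    - yarn test
```

It is these pull-only consumers that break when the cache has been evicted, because nothing with `policy: pull-push` runs to repopulate it.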