GitLab CI caching: fails to archive node_modules cache - process killed

Hello, I’m using GitLab.com (not self-hosted) with my own Kubernetes-executor runner to run my CI jobs.

Runner version: gitlab-runner 14.0.0~beta.98.gc715666c

Job output:

Done in 289.98s.

Saving cache for successful job
Creating cache yarn-b971104b79f1e8ce3c88a75fd4caca89878a1e9b-1...

node_modules: found 161339 matching files and directories

/bin/bash: line 169: 129 Killed '/usr/bin/gitlab-runner-helper' "cache-archiver" "--file" "../../../../../cache/decisely/juno/yarn-b971104b79f1e8ce3c88a75fd4caca89878a1e9b-1/cache.zip" "--timeout" "10" "--path" "node_modules" "--url" "long url of cached files on S3"

Failed to create cache

Cleaning up file based variables

Job succeeded

CI Job YML:

deps:yarn:
  stage: deps
  image: registry.gitlab.com/decisely/juno/ci-base:latest
  <<: *interruptible
  only:
    refs:
      - merge_requests
      - master
    changes:
      - .gitlab-ci.yml
      - package.json
      - yarn.lock
  variables:
    FF_USE_FASTZIP: "true"
    KUBERNETES_SERVICE_CPU_REQUEST: 2
    KUBERNETES_SERVICE_CPU_LIMIT: 4
  cache:
    policy: pull-push
    key:
      prefix: "yarn"
      files:
        - package.json
        - yarn.lock
    paths:
      - node_modules
  script:
    - yarn install

I searched for an option to increase the timeout for the cache-archiver task, but I couldn’t find anything online. The node_modules directory is rather large, but there must still be a way to cache it…
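The only lead I have is the `--timeout 10` flag in the helper command above, which looks like a timeout in minutes. I believe newer runner versions read a `CACHE_REQUEST_TIMEOUT` job variable for this, though I haven’t verified it on this runner version, so treat this as an untested sketch:

```yaml
deps:yarn:
  variables:
    # Untested assumption: CACHE_REQUEST_TIMEOUT is in minutes and raises the
    # cache upload/download timeout (the default appears to be 10).
    CACHE_REQUEST_TIMEOUT: "45"
```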

Thank you for your help!

Hi @alex-kovshovik
The `Killed` message suggests the platform, in this case Kubernetes, killed the process (most likely the OOM killer, because the container exceeded its memory limit).
The cache is archived into a ZIP file first, so make sure you give the gitlab-runner pods enough CPU and memory in Kubernetes.
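If your runner’s config.toml allows resource overwrites, you can raise the limits per job. Note that the cache-archiver runs in the *helper* container, so its limits matter too, not just the build container’s. A sketch (variable names are from the Kubernetes executor docs; your runner admin must set the matching `*_overwrite_max_allowed` options in config.toml for these to take effect, and the exact values here are just examples):

```yaml
deps:yarn:
  variables:
    # Build container (runs yarn install):
    KUBERNETES_CPU_REQUEST: "2"
    KUBERNETES_CPU_LIMIT: "4"
    KUBERNETES_MEMORY_REQUEST: "2Gi"
    KUBERNETES_MEMORY_LIMIT: "4Gi"
    # Helper container (runs cache-archiver, so it needs headroom too):
    KUBERNETES_HELPER_CPU_LIMIT: "2"
    KUBERNETES_HELPER_MEMORY_REQUEST: "1Gi"
    KUBERNETES_HELPER_MEMORY_LIMIT: "2Gi"
```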

Hi @balonik
Hm, that’s a good point! I remember seeing this before; I’m going to increase the CPU and memory limits for this job. Good idea! I’ll update this thread once it’s solved.