Problem to solve
I’m considering adding an infrequent (once per day or less) CI job that needs an enormous amount of data. I could potentially SFTP it from an external store (S3, etc.) when the job starts, but I’m wondering whether it would fit in a cache file (which I believe is implemented as a .zip). A rough sketch of what I have in mind is below.
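This is just a sketch of the cache-based approach I’m considering; the job name, cache key, directory, and deploy script are placeholders:

```yaml
deploy_bulk_data:
  rules:
    # Only run from a pipeline schedule, i.e. once per day or less.
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
  cache:
    key: bulk-dataset      # fixed key so every scheduled run reuses the same cache
    paths:
      - dataset/           # ~200GB+ of files -- the part I'm not sure will fit
  script:
    - ./deploy.sh dataset/
```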
Steps to reproduce
We need to run a deployment job over 200GB+ of files. They’re in an externally hosted SVN repo (don’t ask…). The workload fits inside an XL runner, but of course I have to get the files onto the runner. For a number of reasons, re-cloning the entire SVN repo each time the job runs would be problematic; see the sketch after this paragraph for how I’d hope to avoid that.
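Roughly, the idea would be to keep the SVN working copy in the cache so later runs only need an incremental `svn update` instead of a full checkout. The repo URL, directory names, and runner tag here are placeholders/assumptions:

```yaml
deploy_from_svn:
  tags:
    - saas-linux-xlarge-amd64   # assuming this is the XL SaaS runner tag
  cache:
    key: svn-working-copy
    paths:
      - svn-data/
  script:
    # First run: full checkout. Later runs: incremental update of the cached working copy.
    - |
      if [ -d svn-data/.svn ]; then
        svn update svn-data
      else
        svn checkout https://svn.example.com/repo/trunk svn-data
      fi
    - ./deploy.sh svn-data/
```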
Configuration
Versions
- GitLab.com SaaS
- GitLab (Web: /help, or self-managed system information: sudo gitlab-rake gitlab:env:info):
- GitLab Runner, if self-hosted (Web: /admin/runners, or CLI: gitlab-runner --version):
Helpful resources
- The caching docs don’t say anything about size limits. They do mention a 30-day expiry window, which is fine for this use case.