Pre-Populate the GitLab CI Cache

I currently work on a few projects that require quite a bit of external data to be processed for the compilation.

The problem is that the required data is rather large: downloading it inside a job results in a SIGKILL being sent to the process because a timeout is exceeded. While I can control quite a bit in the environment, certain timeouts are preset and cannot/must not be changed.
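For context, a stripped-down sketch of what the setup currently looks like (job names, paths, and the download URL are placeholders, not my actual config):

```yaml
# Hypothetical, simplified version of the current setup.
# The download in the prepare job is what exceeds the preset timeout and gets SIGKILLed.
stages:
  - prepare
  - build

fetch-data:
  stage: prepare
  cache:
    key: external-data
    paths:
      - data/
    policy: pull-push   # populate the cache so later jobs can just pull it
  script:
    - mkdir -p data
    - curl -fL -o data/dataset.tar.gz https://example.com/large-dataset.tar.gz  # placeholder URL
    - tar -xzf data/dataset.tar.gz -C data

build:
  stage: build
  cache:
    key: external-data
    paths:
      - data/
    policy: pull        # only read the cache, never upload it
  script:
    - ./build.sh data/  # placeholder build step
```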

Now I am wondering whether it is somehow possible to populate the GitLab cache directly, e.g. via the API, so that the data is available to a pipeline without having a job do the setup. At least in the openapi.yaml [1] file I have not found anything for this yet.

Is there potentially another way a “normal” user could do this?

Cheers

[1] doc/api/openapi/openapi.yaml · master · GitLab.org / GitLab · GitLab

The CI cache is not stored on the GitLab instance, but on the Runner itself or in a distributed cache (S3, GCS, Azure).

The possible solutions depend on what kind of Runner you are using, but in general, if you have huge external dependencies, just build your own container image with the dependencies included and use that for your jobs.
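A minimal sketch of what that could look like on the job side, assuming the image with the data baked in has been pushed to a registry the Runner can pull from (image name and data path are placeholders):

```yaml
build:
  stage: build
  # Hypothetical image name; the image would be built separately with the external
  # data already baked in, so the job needs neither a download nor the cache.
  image: registry.example.com/my-group/my-project/build-env-with-data:latest
  script:
    - ./build.sh /opt/external-data   # placeholder path where the data lives inside the image
```

The image itself could then be rebuilt from a separate, scheduled pipeline or outside of CI entirely, so the long download only has to happen when the external data actually changes.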

I had looked into building an image and wanted to avoid it, but it looks like this is the only feasible solution for now.

I highly doubt that I will get access to the S3 bucket backing the GitLab cache to push my data there ^^