I currently work on a few projects that require quite a bit of external data to be processed during compilation.
The problem is that this data is rather large: downloading it takes long enough that a timeout triggers a SIGKILL to the process. While I can control quite a bit of the environment, certain timeouts are preset and cannot/must not be changed.
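For illustration, this is roughly the kind of setup job I would like to avoid; the job name, cache key, and `$DATA_URL` are placeholders, not my actual configuration:

```yaml
# Hypothetical cache-populating job; names and URL are placeholders.
populate-cache:
  stage: .pre
  cache:
    key: external-data
    paths:
      - external-data/
  script:
    - mkdir -p external-data
    # This download runs longer than the (fixed) timeout, so the
    # runner kills the job before the cache is ever written.
    - curl --fail --location --output external-data/dataset.tar.gz "$DATA_URL"
    - tar -xzf external-data/dataset.tar.gz -C external-data/
```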
Now I was wondering whether it is somehow possible to populate the GitLab cache directly, e.g. via the API, so that the data is available to a pipeline without a job having to do the setup. At least in the openapi.yaml [1] file I have not found anything for this yet.
Is there potentially another way a “normal” user could do this?
Cheers
[1] https://gitlab.com/gitlab-org/gitlab/-/blob/master/doc/api/openapi/openapi.yaml