I have a number of repositories whose compilation produces artifacts. I want to achieve the following behaviour: after each build job, I want to rebuild an archive containing all the individual artifacts.
At the moment, the best I can think of is an additional job, triggered manually, that fetches the latest artifacts from all relevant repos, rebuilds the archive, and pushes it somewhere. What makes me unhappy about this solution is the “trigger manually” part. Is there some way to configure the runner to do that work automatically after every job?
I would maybe use a notify timeline here, something like:
- There’s a meta job which counts the repositories (e.g. via API) and puts that as a count file into your artifacts storage
- The first job puts the artifacts somewhere, e.g. a directory for this repository on an S3 cache
- The second job does the same
- Another repo, another job - directory for the repo in S3
- The last job of each pipeline adds a completed token which includes a timestamp
- Each of these last jobs reads the count file and compares it against the number of completed tokens. If they match, it builds the archive and triggers whatever further actions you want.
If you know the repository count in advance, you could also hardcode it in the script run by the last job of each pipeline.
A little complicated, but that’s the first thing that comes to mind. On older systems, we’ve built backup jobs in the same way.
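The last-job step described above could be sketched roughly as follows. This is only an illustration, not a tested setup: a local shared directory stands in for the S3 cache (with S3 you would use `aws s3 cp`/`aws s3 ls` instead), and all paths, the `SHARED_STORE` variable, and the fallback count of 1 are assumptions for the sketch.

```shell
#!/bin/sh
# Sketch of the "last job" script run at the end of each pipeline.
# A local directory stands in for the S3 cache; names are illustrative.
set -eu

SHARED="${SHARED_STORE:-/tmp/artifact-store}"   # stand-in for the S3 cache
REPO="${CI_PROJECT_NAME:-demo-repo}"            # set by GitLab CI in a real run

mkdir -p "$SHARED/tokens" "$SHARED/artifacts/$REPO"

# 1. Publish this repository's artifacts into its own directory.
cp -r build/. "$SHARED/artifacts/$REPO/" 2>/dev/null || true

# 2. Drop a completed token that includes a timestamp.
date -u +%Y-%m-%dT%H:%M:%SZ > "$SHARED/tokens/$REPO.token"

# 3. Compare completed tokens against the count written by the meta job
#    (fall back to 1 here only so the sketch runs standalone).
expected=$(cat "$SHARED/count" 2>/dev/null || echo 1)
actual=$(ls "$SHARED/tokens" | wc -l)

if [ "$actual" -ge "$expected" ]; then
  # All pipelines have finished: rebuild the combined archive.
  tar -czf "$SHARED/all-artifacts.tar.gz" -C "$SHARED" artifacts
  echo "archive rebuilt"
else
  echo "waiting: $actual of $expected pipelines done"
fi
```

In a real setup you would also want to clear the tokens directory after building the archive, so the next round of pipelines starts from zero.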
So in simple words, you’re suggesting that I place archive-building code in the last job of every pipeline. I had thought of that; it seems a bit tedious, albeit doable.
That’s what came to my mind when reading the topic. Currently I don’t know of any “group specific” CI action triggered when a certain condition is reached. But that would likely be an interesting feature request; mind opening one at https://gitlab.com/gitlab-org/gitlab/issues ?