Dependent Job Execution

I’m trying to evaluate whether GitLab CI is suitable for use within our organization. One of the peculiarities of our builds is that the tasks to be run are computed during the build itself. To clarify what I mean, let me give a fictitious example:

  1. Build some program (takes maybe half an hour).
  2. Run that program (takes another half hour).
  3. Based on the output of that program, extract a list of what needs to be done next.
  4. For each item in the list, run another task (each takes 1-6 hours), preferably in parallel.
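To make the shape of this concrete, the first three steps could be sketched as ordinary pipeline stages in a .gitlab-ci.yml. This is only an illustration, not a working build: the script names (build.sh, extract-tasks.sh) and file paths are placeholders I made up.

```yaml
# Hypothetical sketch of steps 1-3 as ordinary pipeline stages.
# Script names and paths are placeholders.
stages:
  - build
  - run
  - plan

build_program:
  stage: build
  script: ./build.sh                    # step 1: ~30 min
  artifacts:
    paths: [out/program]

run_program:
  stage: run
  script: ./out/program > output.log    # step 2: another ~30 min
  artifacts:
    paths: [output.log]

extract_tasks:
  stage: plan
  script: ./extract-tasks.sh output.log > tasks.txt   # step 3
  artifacts:
    paths: [tasks.txt]
```

Up to here everything is static; it’s step 4, where tasks.txt has to become actual jobs, that the yaml can’t express.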

The problem here is that I don’t know a priori which jobs will happen in step 4. I need to complete step 3 and analyze the output before I know what to do. This is not simply a case of turning jobs on or off based on an expression; step 3 is essentially creating the jobs.

You might say that I should just update the yaml file to reflect the new output whenever the program changes. Unfortunately, the ways the program’s output changes are non-obvious to developers, so really I just have to run it.

From what I’ve seen of the GitLab CI API docs, the closest fit seems to be running steps 1-3 in one pipeline, then creating a new yaml file in a proxy repository and triggering a new pipeline pointing at that yaml file. This could work, but in the real scenario this happens in quite a few places in the build, so I’m hoping there’s a cleaner way to do it.
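The proxy-repository workaround could look something like the sketch below: a job at the end of the first pipeline generates a yaml file, pushes it to the proxy project, and fires a pipeline trigger via the API. The trigger endpoint (`POST /projects/:id/trigger/pipeline` with `token` and `ref`) is real; everything else here — generate-ci-yaml.sh, PROXY_REPO_URL, TRIGGER_TOKEN, and the project id placeholder — is a made-up name for illustration.

```yaml
# Hypothetical sketch of the proxy-repo workaround.
# generate-ci-yaml.sh, PROXY_REPO_URL, TRIGGER_TOKEN and the
# project id are placeholders, not real names.
spawn_dynamic_jobs:
  stage: plan
  script:
    # Emit one job definition per line of tasks.txt (step 3's output)
    - ./generate-ci-yaml.sh tasks.txt > generated-ci.yml
    # Commit the generated yaml as the proxy repo's .gitlab-ci.yml
    - git clone "$PROXY_REPO_URL" proxy
    - cp generated-ci.yml proxy/.gitlab-ci.yml
    - cd proxy && git commit -am "Generated jobs" && git push
    # Fire a pipeline on the proxy project via the triggers API
    - curl --fail -X POST
        -F "token=$TRIGGER_TOKEN"
        -F "ref=master"
        "https://gitlab.example.com/api/v4/projects/<proxy-project-id>/trigger/pipeline"
```

The ugly part is exactly what I complained about: every place in the build where jobs are computed needs its own proxy repo and trigger plumbing.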

Is this something I could add by developing a GitLab CI plugin? I.e., could I modify my own running pipeline to add jobs to it dynamically?

Never mind. The more I read about GitLab, the more I realize it’s not going to work for us, except maybe as a Jenkins-style GUI.

I haven’t used it, but it sounds like you could use GitLab CI triggers:

Once your program has computed which jobs you want to execute, you could trigger them to run. There might be better alternatives for this task, though.
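As a rough sketch of what I mean: the triggers API accepts `variables[KEY]=value` form fields, so you could fire one triggered pipeline per computed task and pass the task name in as a variable. The loop below is only an illustration — TRIGGER_TOKEN, the project id, tasks.txt, and run-task.sh are assumed names, not something from your build.

```yaml
# Hypothetical sketch: one triggered pipeline per computed task.
# TRIGGER_TOKEN, the project id, tasks.txt and run-task.sh are
# placeholders.
run_computed_tasks:
  stage: plan
  script:
    - |
      while read -r task; do
        curl --fail -X POST \
          -F "token=$TRIGGER_TOKEN" \
          -F "ref=master" \
          -F "variables[TASK]=$task" \
          "https://gitlab.example.com/api/v4/projects/<project-id>/trigger/pipeline"
      done < tasks.txt

# In the triggered project's .gitlab-ci.yml, a job reads the variable:
task_job:
  script: ./run-task.sh "$TASK"   # each instance takes 1-6 hours
```

Each trigger starts an independent pipeline, so the per-task jobs would naturally run in parallel, subject to available runners.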