Optional jobs with dynamic job names for needs

Hi

I’m currently trying to implement pipelines with several optional jobs, and I’m not entirely sure what the best approach to this problem is.

The general requirement for the pipeline is as follows:

 [build FOOBAR] -> [optional job 1] -> [optional job 2] -> [optional job n] -> [final job]  

Given the above pipeline, a user should be able to decide whether an optional job runs by setting an environment variable before triggering the pipeline. All jobs produce artifacts that should be passed on to the next job.
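For context, the plan is to start the pipeline through the pipeline trigger API and pass the job selection as a variable, roughly like this (the token, ref and project ID are placeholders; the variable value just has to match the regexes shown further below):

curl --request POST \
  --form "token=$TRIGGER_TOKEN" \
  --form "ref=main" \
  --form "variables[USER_SUPPLIED_VARIABLE]=optional job 1,optional job 2" \
  "https://gitlab.example.com/api/v4/projects/<project-id>/trigger/pipeline"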

In order to keep the YAML files modular, I am using triggers with includes. However, I am currently not able to set the actual job name dynamically.

The following YAML is used in the repository I actually want to build:

stages: 
  - build
  - postbuild

build FOOBAR:
  stage: build
  script:
    - make
  artifacts:
    paths: ["artifacts/*"]

post-build-job:
  stage: postbuild
  trigger:
    include:
      - project: 'group/pipeline-master'
        file: 'main.yml'
    strategy: depend  
  variables:
    PARENT_PIPELINE_ID: $CI_PIPELINE_ID

The following is main.yml from group/pipeline-master:

optional job 1:
  stage: postbuild
  only:
    variables:
      - $PARENT_STAGE == 'postbuild' && $USER_SUPPLIED_VARIABLE =~ /.*optional job 1.*/
  trigger:
    include:
      - project: 'anothergroup/job1'
        file: 'other.yml'
    strategy: depend
  variables:
    PARENT_PIPELINE_ID: $PARENT_PIPELINE_ID

optional job 2:
  stage: postbuild
  only:
    variables:
      - $PARENT_STAGE == 'postbuild' && $USER_SUPPLIED_VARIABLE =~ /.*optional job 2.*/
  trigger:
    include:
      - project: 'somegroup/job2'
        file: 'job2.yml'
    strategy: depend
  variables:
    PARENT_PIPELINE_ID: $PARENT_PIPELINE_ID
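As a side note, I assume the same conditions could also be expressed with rules instead of the deprecated only/except, e.g. for the first job (untested sketch):

optional job 1:
  stage: postbuild
  rules:
    - if: '$PARENT_STAGE == "postbuild" && $USER_SUPPLIED_VARIABLE =~ /.*optional job 1.*/'
  trigger:
    include:
      - project: 'anothergroup/job1'
        file: 'other.yml'
    strategy: depend
  variables:
    PARENT_PIPELINE_ID: $PARENT_PIPELINE_ID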

So far everything works fine. The problem now is determining the value of the job keyword in the following YAML, since it depends on which optional job ran before:

optional or final job:
  stage: test
  script:
    - echo "same problem exists for all optional jobs"
  needs:
    - pipeline: $PARENT_PIPELINE_ID
      job: ???
      artifacts: true
  artifacts:
    paths: ["artifacts/*"]
    expire_in: 1 week

Since every stage needs to process the previous stage's artifacts, it would probably be possible to do everything within each script block by querying the GitLab API and downloading the artifacts of the previous job. Another solution might be child pipelines with dynamically generated configurations. However, I would really like to avoid both and solve the problem within the actual YAML.
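Just to illustrate the API workaround I would like to avoid: each job could look up the parent pipeline's jobs and fetch the artifacts itself, along these lines (a rough sketch; $API_TOKEN, the job-name filter, and the assumption that the parent pipeline runs in the same project are mine):

optional or final job:
  stage: test
  script:
    # find the id of the last matching job in the parent pipeline
    - >
      JOB_ID=$(curl --silent --header "PRIVATE-TOKEN: $API_TOKEN"
      "$CI_API_V4_URL/projects/$CI_PROJECT_ID/pipelines/$PARENT_PIPELINE_ID/jobs"
      | jq -r '.[] | select(.name | test("optional job")) | .id' | tail -n 1)
    # download and unpack that job's artifacts
    - curl --silent --header "PRIVATE-TOKEN: $API_TOKEN" --output artifacts.zip "$CI_API_V4_URL/projects/$CI_PROJECT_ID/jobs/$JOB_ID/artifacts"
    - unzip artifacts.zip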

I’d really appreciate any kind of input!