Using user-defined environment variables with dynamic environments or multi-project pipelines

I had the same issue, but I solved it using dynamically generated child pipelines that are created at runtime.

That means you can create a new dynamic-gitlab-ci.yml at runtime, store it as an artifact, and trigger it in the next job. This video from GitLab explains the approach.

For a more in-depth walkthrough, see this article on Medium.

Consider the following .gitlab-ci.yml:

default:
  image: node:16

stages:
  - build
  - trigger

build:
  stage: build
  script:
    - node ./create-pipeline.js # this line generates our pipeline
  artifacts:
    expire_in: 1 week
    paths:
      - dist/
      - dynamic-gitlab-ci.yml

trigger:deploy:
  stage: trigger
  needs:
    - build
  trigger:
    include:
      - artifact: dynamic-gitlab-ci.yml # this file is generated on runtime in the build stage
        job: build
    strategy: depend
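
Note that `strategy: depend` makes the trigger:deploy job wait for the generated child pipeline and mirror its status, so a failed deployment also fails the parent pipeline; without it, the trigger job succeeds as soon as the child pipeline is created.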

In the build job, a new pipeline file called dynamic-gitlab-ci.yml is written to disk and saved as an artifact. This file is then triggered in the trigger:deploy job.

The build job runs a custom script, create-pipeline.js, which generates the new pipeline file. Because the file is created at runtime, it can contain runtime values; even the job name can be a variable:

# ${environment} is a placeholder that the generator script
# interpolates before the file is written to disk
deploy:${environment}:
  stage: deploy
  image: docker
  environment:
    name: ${environment}
  variables:
    ENVIRONMENT: ${environment}
  services:
    - name: docker:dind
      alias: docker
  before_script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
  script:
    - echo 'deploy to ${environment} 🚀🚀!'
    - docker build --build-arg ENVIRONMENT=${environment} --pull --cache-from $CI_REGISTRY_IMAGE --tag $CI_REGISTRY_IMAGE/${environment} .
    - docker push $CI_REGISTRY_IMAGE/${environment}
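
For reference, a minimal create-pipeline.js could look like the sketch below. This is an illustrative assumption, not the exact script from the linked repository; in particular, deriving the environment name from CI_COMMIT_REF_SLUG is just one possible choice:

const fs = require('fs');

// The environment name is derived at runtime. Deriving it from the
// branch name via CI_COMMIT_REF_SLUG is an assumption for this sketch;
// any runtime logic works here.
const ref = process.env.CI_COMMIT_REF_SLUG || 'local';
const environment = ref === 'main' ? 'production' : ref;

// JS template-literal interpolation fills in ${environment} everywhere,
// including the job name. Shell variables like $CI_REGISTRY stay
// untouched because they lack the ${...} syntax.
const pipeline = `
stages:
  - deploy

deploy:${environment}:
  stage: deploy
  image: docker
  environment:
    name: ${environment}
  variables:
    ENVIRONMENT: ${environment}
  services:
    - name: docker:dind
      alias: docker
  before_script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
  script:
    - echo "deploy to ${environment} 🚀🚀!"
    - docker build --build-arg ENVIRONMENT=${environment} --pull --cache-from $CI_REGISTRY_IMAGE --tag $CI_REGISTRY_IMAGE/${environment} .
    - docker push $CI_REGISTRY_IMAGE/${environment}
`;

// Write the file where the build job's artifacts config expects it.
fs.writeFileSync('dynamic-gitlab-ci.yml', pipeline.trimStart());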

See this repository for an implementation.