Using user-defined environment variables with dynamic environments or multi-project pipelines

Hi,

I want to use environment variables that are dynamically defined by a pipeline job and shared between jobs using artifacts:reports:dotenv. This works fine: I can see and use those variables in script: sections. However, I have noticed that they cannot be used with some other keywords, like environment: (from the Environments feature) and trigger: (from multi-project pipelines). When I use one of the predefined CI variables, those two features work as expected. For example, this environment declaration:

deploy_app:
  stage: deploy
  environment:
    name: feature/${CI_COMMIT_REF_NAME}
    url: https://${CI_ENVIRONMENT_SLUG}-app.domain.com
    on_stop: stop_app

works, but if I use user-defined variables, it doesn't:

deploy_app:
  stage: deploy
  environment:
    name: aws/${APP_ENVIRONMENT}
    url: ${APP_URL}
    on_stop: stop_app

So it seems as if user-defined environment variables are not visible to those keywords. Is this expected? If so, is there a way to work around it?
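
For reference, here is a minimal producer/consumer pair (job and variable names are made up for illustration) showing the dotenv mechanism that does work in script: sections:

build_app:
  stage: build
  script:
    # Write the variable to a dotenv file at runtime
    - echo "APP_ENVIRONMENT=staging" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy_app:
  stage: deploy
  needs:
    - build_app
  script:
    # The variable is visible here, but apparently not in environment: or trigger:
    - echo "Deploying to ${APP_ENVIRONMENT}"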

We are using hosted gitlab.com with our own runners (version 13.12.0).

Thanks in advance.

I did some testing and could not get the environment name read from a dotenv file. It appears that only the URL can be set that way.

stages:
- stage01

job01:
  stage: stage01
  script:
    - echo "DYNAMIC_ENV_URL=http://dynamictestenvironment1" >> deploy.env
  environment:
    # Name cannot be dynamically set from file
    name: $CI_PIPELINE_ID # can use CI + other vars
    # URL can be dynamically set from file
    url: $DYNAMIC_ENV_URL
  artifacts:
    reports:
      dotenv: deploy.env

I have this same issue. I am trying to configure a multi-project pipeline to use a variable generated by the parent pipeline (which gets put in a dotenv report artifact) to name the environment that gets created in the child repo.

It seems that this isn't possible, as noted above, but I'm going to try to find a way to work around it. Maybe using the GitLab API or something similar might work?

If anyone here has found a workaround for this, please post it. If I find a way to accomplish this, I will post it here.
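
One hedged possibility (untested, and the token variable name here is made up) would be to create the environment from a script: section via the Environments API, since dotenv variables are already expanded at that point:

create_environment:
  stage: deploy
  script:
    # APP_ENVIRONMENT/APP_URL come from an upstream dotenv report; API_TOKEN is a
    # hypothetical CI/CD variable holding a token with api scope.
    - |
      curl --request POST \
        --header "PRIVATE-TOKEN: ${API_TOKEN}" \
        --data "name=aws/${APP_ENVIRONMENT}" \
        --data "external_url=${APP_URL}" \
        "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/environments"

Note that an environment created this way is not linked to the job the way environment: is, so on_stop and deployment tracking would not apply.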

I had the same issue, but I solved it using dynamically generated child pipelines that are created at runtime.

That means you can generate a new pipeline file at runtime, store it as an artifact, and trigger it in the next job. This video from GitLab explains the approach.

For a more in-depth write-up, see this article on Medium.

Consider the following .gitlab-ci.yml:

default:
  image: node:16

stages:
  - build
  - trigger

build:
  stage: build
  script: 
    - node ./create-pipeline.js # this line generates our pipeline
  artifacts:
    expire_in: 1 week
    paths:
      - dist/
      - dynamic-gitlab-ci.yml

trigger:deploy:
  stage: trigger
  needs:
    - build
  trigger:
    include:
      - artifact: dynamic-gitlab-ci.yml # this file is generated on runtime in the build stage
        job: build
    strategy: depend

In the build job, a new pipeline file called dynamic-gitlab-ci.yml is written to disk. This file is then triggered in the trigger:deploy job.

The build job runs a custom script that generates the new pipeline file. The generated file contains runtime variables; even the job name can now be a variable:

deploy:${environment}:
  stage: deploy
  image: docker
  environment: 
    name: ${environment}
  variables:
    ENVIRONMENT: ${environment}
  services:
    - name: docker:dind
      alias: docker
  before_script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
  script:
    - echo 'deploy to ${environment} 🚀🚀!'
    - docker build --build-arg ENVIRONMENT=${environment} --pull --cache-from $CI_REGISTRY_IMAGE --tag $CI_REGISTRY_IMAGE/${environment} .
    - docker push $CI_REGISTRY_IMAGE/${environment}
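
The thread doesn't include create-pipeline.js itself, so here is a minimal sketch of the idea (how the environment name is derived is an assumption here; it could equally come from a file or an API call):

// create-pipeline.js -- hypothetical sketch of the generator script
const fs = require('fs');

// Derive the environment name at runtime; this particular rule is made up.
const environment =
  process.env.CI_COMMIT_REF_NAME === 'main' ? 'production' : 'staging';

// Interpolate the runtime value into the pipeline template and write it to
// disk, where the trigger:deploy job picks it up as an artifact.
const pipeline = `
deploy:${environment}:
  stage: deploy
  image: alpine
  environment:
    name: ${environment}
  script:
    - echo "deploy to ${environment}"
`;

fs.writeFileSync('dynamic-gitlab-ci.yml', pipeline);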

See this repository for an implementation.