GitLab CI: How do I use an environment variable from one stage as the needs:project ref in another?

I have two jobs in the same project: job A and job B.

job A creates an environment variable EXTERNAL_PROJ_REF=some_tag and exports it through a .env file.

job B needs to download artifacts from an external project (external_project) and package them with other artifacts from the current project. I want to be able to dynamically choose the commit reference from which these external artifacts get downloaded. I am trying to use the environment variable EXTERNAL_PROJ_REF as the ref for external_project needed by job B.

job A:
  stage: build
  script:
    - echo "EXTERNAL_PROJ_REF=`./generate_variable.sh`" > build.env  # evaluates to EXTERNAL_PROJ_REF=some_tag
  artifacts:
    reports:
      dotenv: build.env

job B:
  stage: package
  script:
    - ./do_packaging_job.sh
  needs:
    - project: external_project
      ref: $EXTERNAL_PROJ_REF
      job: external_job
      artifacts: true

When I run this pipeline though, job B instantly fails with the following error:

This job depends on other jobs with expired/erased artifacts:

If I hardcode ref to some_tag, the job does not fail, and I can confirm the EXTERNAL_PROJ_REF is successfully passed to job B.

job B:
  stage: package
  script:
    - echo "Ref = $EXTERNAL_PROJ_REF"  # Correctly prints "Ref = some_tag"
    - ./do_packaging_job.sh
  needs:
    - project: external_project
      ref: some_tag
      job: external_job
      artifacts: true

However, when I use ref: $EXTERNAL_PROJ_REF, the pipeline fails. Can somebody tell me if I'm missing something?

I think your first job should use paths, not reports:

job A:
  stage: build
  script:
    - echo "EXTERNAL_PROJ_REF=`./generate_variable.sh`" > build.env  # evaluates to EXTERNAL_PROJ_REF=some_tag
  artifacts:
    paths:
      - build.env

reports are for when you want the results of a job (for example, a set of unit tests) to be available via the UI in the merge request or pipeline pages on GitLab. paths are for when you want a file to be passed down the pipeline to subsequent jobs.
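
For example, here is a rough sketch of how a later job could read the value back in, since paths only hands the file itself to later jobs, not the variable (job C is just a placeholder name, and this assumes a Bash-based image):

job C:
  stage: package
  script:
    # build.env is just a file here, so the variable has to be read back in explicitly
    - source build.env
    - echo "Ref = ${EXTERNAL_PROJ_REF}"
  needs:
    - job: job A
      artifacts: true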

Thanks for the suggestion, @snim2. It didn't work though; the problem still persists.

Can you download the artifact?

Yes, the artifacts get downloaded, but the environment variable still seems to be unavailable at the time I need it.

I’m not sure if I’ve understood you correctly. Is this what you’re aiming for?

No, not quite. What you’re showing me works fine. The variable is available in the script: section of job B.

What I would like is for the variable to be available in the needs: section of job B (which seems to be evaluated before the script: section). I would like to use the environment variable to tell job B which artifacts to download from an external project.

So:

job B:
  stage: package
  script:
    - whatever command(s) here
  # Artifacts from the needs section get downloaded before above script is run
  needs:
    - job: job A
      artifacts: true
      # Below is an external project from which I need artifacts to run this job
    - project: external_project
      ref: $ENVIRONMENT_VARIABLE  # This should specify what tag/commit to download artifacts from
      job: external_job
      artifacts: true

Ah, right! I really think that it’s highly unlikely that that would be possible.

The more usual way to do this is with a multi-project pipeline: the “external” project in your example would run its pipeline first, then trigger job B, which would receive the right artifacts automatically.
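
To sketch what I mean (the project path, job names and stage names below are placeholders, not taken from your setup), the external project would build its artifacts first and then trigger your project's pipeline, passing its own ref along as a variable:

# .gitlab-ci.yml in external_project (rough sketch only)
external_job:
  stage: build
  script:
    - ./build_external_artifacts.sh
  artifacts:
    paths:
      - output/

trigger_downstream:
  stage: deploy
  variables:
    UPSTREAM_REF: $CI_COMMIT_REF_NAME   # passed on to the downstream pipeline
  trigger:
    project: your-group/your-project    # the project that contains job B
    branch: main

Because $UPSTREAM_REF would arrive as a pipeline-level variable rather than as a dotenv variable created mid-pipeline, job B might then be able to use it in its configuration, although I haven't tested that.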

Is that a possibility in your situation?

I’ll see if I can restructure the pipeline(s) to make this possible

I finally realized GitLab does not support what I want to do, at least not this way. According to this link, a variable passed from a different job can only be used in the before_script, script, or after_script sections of a job; it cannot be used to configure jobs. So I cannot use it in the needs: section of job B.

Luckily, I found a simple workaround using the GitLab API. I have API access to external_project, so I just use wget to download the artifact I need from the dynamically selected commit reference. Afterwards, I pass the artifact directly to job B.

job A:
  stage: build
  script:
    # Dynamically create commit reference
    - EXTERNAL_PROJ_REF=`./generate_commit_ref.sh`
    # Download artifact with Gitlab API
    - wget --header "PRIVATE-TOKEN:${EXTERNAL_PROJ_TOKEN}" --output-document outputFileName.file "${CI_API_V4_URL}/projects/${EXTERNAL_PROJ_ID}/jobs/artifacts/${EXTERNAL_PROJ_REF}/raw/${EXTERNAL_ARTIFACT_PATH}?job=${EXTERNAL_JOB_NAME}"
    # CI_API_V4_URL is a predefined GitLab variable
  artifacts:
    paths:
      - outputFileName.file

job B:
  stage: package
  script:
    - ./do_packaging_job.sh
  needs:
    # Now this packaging job only needs job A. It doesn't need the external job anymore
    - job: job A
      artifacts: true

Ah right. The other option is to dump the variable to a file in the first job, then read it back in the next job.

Hi,

I’m following the example provided here:

and I want to pass a variable from one job to another.

If I use the following:

job_a:
  stage: setup
  image:
    name: docker_image_name
  tags:
    - docker
  script:
    - export ALERT_ID=12345
    - echo ${ALERT_ID}
    - echo "${ALERT_ID}" > build.env
  artifacts:
    paths:
      - build.env

it works fine and the value '12345' is saved in the build.env file.

If I want, however, to pass a variable like this:

job_a:
  stage: setup
  image:
    name: docker_image_name
  variables:
  tags:
    - docker
  script:
    - export ALERT_ID=$(python -c 'python_code_here')
    - echo ${ALERT_ID}
    - echo "${ALERT_ID}" > build.env
  artifacts:
    paths:
      - build.env

ALERT_ID is evaluated and shown correctly in the pipeline by the line "echo ${ALERT_ID}", but the build.env file is blank and the artifacts are not saved.

Regards,
Elena

Hi @Elena

As you’ll see from this thread, you can do this manually, as in my example code, or with dotenv reports if you just have a simple pipeline.

I tried out both methods today:

stages:
    - stage1
    - stage2

stage1:
    stage: stage1
    image: python:3.8-slim
    script:
        - export ALERT_ID=$(python3 -c "import random; print(random.randint(0, 9999))")
        - echo "${ALERT_ID}"
        - echo "${ALERT_ID}" >build.env
        - echo "ALERT_ID=${ALERT_ID}" >dotenv.env
    artifacts:
        paths:
            - build.env
        reports:
            dotenv: dotenv.env

stage2cat:
    stage: stage2
    image: python:3.8-slim
    script:
        - ALERT_ID="$(cat build.env)"
        - echo "${ALERT_ID}"

stage2dotenv:
    stage: stage2
    image: python:3.8-slim
    script:
        - echo "${ALERT_ID}"

And both worked. The log from stage2cat said:

...
$ ALERT_ID="$(cat build.env)"
$ echo "${ALERT_ID}"
2081
Cleaning up project directory and file based variables
Job succeeded

And stage2dotenv:

$ echo "${ALERT_ID}"
2081
Cleaning up project directory and file based variables
Job succeeded

I notice that whilst passing variables between jobs like this is a free tier feature in simple pipelines, in multi-project pipelines it's a Premium tier feature. I'm not sure if that is relevant to you.

Thank you for the detailed example!

It's very strange. I just tried exactly the same code as yours and again the artifacts are empty :frowning:
Here is the output of the job:


What might be the difference in my case? I'm using GitLab Enterprise Edition 14.7.7-ee.

When I first ran your sample, I received this error:

WARNING: dotenv.env: no matching files
ERROR: No files to upload

I suppose the problem may come from incorrect creation of, or permissions on, the build.env/dotenv.env files.
Should I create them myself, and if so, where?

Are you using your own runners or GitLab runners? And did you use the python:3.8-slim image?

Hi,
We are using a custom runner.
I just tried your example and it works with the 'python:3.8-slim' image, but not with my Docker image.

I'll continue testing, but I'm not sure what might be causing this problem.

Regards,
Elena

I just tried my custom image and custom runner and again received the error "No files to upload".

Where should these files (build.env/dotenv.env) be located? Should I create them manually?

Those files are created in the pipeline, in the lines:

        - echo "${ALERT_ID}" >build.env
        - echo "ALERT_ID=${ALERT_ID}" >dotenv.env

You don't really want to create them manually. If your YAML is exactly the same as mine, except for the image: ... declarations, then these jobs should work.

If they don't, I would suggest running your Docker image locally, going through the steps in the stage1 job manually, and checking that each one works correctly. Anything in the script of a job is usually a Bash command, but maybe there is something in the way your Docker container is set up that is different to the python:3.8-slim image?
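
If running the image locally is awkward, another option is to add a couple of throwaway debug lines to the job itself, something like this (the image name is a placeholder for yours):

stage1:
    stage: stage1
    image: your-custom-image   # placeholder: swap in your own image
    script:
        - pwd                  # check which directory the job starts in
        - export ALERT_ID=$(python3 -c "import random; print(random.randint(0, 9999))")
        - echo "${ALERT_ID}" >build.env
        - echo "ALERT_ID=${ALERT_ID}" >dotenv.env
        - ls -la build.env dotenv.env   # confirm the files exist before artifacts are collected
    artifacts:
        paths:
            - build.env
        reports:
            dotenv: dotenv.env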

Thank you very much for the support!

I got it working. The problem was that before executing the line echo "${ALERT_ID}" > build.env, I was in a nested directory of the Docker image. Once I changed back to the project root and ran that line again, it worked.
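
In other words, the working version looks roughly like this (the image name and nested directory below are placeholders): artifacts:paths are collected relative to ${CI_PROJECT_DIR}, so the file has to end up there:

job_a:
  stage: setup
  image:
    name: docker_image_name        # placeholder custom image
  tags:
    - docker
  script:
    - cd some/nested/directory     # this was what broke the artifact upload
    - export ALERT_ID=12345
    # write to the project root so artifacts:paths can find the file
    - echo "${ALERT_ID}" > "${CI_PROJECT_DIR}/build.env"
  artifacts:
    paths:
      - build.env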