I have two jobs in the same project: job A and job B.
job A creates an environment variable EXTERNAL_PROJ_REF=some_tag and exports it through a .env file.
job B needs to download artifacts from an external_project and package them with other artifacts from the current project. I want to dynamically choose the commit reference from which these external artifacts get downloaded, so I am trying to use the environment variable EXTERNAL_PROJ_REF as the ref for the external_project artifacts that job B needs.
reports are for when you want the results of a job (for example, a set of unit tests) to be available via the UI in the merge request or pipeline pages on GitLab. paths are for when you want a file to be passed down the pipeline to subsequent jobs.
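As a minimal sketch of the difference (file name assumed, using the dotenv report since you are exporting through a .env file):

job A:
  stage: build
  script:
    - echo "EXTERNAL_PROJ_REF=some_tag" > build.env
  artifacts:
    # reports:dotenv makes EXTERNAL_PROJ_REF available as a variable in later jobs
    reports:
      dotenv: build.env
    # paths would instead hand build.env down the pipeline as a plain file
    paths:
      - build.env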
No, not quite. What you’re showing me works fine. The variable is available in the script: section of job B.
What I would like is for the variable to be available in the needs: section of job B (which seems to get evaluated before the script: section). I want to use the environment variable to tell job B which artifacts to download from an external project.
So:
job B:
  stage: package
  script:
    - whatever command(s) here
  # Artifacts from the needs section get downloaded before the above script is run
  needs:
    - job: job A
      artifacts: true
    # Below is an external project from which I need artifacts to run this job
    - project: external_project
      ref: $ENVIRONMENT_VARIABLE # This should specify what tag/commit to download artifacts from
      job: external_job
      artifacts: true
Ah, right! I think it’s highly unlikely that that would be possible.
The more usual way to do this is with a multi-project pipeline: the “external” project in your example would run its pipeline first, then trigger your pipeline, and job B would receive the right artifacts automatically.
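For example, a minimal sketch of the trigger side (the downstream project path is a placeholder):

# In external_project's .gitlab-ci.yml
trigger_downstream:
  stage: deploy
  trigger:
    project: your-group/your-project  # hypothetical path of the project containing job B
    branch: main
  variables:
    EXTERNAL_PROJ_REF: $CI_COMMIT_REF_NAME  # passed down to the triggered pipeline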
I finally realized GitLab does not support what I want to do, at least not this way. According to this link, a variable passed from a different job can only be used in the before_script, script or after_script sections of a job; it cannot be used to configure jobs, so I cannot use it in the needs: section of job B.
Luckily, I found a simple workaround using the GitLab API. I have API access to external_project, so I just use wget to download the artifact I need from the dynamically selected commit reference. Afterwards, I pass the artifact directly to job B.
job A:
  stage: build
  script:
    # Dynamically create the commit reference
    - EXTERNAL_PROJ_REF=$(./generate_commit_ref.sh)
    # Download the artifact with the GitLab API
    # (btw, CI_API_V4_URL is a GitLab predefined variable)
    - wget --header "PRIVATE-TOKEN:${EXTERNAL_PROJ_TOKEN}" --output-document outputFileName.file "${CI_API_V4_URL}/projects/${EXTERNAL_PROJ_ID}/jobs/artifacts/${EXTERNAL_PROJ_REF}/raw/${EXTERNAL_ARTIFACT_PATH}?job=${EXTERNAL_JOB_NAME}"
  artifacts:
    paths:
      - outputFileName.file

job B:
  stage: package
  script:
    - ./do_packaging_job.sh
  needs:
    # Now this packaging job only needs job A; it doesn't need the external job anymore
    - job: job A
      artifacts: true
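The remaining variables aren't shown above; they could be defined as plain variables (placeholder values below), with EXTERNAL_PROJ_TOKEN kept as a masked CI/CD variable in the project settings rather than in the YAML:

variables:
  EXTERNAL_PROJ_ID: "12345"                      # placeholder project ID
  EXTERNAL_ARTIFACT_PATH: "path/to/output.file"  # placeholder path inside the artifacts archive
  EXTERNAL_JOB_NAME: "external_job"
  # EXTERNAL_PROJ_TOKEN: define as a masked variable under Settings > CI/CD > Variables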
ALERT_ID is evaluated and shown correctly in the pipeline by the line echo "${ALERT_ID}", but the build.env file is blank and the artifacts are not saved.
...
$ ALERT_ID="$(cat build.env)"
$ echo "${ALERT_ID}"
2081
Cleaning up project directory and file based variables
Job succeeded
And stage2dotenv:
$ echo "${ALERT_ID}"
2081
Cleaning up project directory and file based variables
Job succeeded
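For reference, the overall shape of a working two-job dotenv pipeline like this (a sketch; stage names and the hard-coded value are illustrative) would be:

stages:
  - stage1
  - stage2

stage1dotenv:
  stage: stage1
  image: python:3.8-slim
  script:
    # dotenv files must contain KEY=VALUE lines
    - echo "ALERT_ID=2081" > build.env
  artifacts:
    reports:
      dotenv: build.env

stage2dotenv:
  stage: stage2
  image: python:3.8-slim
  script:
    - echo "${ALERT_ID}"  # injected from the dotenv report of stage1dotenv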
I notice that whilst this is a free tier feature for simple pipelines, in multi-project pipelines it’s a premium tier feature. I’m not sure if that is relevant to you.
You don’t really want to create these manually. If your YAML is exactly the same as mine, except for the image: ... declarations, then these jobs should work.
If they don’t, then I would suggest running your Docker image locally, and going through the steps in the stage1 job manually, and checking that each one works correctly. Usually anything in the script of a job is a Bash command, but maybe there is something in the way that your Docker container is set up that is different to the python:3.8-slim image?
I got it working. The problem was that before executing the line echo "${ALERT_ID}" > build.env, I was in a nested directory of the Docker image. When I returned to the root and executed the line there, it ran successfully.
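In case it helps others, the fix can be made explicit with the predefined CI_PROJECT_DIR variable, since artifact paths are resolved relative to the project root (a sketch; the nested directory and helper script are hypothetical):

stage1dotenv:
  stage: stage1
  script:
    - cd some/nested/dir                   # hypothetical steps that end up in a subdirectory
    - ALERT_ID="$(./produce_alert_id.sh)"  # hypothetical helper that prints the ID
    - cd "${CI_PROJECT_DIR}"               # return to the project root before writing the file
    - echo "ALERT_ID=${ALERT_ID}" > build.env
  artifacts:
    reports:
      dotenv: build.env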