How to get artifacts from any of the jobs?

I have a pipeline as below.

The above works fine, and the deployment job depends on create-admin-server as below.


I want to add one more job, called connect, as in the screen below.

Either connect or create-admin-server will run, but whichever one ran, the deployment should run next.

How do I express that in the dependencies of deploy?

By default, jobs in later stages automatically download all the artifacts created by jobs in earlier stages. You can control artifact download behavior in jobs with dependencies.

So, if your deployment job does not have any dependencies or needs, it will just download the artifacts from the build stage, whether it was the create-admin-server job that ran or the connect job.
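A minimal sketch of that default behaviour (the script line is illustrative): a deploy job with no needs or dependencies downloads artifacts from every earlier-stage job, and an empty dependencies list switches downloads off entirely.

```yaml
deploy:
  stage: deploy
  script:
    - ls -lR            # artifacts from all earlier-stage jobs land here
  # dependencies: []    # uncommenting this would download no artifacts at all
```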

The build stage might contain other jobs as well, but the deployment job should run only if either the create-admin-server or the connect job runs. How do I set that?

I’m not sure there’s a straightforward way to do this.

You can’t use an “or” relationship in your needs or dependencies and you can’t pass variables between stages.

I think your best bet might be to create an artifact in your build stage, then have your deployment jobs check that the artifact exists, and halt if not.
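A rough sketch of that marker-file idea, assuming both build-stage jobs can write the same file (the marker and script names are made up):

```yaml
create-admin-server:
  stage: build
  script:
    - echo "created" > server.marker     # marker proves this job ran
  artifacts:
    paths: [server.marker]

connect-admin-server:
  stage: build
  script:
    - echo "connected" > server.marker   # same marker from the other path
  artifacts:
    paths: [server.marker]

deploy:
  stage: deploy
  script:
    # halt unless one of the two jobs above actually ran
    - test -f server.marker || { echo "no server job ran"; exit 1; }
    - ./deploy.sh                        # placeholder for your deployment
```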

OK, I will check that. One more doubt: is there any way to access the contents of artifacts from a previous pipeline run?

Only via the API, I believe.

Can you share any samples for that?

Also, as part of the pipeline, we are sending an SSH PEM file’s contents as a variable, writing that to a file, and connecting to an EC2 instance with it.
It seems the pipeline is losing the formatting, removing all the line endings and replacing them with spaces. Any script/commands to fix the formatting?

I haven’t used that part of the API myself, but the documentation is quite complete.
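For reference, the documented endpoint for downloading a single job’s artifacts archive looks like this (the host is the docs’ example placeholder, and `:id`/`:job_id` need to be filled in with your project and job IDs):

```shell
# Download the artifacts archive of one job via the GitLab Jobs API
curl --location --output artifacts.zip \
     --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/projects/:id/jobs/:job_id/artifacts"
```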

For your PEM file, if you are storing that in the CI/CD variables in your project settings, I would make sure that you use a File type variable, rather than a Variable type.

If you are generating the PEM file within the pipeline, or getting it from somewhere else, it’s worth remembering that your .gitlab-ci.yml file is really just a set of Bash commands with some metadata. So whatever ssh or Bash would normally do with the whitespace in your file will happen in your pipeline. You may well need a bit of sed or similar to sort out your line endings, but hopefully this is something you can reproduce locally.
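One way to sidestep the newline mangling entirely (an assumption on my part, not something you have to do): base64-encode the key before storing it, so the variable is a single line with no newlines to lose. `SSH_KEY_B64` is a made-up variable name.

```shell
# 1) Locally, encode the key into a single line and paste the output
#    into a CI/CD variable (here called SSH_KEY_B64):
#      base64 key.pem | tr -d '\n'
#
# 2) In the pipeline job, reconstruct the key file:
#      printf '%s' "$SSH_KEY_B64" | base64 -d > key.pem
#      chmod 600 key.pem
#      ssh -i key.pem ec2-user@<instance-address>   # user/host are placeholders
#
# Demonstration of the round trip with a sample (not real) key body:
KEY="$(printf -- '-----BEGIN KEY-----\nMIIBsample\n-----END KEY-----')"
SSH_KEY_B64="$(printf '%s' "$KEY" | base64 | tr -d '\n')"
DECODED="$(printf '%s' "$SSH_KEY_B64" | base64 -d)"
[ "$DECODED" = "$KEY" ] && echo "newlines survived the round trip"
```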

It is not showing any file browse option when I set a variable as File type in the GitLab pipeline run interface.

The “or” relation is giving a syntax error:

    - create-admin-server
    - connect-admin-server
  needs: [ "create-admin-server" or "connect-admin-server"]

Can you please suggest?

I tried like this also:

    - create-admin-server or connect-admin-server
  needs: [ "create-admin-server" or "connect-admin-server"]

When you expand the Variables section in your project settings and click on the Add variable button, you will see a dialog box. There is a drop-down with the label Type where you can select File rather than Variable.

Perhaps my answer wasn’t very clear, but there isn’t a way to specify an “or” relationship with either needs or dependencies, the arrays are always “and”-ed together.
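In other words, the only valid form lists the jobs plainly, and then every one of them is required (the script line is a placeholder):

```yaml
deploy:
  stage: deploy
  needs: ["create-admin-server", "connect-admin-server"]  # AND, not OR: deploy waits for both
  script:
    - ./deploy.sh   # placeholder deployment command
```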

Yes, I got that, but when you select File, there is no option to upload a file as part of that; there is only a value field, which accepts only text.

Hi @snim2

I tried the link you shared.

curl --output --header "PRIVATE-TOKEN: <your_access_token>" ""

I replaced the placeholder above with the token I generated.
But it is failing to download, even though the same token is able to trigger the pipeline jobs.

How to fix that?

Also, is there any option to run the manual jobs of previous pipelines using the API?

It’s hard to tell what the problem might be from what you have posted, but I would start by checking that your PAT has the correct permissions. Also, I presume the URL is just the one from the docs, but that will need to be your on-prem URL.
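As for running manual jobs, the Jobs API has a documented “play” endpoint; a sketch with the docs’ placeholder host, and `:id`/`:job_id` to be filled in:

```shell
# Trigger a manual job via the GitLab Jobs API "play" endpoint
curl --request POST \
     --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/projects/:id/jobs/:job_id/play"
```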

Yes, the URL is different.
The same token works if I want to trigger any pipeline.