I just tried the new ‘parent/child pipelines’ feature.
While I was able to run a simple pipeline which triggers a child pipeline, I couldn’t figure out how to exchange artifacts between jobs in the parent pipeline and jobs in the child pipeline.
Can anybody please give me a hint on how to do it, or is this just not possible (without using cURL and the API) at the moment?
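For reference, a minimal version of what I tried looks roughly like this (job names and filenames are illustrative):

```yaml
# .gitlab-ci.yml (parent)
build:
  stage: build
  script:
    - make dist            # produces dist/ which I want the child to use
  artifacts:
    paths:
      - dist/

trigger-child:
  stage: deploy
  trigger:
    include: child-pipeline.yml
    strategy: depend       # parent waits for the child pipeline's result
```

The child pipeline defined in `child-pipeline.yml` runs fine, but its jobs start with a clean workspace and never see `dist/`.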
I don’t know how to do this but I definitely want to be able to. My use case is:
- A single set of common steps that feed into
- Multiple distinct steps, dependent on artifacts from #1, that could be nicely represented by child pipelines.
Without this ability, these are not so much child pipelines as bastards: logically children, but completely cut adrift from the parent.
I am also interested in this feature. Do you have any workaround?
We have faced the same issue. A workaround would be welcomed
I am also struggling with this.
I was able to proof-of-concept a workaround, which was basically to have the first job of the child pipeline manually download the artifacts (using the API calls, which I know OP wasn’t interested in). This has some complications:
The CI_JOB_TOKEN can download artifacts, but only if you know the job ID (and the project ID, which is readily available in the child, and the pipeline ID, which can be passed down via a variable from the trigger job). API calls that search for a job by name based on the pipeline ID are unavailable to it (and the call to download by ref name downloads the latest successful job – which by definition isn’t the one currently executing).
My workaround to this complication was to use dynamically generated child pipelines, weaving in the CI_JOB_ID in the job that initially created the content, using sed magic. Note that the child pipeline is now also part of the artifacts.
While it worked in test conditions, the real application of this is ugly in my situation, so I ended up dropping the idea altogether. I’m planning to wait until this feature gets supported directly. Maybe this gives someone else an idea, though.
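For anyone curious, the sketch looked roughly like this. The templating approach, placeholder, and job names are my own; the download endpoint is the standard `jobs/:id/artifacts` API, authenticated with the job token:

```yaml
# Parent: generate the child config, baking this job's ID into it, then trigger it.
generate-child:
  stage: build
  script:
    - make dist                                  # illustrative build step
    - sed "s/__PARENT_JOB_ID__/$CI_JOB_ID/" child-template.yml > child-pipeline.yml
  artifacts:
    paths:
      - dist/
      - child-pipeline.yml    # note: the generated child config is itself an artifact

trigger-child:
  stage: test
  trigger:
    include:
      - artifact: child-pipeline.yml
        job: generate-child

# child-template.yml — the first child job pulls the parent job's artifacts by ID
fetch-artifacts:
  script:
    - curl --location --output artifacts.zip --header "JOB-TOKEN: $CI_JOB_TOKEN" "$CI_API_V4_URL/projects/$CI_PROJECT_ID/jobs/__PARENT_JOB_ID__/artifacts"
    - unzip artifacts.zip
```

The `sed` substitution is the “magic” mentioned above: the child config is generated at runtime so the parent’s `CI_JOB_ID` can be hard-coded into the download URL.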
This doesn’t solve the problem of pulling the artifacts back from the child pipeline into the parent. See https://gitlab.com/gitlab-org/gitlab/-/issues/215725, for instance.
I would say that pulling artifacts back from child to parent is a different feature than just passing artifacts from parent to child on creation of the child pipeline. Without the latter I don’t see much value in parent-child pipelines. I want to have multiple deploy pipelines (pages/docker), but if I cannot pass build artifacts then it is not really useful.
Is this addressed yet? I need to use an artifact produced by a child pipeline in the parent.
I’m trying to set up a monorepo and have each child configure its own build process, with the parent doing the deployment.
We currently use Terraform to do the deployments, and one of the children creates a Lambda function that later needs to be deployed by the parent. Using a normal pipeline (which is quite big and getting bigger) I was able to create the zip file in a build step and mark it as an artifact. Later, in the deploy step, that file would get uploaded to S3 and the infrastructure updated. But now that I moved the build step into a child, the parent is no longer able to see the child’s output.
> I couldn’t figure out how to exchange artifacts between jobs in the parent pipeline and jobs in the child pipeline.
Using the dependencies parameter, you can define a limited list of jobs to fetch artifacts from. This only works when specifying preceding jobs or jobs from stages executed before the current one. See dependencies documentation for more details.
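Within a single pipeline, that looks like this (job names illustrative):

```yaml
build:
  stage: build
  script: make
  artifacts:
    paths:
      - bin/

test:
  stage: test
  dependencies:
    - build          # fetch artifacts only from the build job
  script: ./bin/run-tests
```

Note that `dependencies` selects jobs within the same pipeline; it does not by itself reach across the parent/child boundary.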
We are also currently implementing the ability to inherit environment variables from dependent jobs. This will allow passing environment variables (or other data) between CI jobs using the `dependencies` keyword (or the `needs` keyword for DAG pipelines), if the jobs source them from `dotenv` report artifacts.
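The dotenv mechanism looks roughly like this (job names and the variable are illustrative):

```yaml
build:
  stage: build
  script:
    - echo "BUILD_VERSION=1.2.3" > build.env
  artifacts:
    reports:
      dotenv: build.env    # variables in this file become available to dependent jobs

deploy:
  stage: deploy
  needs:
    - build                # DAG ordering; also inherits BUILD_VERSION
  script:
    - echo "Deploying version $BUILD_VERSION"
```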
@thaoyeager: Thanks for responding, but I still have no idea how to pass artifacts from parent to child pipelines. The linked documentation does not mention this situation either.
Or are you just saying that it is simply not possible at the moment?
@nmr does this issue address your concerns? It might be best to upvote and include any specific use-cases or needs you have to help refine the needs driving the feature request.
Yes, I think this would be a good solution. Upvoted it (and two related issues as well)
We were able to achieve passing artifacts from parent to child pipeline using `cache` with a shared cache key.
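Roughly like this — the exact key is up to you, as long as parent and child resolve to the same value; `$CI_COMMIT_SHA` is one candidate shared by both (the job names and key choice here are illustrative):

```yaml
# Parent job: populate the cache
build:
  stage: build
  script: make dist
  cache:
    key: $CI_COMMIT_SHA    # assumed key; any value common to parent and child works
    paths:
      - dist/
    policy: push           # write-only: upload the cache at job end

trigger-child:
  stage: deploy
  trigger:
    include: child-pipeline.yml

# Child job (in child-pipeline.yml): read the same cache
use-dist:
  cache:
    key: $CI_COMMIT_SHA
    paths:
      - dist/
    policy: pull           # read-only: restore the cache at job start
  script: ls dist/
```

One caveat: unlike artifacts, caches are best-effort and may be runner-local unless a distributed cache (e.g. S3-backed) is configured, so this can fail silently when jobs land on different runners.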
I would like to use, in a child pipeline, artifacts produced by a sibling child. Has anybody figured out how to do that?