Data sharing between multi-project pipelines

I’m wondering if it is possible to share artifacts between jobs in a multi-project pipeline. I’ve read through this documentation and it only mentions the ability to share environment variables.

To make it clearer why I would want this, here is an example of the use case.

I have one project (Project B) which generates images for hardware used in multiple other projects. This image generation requires a number of configuration files. I’d like to be able to generate these configuration files in Project A, send them to Project B for processing, and receive the result (i.e. the image) back in Project A. For technical reasons I won’t go into now, it is not possible to include Project B directly in Project A and solve it that way :frowning:.

By the way, I’m focusing on artifacts right now, but if anyone knows of another way of doing this I’m also interested, as long as it does not involve duplicating the code in Project B.

As of right now, your best bet is to set up a multi-project pipeline trigger in the other direction as well.

Basically, triggering into and out of Project B.
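
Something along these lines (only a rough sketch; the project paths, job names, and rules below are placeholders, not your actual setup):

    # Project A's .gitlab-ci.yml: kick off Project B's pipeline
    trigger-project-b:
      rules:
        - if: '$CI_PIPELINE_SOURCE != "pipeline"'   # don't re-trigger when B calls back
      trigger:
        project: group/project-b                    # placeholder path
        branch: main

    # Project B's .gitlab-ci.yml: trigger back into Project A once the image is built
    trigger-project-a:
      rules:
        - if: '$CI_PIPELINE_SOURCE == "pipeline"'   # only when B was triggered by another pipeline
      trigger:
        project: group/project-a                    # placeholder path
        branch: main

The rules are there so the two pipelines don’t keep triggering each other in a loop.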

I had completely forgotten this post.

Thanks @Blackclaws, that is indeed exactly what I ended up doing.

For future reference, this is what I put in the .gitlab-ci.yml file to get this going (the trigger block goes under a job of your choosing):

    # the trigger keyword lives under a job; <NAME_OF_TRIGGER_JOB> is whatever you want to call it
    <NAME_OF_TRIGGER_JOB>:
      trigger:
        project: <PATH_TO_THE_REPO_OF_JOB_1>
        branch: <BRANCH_YOU_WANT_TO_USE>
        strategy: depend

The only thing that I could not figure out was sharing artifacts between the two jobs; I ended up concluding that this was not possible. For now I’ve used a workaround: uploading and downloading the required build output through a server.
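
For anyone with the same problem: one way to implement such a workaround without a separate server is GitLab’s job artifacts API. The sketch below is untested, and the project ID, job name, and token variable are placeholders:

    # Project A's .gitlab-ci.yml: fetch the image artifact from Project B via the API
    fetch-image:
      script:
        # <PROJECT_B_ID>, the job name, and $API_READ_TOKEN (a CI/CD variable holding
        # an access token with read access to Project B) are placeholders
        - 'curl --location --output image.zip --header "PRIVATE-TOKEN: $API_READ_TOKEN" "$CI_API_V4_URL/projects/<PROJECT_B_ID>/jobs/artifacts/main/download?job=build-image"'
        - unzip image.zip   # requires unzip in the job image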

Artifact sharing should work, but it might be restricted to Premium and above.
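
If it is available on your tier, cross-project artifact download is expressed with needs in the downstream project. A sketch, where the project path, job name, and ref are placeholders:

    # Project B's .gitlab-ci.yml: pull the config artifacts from a job in Project A
    build-image:
      needs:
        - project: <PATH_TO_PROJECT_A>
          job: generate-configs
          ref: main
          artifacts: true
      script:
        - ./build_image.sh   # placeholder; builds the image from the downloaded configs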

Full sample?