Hello, I’m running into a challenge. My objective is to have my pipeline pull other repositories into its working directory. For repositories (projects) that will never need a full pipeline of their own, I would add a simple .gitlab-ci.yml file like this:
stages:
  - Dependencies

Push DBB:
  stage: Dependencies
  script:
    - |
      echo "Pushing repo to the filesystem"
  tags:
    - mytag
  only:
    variables:
      # run only when this pipeline is triggered from another project's pipeline
      - $CI_PIPELINE_SOURCE == 'pipeline'
Afterward, I would call it from my main pipeline like so:
DBB:
  trigger:
    project: /aaa/dbb
    branch: main
    strategy: depend
This approach works well: it triggers the DBB project, and the downstream job pushes that repository to the filesystem via my GitLab runner.
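(The echo above is only a placeholder; in practice the job copies the checked-out project to a shared directory on the runner host. A minimal sketch of that idea, where the destination path is purely illustrative:)

Push DBB:
  stage: Dependencies
  script:
    - |
      # Illustrative only: copy the checked-out repo to a shared
      # directory on the runner host (destination path is a placeholder)
      echo "Pushing repo to the filesystem"
      cp -r "$CI_PROJECT_DIR" /shared/workdir/dbb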
Now, I have other repositories that I also want to push into my working directory. However, these repositories have their own .gitlab-ci.yml files that I don’t want to execute at the moment; I only want the repository to be pushed or pulled. I thought about creating a separate configuration file in those repositories, let’s call it dependency-pipeline.yml, and populating it with:
stages:
  - Dependencies

Push myRepo:
  stage: Dependencies
  script:
    - |
      echo "Pushing Myrepo to the filesystem"
  tags:
    - mytag
Then I attempted to call it from my main pipeline, but I realized that the trigger keyword doesn’t provide an option to specify a different YAML file. So I tried:
additional_repo1:
  stage: Dependencies
  trigger:
    include:
      - project: 'bbb/other-repo'
        ref: 'main'
        file: 'dependency-pipeline.yml'
However, because it is an include, GitLab treats it as a child pipeline rather than a multi-project pipeline, and from what I can observe it does not pull the other repo; it simply runs the YAML file. How can I make this work? My goal is to pull that repository into my working tree elegantly, without resorting to running git commands from a script if possible.
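(For clarity, the scripted fallback I’d rather avoid would look roughly like the job below in the main pipeline: cloning the dependency with the job token. The branch and target directory are illustrative, and on newer GitLab versions the other project would also need to allow this project in its CI/CD job token allowlist.)

Pull other-repo:
  stage: Dependencies
  script:
    - |
      # Fallback I'm trying to avoid: clone the dependency manually
      # using the job token (branch and target directory are placeholders)
      git clone --depth 1 --branch main \
        "https://gitlab-ci-token:${CI_JOB_TOKEN}@${CI_SERVER_HOST}/bbb/other-repo.git" \
        other-repo
  tags:
    - mytag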
Thank you!