How to implement continuous delivery with our GitLab Monorepo structure?

Hello!
In my team, we are far from being DevOps ninjas, and we are considering our options for implementing CD on our GitLab monorepo.
That's why I'm posting here for some help and advice :slight_smile:

Let's have a look at our GitLab repository structure and how we work with it:

  • We use GitLab for version control of the codebase of many APIs that we need to deploy on our production server
  • Each API consists of a PowerShell file and one or more JavaScript files, all stored in its own folder
  • Each API has its own life cycle: creation, modification… and sometimes deletion when deprecated
  • To get a better picture of our repo, let's consider our folders and files:
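For illustration only, something along these lines (the per-API folders under src/ match the job definitions further down; the exact file names are placeholders):

```
src/
├── My-Api-1/
│   ├── My-Api-1.ps1
│   └── index.js
├── My-Api-2/
│   ├── My-Api-2.ps1
│   └── index.js
└── ...
```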



Tomorrow, someone in our team will have to code a "My-Api-m" and finally deploy it on our prod server.
Or they will have to modify the existing "My-Api-2" and deploy it.

Today, when we push a new API, our CI pipeline starts code linting and unit testing on all the APIs in the repo.
Going the same way with CD would result in every API being deployed on each push => this we don't want!

I've read about using wildcards to include child .yml files (planned for a future GitLab release), but which options do we have today?

You can set up `rules` for jobs so that a job runs only if certain files have changed.
This way you can have a separate unit-test and deploy job for each API you maintain in your repository, executed only if that specific API is changed.

You can set up a hidden job with common parameters and use `extends` to build the actual jobs from it.

```yaml
stages:
  - unittest
  - deploy

.unittest_template:
  stage: unittest
  rules:
    - if: $CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH != "master"
      changes:
        - $APIDIR/*
    - when: never

.deploy_template:
  stage: deploy
  rules:
    - if: $CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH == "master"
      changes:
        - $APIDIR/*
    - when: never

My-Api-1:unit test:
  extends: .unittest_template
  variables:
    APIDIR: 'My-Api-1'

My-Api-1:deploy:
  extends: .deploy_template
  variables:
    APIDIR: 'My-Api-1'
```

Thank you for your quick answer!

This would probably work for existing APIs: hidden jobs using the `changes:` option with a variable, and a job per API setting the variable value. But what about pushing a new API? The dev would have to add a job to the YAML config file each time they want to submit a new API?

Basically yes. Each new API would need a new job definition in the YAML config. This approach has its pros and cons. For example, you can control your jobs: what gets tested and what gets deployed, in case you have some release management. The disadvantage is the couple of lines that need to be added, which is IMHO easily documented.

On the other hand, there are the child pipelines (and dynamic child pipelines) you mentioned. It is important to know that child pipelines are a new feature which still lacks some functionality most teams expect. For example, you cannot download a job's artifacts, and `reports:` generated in child pipelines do not work in MRs. So running any unit or integration testing is hard to do with child pipelines, unless you like reading job outputs.

I am using both approaches, depending on what is really needed. In one case I even have a scheduled pipeline with a job that generates the YAML configs using a Bash script and pushes them to the same repository, basically overwriting itself.

Well, I haven't given up on a fully automated solution yet :slight_smile:
So far I have been considering several options:

First scenario:

Second scenario:

  • Upon committing, the dev specifies the API name in the commit message in a way it can be retrieved later:

“My commit blah blah

Unfortunately, I don't think a substring of $CI_COMMIT_MESSAGE can easily be regexp-extracted and assigned to a pipeline variable using only the YAML file;
I would have to go through a script to achieve this, and speaking of scripting, I would rather go for the next scenario:
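If I did go the script route, the extraction itself would be short; a minimal sketch in a Bash script step, assuming a `[api:NAME]` tag convention in the commit message (the tag format and variable names are an assumption, not something we use today):

```shell
# Minimal sketch: pull an API name out of the commit message.
# The "[api:NAME]" tag convention is an assumption, not an established one.
CI_COMMIT_MESSAGE='My commit blah blah [api:My-Api-2]'   # stand-in for the real CI variable
API_NAME=$(printf '%s' "$CI_COMMIT_MESSAGE" | sed -n 's/.*\[api:\([^]]*\)\].*/\1/p')
echo "API_NAME=$API_NAME" > variables.env   # expose it as a dotenv artifact
cat variables.env                           # prints: API_NAME=My-Api-2
```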

Third scenario:
Using the git diff command to find out what was really modified by the last commit:
`git diff --raw HEAD^1 --name-only`
would give me something like:
I wrote a small PowerShell script to extract the substrings and remove duplicate names, and I am able to create a variables.env artifact file with
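The same extraction can be sketched in Bash (the actual script is PowerShell; the sample paths are stand-ins for the real `git diff --name-only HEAD^1` output):

```shell
# Minimal sketch: turn the paths changed by the last commit into numbered
# TANAMEx variables in variables.env. In CI the input would come from:
#   git diff --name-only HEAD^1
changed='src/My-Api-2/index.js
src/My-Api-2/My-Api-2.ps1
src/My-Api-5/index.js'

# Take the folder name under src/, drop duplicates, number the survivors.
printf '%s\n' "$changed" | cut -d/ -f2 | sort -u |
  awk '{ printf "TANAME%d=%s\n", NR, $0 }' > variables.env

cat variables.env
# prints:
#   TANAME1=My-Api-2
#   TANAME2=My-Api-5
```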

But I am certainly missing something, because the YAML implementation is not working. I have a first job:

```yaml
narrow:
  stage: narrow
  script:
    - .\build.ps1 -Tasks 'narrow'
  artifacts:
    reports:
      dotenv: variables.env
```

This first job works, and variables.env contains one or more variables named TANAMEx.
Next I created another job:

```yaml
unit tests:
  stage: test
  rules:
    - changes:
        - .\src\$TANAME1\*
        - .\src\$TANAME2\*
        - .\src\$TANAME3\*
        - .\src\$TANAME4\*
        - .\src\$TANAME5\*
        - .\src\$TANAME6\*
        - .\src\$TANAME7\*
        - .\src\$TANAME8\*
        - .\src\$TANAME9\*
        - .\src\$TANAME10\*
  script:
    - .\build.ps1 -Tasks 'analyze','test'
```

But I must be missing something, because this job does not show up in the pipeline and never gets triggered…

In your first scenario a developer still needs to create a CI YAML file, so why not just add steps to the already existing one? You have all the git history and git blame to figure out who added what and when.
But if you don't want to have separate CI YAML files (or any at all), you can use the dynamic child pipelines I already mentioned to get it working.

```yaml
create-pipeline:
  ..other stuff..
  script:
    - bash > .child-pipeline.yml
  artifacts:
    paths:
      - .child-pipeline.yml

trigger child:
  trigger:
    include:
      - artifact: .child-pipeline.yml
        job: create-pipeline
```

The magic Bash script will generate a CI YAML file containing jobs similar to the ones in How to implement continuous delivery with our GitLab Monorepo structure? - #2 by balonik for each subdirectory it finds. And even if you end up with 150 jobs in that file, thanks to the built-in `changes:` feature you don't have to extract that information yourself.

In this case it is fully automatic, and nobody needs to edit or create any CI YAML files.
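A minimal sketch of such a generator, assuming per-API folders under src/ and the build.ps1 call from earlier in this thread (the demo layout created here is only for illustration):

```shell
# Minimal sketch of the generator: emit one test job per subdirectory of
# src/, each gated with changes: on its own folder. The src/ layout and the
# build.ps1 call are assumptions based on this thread, not a real project.
mkdir -p src/My-Api-1 src/My-Api-2      # demo layout only

for dir in src/*/; do
  api=$(basename "$dir")
  cat <<EOF
$api:unit test:
  stage: test
  rules:
    - changes:
        - src/$api/*
  script:
    - .\\build.ps1 -Tasks 'analyze','test'

EOF
done > .child-pipeline.yml

cat .child-pipeline.yml
```

The parent pipeline would then run this in the `create-pipeline` job and hand the generated file to the `trigger child:` job.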

EDIT: I am not familiar with Powershell so I can’t help in that area.
