Plan sub-directory CI - multiple Python packages in one repo


We want to start a generic pipelines repo that provides CI for all our Python repos - let’s call it the “CI/CD Python pipeline”.
I have repos that consume this repo from their .gitlab-ci.yml file.
Our next goal is a repo with sub-directories, where every sub-directory needs its own full CI that consumes my generic pipeline.
What is the best practice for setting up such CI? How can I make a CI for each folder and trigger it only when code changes in that specific folder?
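For reference, the consuming repos pull in the generic pipeline roughly like this (the project path and file name here are made up for the example, not our real ones):

include:
  - project: my-group/ci-cd-python-pipeline   # illustrative path to the generic repo
    ref: main
    file: /python.gitlab-ci.yml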

If I understand the question correctly, the Python project is a monorepo with different components that require specific CI/CD workflows. Let’s assume it has two directories: backend and frontend.

CI/CD jobs should only run when files change in one directory or the other. This can be achieved using rules:changes.

Something like this (untested):

backend-build:
  script:
    - echo "Building backend"
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      changes:
        - backend/**/*
      when: manual
      allow_failure: true

frontend-build:
  script:
    - echo "Building frontend"
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      changes:
        - frontend/**/*
      when: manual
      allow_failure: true

The GitLab project itself uses similar rules to run tests only for the frontend, etc. - see .gitlab/ci/rules.gitlab-ci.yml in the GitLab.org / GitLab repository (at commit 902947724bdb9dbb63529213a7f4c6c1395b1a8e) for an example.
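The pattern in that file boils down to hidden rule templates that jobs pull in via extends, so the changes: globs are written once and reused. A minimal sketch of the idea (the .backend:rules name is illustrative, not taken from the linked file):

.backend:rules:
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      changes:
        - backend/**/*

backend-test:
  extends: .backend:rules   # inherits the rules from the hidden template above
  script:
    - echo "Testing backend"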

Something like that, but what if it’s not FRONT_END or BACK_END? I’ve solved it for now by using an environment variable named “package_name” and inheriting a template CI from the main CI/CD repo.
Then each time we call a stage, it does a cd into $package_name and runs the CI normally.
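Roughly like this - a sketch of that setup, where the file paths, project path, and job names are illustrative, not our real ones. In the generic pipeline repo:

# templates/package.gitlab-ci.yml in the generic pipeline repo (illustrative path)
.python-package:
  before_script:
    - cd "$package_name"   # each consumer sets package_name to its sub-directory
  script:
    - pip install .
    - pytest

And in the monorepo’s .gitlab-ci.yml, one job per sub-directory:

include:
  - project: my-group/ci-cd-python-pipeline   # illustrative project path
    file: /templates/package.gitlab-ci.yml

backend-test:
  extends: .python-package
  variables:
    package_name: backend
  rules:
    - changes:
        - backend/**/*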
Thanks anyway.