How to properly create multi-pipeline CI for one repository

The situation is as follows. We have one repository with scripts that build our system in the cloud infrastructure.
Because the application runs on two clouds, there have to be two independent pipelines:

  • Deploy to AWS
  • Deploy to Azure
A developer should also be able to run a pipeline manually.
For the moment we solve this with pipeline schedules that run with a variable setting CLOUD_ENV.
The .gitlab-ci.yml file looks like this:
workflow:
    rules:
      - if: $CI_PIPELINE_SOURCE == "schedule"
      - if: $CI_PIPELINE_SOURCE == "web"
      - if: $CI_PIPELINE_SOURCE == "trigger"

stages:
    - triggers

# Build your Azure infrastructure
create_azure_env:
    stage: triggers
    trigger:
      include: azure/CICD/.create-infr-pipeline.yml
      strategy: depend
      forward:
        pipeline_variables: true # pass pipeline variables to child pipeline
    rules:
      - if: $CLOUD_ENV == "azure"

# Build AWS infrastructure
create_aws_env:
    stage: triggers
    trigger:
      include: aws/CICD/.create-infr-pipeline.yml
      strategy: depend
      forward:
        pipeline_variables: true # pass pipeline variables to child pipeline
    rules:
      - if: $CLOUD_ENV == "aws"

This solution works, but when we look at the Build->Pipelines list it is hard to tell which cloud a given pipeline was launched for. What's more, I can't see the deployment status for each cloud in the operations dashboard.

The question is: what is the proper way to implement several pipelines for one repository?

The Pipelines UI is not the place to look for the information you need. For that purpose there is "Operate->Environments" (in the new 16.0 UI).

Update your CI like this:

...
create_azure_env:
  environment: azure
...
create_aws_env:
  environment: aws
...

and you can see the deployments on the Environments page.
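
Putting the two snippets together, the full config might look like the sketch below. It is based on the configuration from the question with the `environment` keyword added to each trigger job, so that every run is attached to a named environment and its deployment status shows up per cloud:

```yaml
workflow:
  rules:
    - if: $CI_PIPELINE_SOURCE == "schedule"
    - if: $CI_PIPELINE_SOURCE == "web"
    - if: $CI_PIPELINE_SOURCE == "trigger"

stages:
  - triggers

# Build the Azure infrastructure
create_azure_env:
  stage: triggers
  environment: azure            # tracked on the Environments page
  trigger:
    include: azure/CICD/.create-infr-pipeline.yml
    strategy: depend            # parent mirrors the child pipeline's status
    forward:
      pipeline_variables: true  # pass pipeline variables to the child pipeline
  rules:
    - if: $CLOUD_ENV == "azure"

# Build the AWS infrastructure
create_aws_env:
  stage: triggers
  environment: aws
  trigger:
    include: aws/CICD/.create-infr-pipeline.yml
    strategy: depend
    forward:
      pipeline_variables: true
  rules:
    - if: $CLOUD_ENV == "aws"
```

With `strategy: depend` the parent pipeline waits for each child pipeline and reflects its result, so the status you see next to the "azure" and "aws" environments corresponds to the actual child-pipeline outcome.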