Non-redundant pipelines are auto-cancelled

Hi,

Since some of our CI jobs take quite a long time to finish, I want to use rules to run them only when necessary (based on which files changed). Example configuration:

build-job1:
  stage: build
  interruptible: false
  script:
    - sleep 60
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event" || $CI_PIPELINE_SOURCE == "push"
      changes:
        - "dir1/*"

build-job2:
  stage: build
  interruptible: true
  script:
    - sleep 30
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event" || $CI_PIPELINE_SOURCE == "push"
      changes:
        - "dir2/*"

I also want GitLab to cancel redundant running pipelines after a new push, so the “Auto-cancel redundant pipelines” setting (Settings > CI/CD > General pipelines) is enabled.

What do I see?

  1. Push a change to dir1 → build-job1 is triggered
  2. Push a change to dir2 immediately afterwards → the first pipeline is cancelled and a new one is started that contains only (!) build-job2
  3. The new pipeline eventually finishes “OK”.

Therefore, build-job1 never finishes, and I could miss a build error!

What do I expect?

I expected the first pipeline to be cancelled only if all of its jobs are also contained in the new pipeline; otherwise, it should continue normally. In a merge request, merging should only be allowed once all jobs triggered by the accumulated changes are “OK” (not only the ones triggered by the last push).

My questions

Am I looking at a bug or a feature? :slight_smile:

How can I achieve my expectation? I do not want to disable auto-cancellation for these long-running jobs. As an alternative to not cancelling the first pipeline, it would also be okay to “extend” the new pipeline with the cancelled jobs from the first one, but that seems even more complicated to me. The best workaround I have come up with so far is sketched below.
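
This sketch assumes the documented interruptible semantics (a pipeline is no longer auto-cancelled once one of its jobs with interruptible: false has started); I have not verified it on 14.3:

# Sketch: make non-interruptible the default for every job, so that a
# pipeline survives auto-cancellation as soon as any of its jobs starts
# running. Note: this does NOT protect a pipeline whose jobs are all
# still pending, and it largely defeats auto-cancellation for the short
# jobs too, so it is not a real answer to my question.
default:
  interruptible: false

(Individual jobs could still opt back in with interruptible: true, of course.)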

I’m on a self-managed GitLab instance, version 14.3.3-ee.

Best regards,
Aardjon
