Infinite push loop in a pipeline job in GitLab CI

Hi everyone,

I manage the following setup on AWS infrastructure, with repositories and submodules in GitLab and CI pipelines.

Currently, I need every project to be updated whenever one of its submodules is updated, via a pipeline job that runs on a shell runner.
But I have no idea how to get this working correctly.

Here is my pipeline code.

Submodule repository pipeline:

variables:
    TEST_VAR: "Begin - Update all git submodule in projects."

stages:
    - build
    - triggers

job1:
    stage: build
    script:
        - echo $TEST_VAR

trigger_A:
    stage: triggers
    when: on_success
    trigger:
        project: pruebas-it/proyecto_a
        branch: main

trigger_B:
    stage: triggers
    when: on_success
    trigger:
        project: pruebas-it/proyecto_b
        branch: main

Project pipelines:

variables:
    TEST_VAR: "Begin - Update all git submodule in projects."
    COMMIT: "GIT"
    CHANGES: $(git status --porcelain | wc -l)

stages:
    - build
    - test
    - deploy

job1:
    stage: build
    script:
        - echo $TEST_VAR
        - echo $COMMIT

job2:
    stage: test
    extends: .deploy-dev
    only:
        variables: [ $CHANGES != "0" ]

job3:
    stage: deploy
    when: always
    rules:
        - if: $CI_COMMIT_MESSAGE == "0"
    before_script:
        - 'which ssh-agent || ( apt-get update -y && apt-get install openssh-client -y )'
        - eval $(ssh-agent -s)
        - echo "$SSH_PRIVATE_KEY" | tr -d '\r' | ssh-add - > /dev/null
        - ssh -T git@gitlab.com
        - mkdir -p ~/.ssh
        - chmod 700 ~/.ssh
        - git config --global user.name "${GITLAB_USER_NAME}"
        - git config --global user.email "${GITLAB_USER_EMAIL}"
        - echo "${CI_COMMIT_MESSAGE}" "${GITLAB_USER_EMAIL}" "${CI_REPOSITORY_URL}" "$CI_SERVER_HOST"
        - url_host=$(echo "${CI_REPOSITORY_URL}" | sed -e 's|https\?://gitlab-ci-token:.*@|ssh://git@|g')
        - echo "${url_host}"
        - ssh-keyscan "$CI_SERVER_HOST" >> ~/.ssh/known_hosts
        - chmod 644 ~/.ssh/known_hosts
        - git submodule sync
        - git submodule update --remote

    script:
        - git branch
        - git config remote.origin.fetch "+refs/heads/*:refs/remotes/origin/*"
        - git fetch origin
        - git checkout main
        - git config pull.rebase false
        - git pull
        - echo 1 >> update.txt
        - git status
        - git add -A
        - git commit -m "$COMMIT"
        - git push "${url_host}"

.deploy-dev:
    script: exit

The result of the above: the push that commits the submodule update in each project starts a new pipeline, so I get an endless cycle of pushes and running jobs, and the pipelines never finish.

Can somebody help me understand why I never reach the goal with this? Please!

Thanks for your attention.

Please format the CI/CD config using code blocks for better readability: Community, first steps: Code, config, log block formatting in topics and replies

The jobs seem to trigger themselves again in a loop whenever a submodule is updated. Maybe it helps to use rules to specify when a pipeline is run, for example only on the main branch. More in Choose when to run jobs | GitLab
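
For example, a sketch like this (untested; it assumes the automated commit keeps the message "GIT" from the COMMIT variable in your config) would stop the deploy job from running again in the pipeline created by its own push:

job3:
    stage: deploy
    rules:
        # Never run for the commit that the job itself pushed
        - if: '$CI_COMMIT_MESSAGE =~ /^GIT/'
          when: never
        # Otherwise run only for pushes to the main branch
        - if: '$CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH == "main"'

The exact conditions depend on how your downstream pipelines are triggered, so treat this only as a starting point.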

Thanks, Engineer. I have formatted the CI/CD config.

With rules I have not found an example that gives me a solution or an idea for this.

This is my rules code. But when the downstream pipeline is triggered in each project from the upstream, job3 cannot run with these conditions:
job2:
    stage: test
    extends: .deploy-dev
    only:
        variables: [ $CI_PIPELINE_SOURCE == "push" ]

job3:
    stage: deploy
    variables:
        DOCKERFILES_DIR_A: './submodulo-a'  # This variable should not have a trailing '/' character
        DOCKERFILES_DIR_B: './submodulo-b'  # This variable should not have a trailing '/' character
    rules:
        - if: $CI_COMMIT_BRANCH
          changes:
            compare_to: 'refs/heads/main'
            paths:
                - '$DOCKERFILES_DIR_A/*'
                - '$DOCKERFILES_DIR_B/*'
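
Maybe I also need a rule that matches the downstream trigger, something like the sketch below (I am not sure this is correct):

job3:
    stage: deploy
    rules:
        # Run when the pipeline was triggered by the upstream submodule project
        - if: '$CI_PIPELINE_SOURCE == "pipeline"'
        # Or for pushes on main that change the submodule directories
        - if: '$CI_COMMIT_BRANCH == "main"'
          changes:
            paths:
                - 'submodulo-a/*'
                - 'submodulo-b/*'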

Rules
