Ideas for building from subdirectories

Hi there,

I am looking for some ideas on how to build quite a number of Docker images from a single repository. The background: the images are deployed to Kubernetes afterwards as payloads, and each one consists of a base image, where all the necessary boilerplate code lives, plus the actual script, which is most often a single Python file.

Of course, I could spin up a new repo for every script file, but that seems like too much overhead.

My next idea was to just build every image every time, but that results in too much build time and/or space used in the container registry.

So is there a clever way to trigger a build for every directory with changes and upload it to the registry?

Best regards,
Jan

So is there a clever way to trigger a build for every directory with changes and upload it to the registry?

Yeah there is, and it is not that clever :stuck_out_tongue:

Let’s say you have 3 images, each in its own folder with a Dockerfile and everything needed to build it, and you’ve named the folders image1, image2 and image3. Then you’ll have a single .gitlab-ci.yml in the root:

build_image1:
  only:
    changes:
      - image1/**/*
      - .gitlab-ci.yml # this will also rebuild the image whenever .gitlab-ci.yml itself changes
  script:
    - docker build -t "$CI_REGISTRY_IMAGE/image1" ./image1
    - docker push "$CI_REGISTRY_IMAGE/image1"
  when: manual # this is optional; it makes the job manual so you control whether to build or not

build_image2:
  only:
    changes:
      - image2/**/*
      - .gitlab-ci.yml # this will also rebuild the image whenever .gitlab-ci.yml itself changes
  script:
    - docker build -t "$CI_REGISTRY_IMAGE/image2" ./image2
    - docker push "$CI_REGISTRY_IMAGE/image2"
  when: manual # this is optional; it makes the job manual so you control whether to build or not

build_image3:
  only:
    changes:
      - image3/**/*
      - .gitlab-ci.yml # this will also rebuild the image whenever .gitlab-ci.yml itself changes
  script:
    - docker build -t "$CI_REGISTRY_IMAGE/image3" ./image3
    - docker push "$CI_REGISTRY_IMAGE/image3"
  when: manual # this is optional; it makes the job manual so you control whether to build or not

Using this setup, whenever you change a folder the corresponding image will be built, and all the builds will also be triggered if you change .gitlab-ci.yml itself.

(1): The build jobs also need other keywords like image and services (for Docker-in-Docker, for example); you can check out how to build a Docker image in the GitLab CI docs.
(2): You can also use something other than Docker to build (Buildah, kaniko, …).
(3): You can replace when and only with rules, though I am not sure rules:changes supports file globs; see the sketch after this list.
(4): You can remove when: manual and control when to build using rules or only instead.
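
For reference, here is a minimal sketch of what build_image1 could look like with rules plus the image and services keywords from note (1). Treat the details as assumptions rather than a drop-in config: docker:latest/docker:dind as the build environment, the image path derived from $CI_REGISTRY_IMAGE, and the login via GitLab’s predefined registry variables are just one possible setup, and as far as I know rules:changes accepts the same glob patterns as only:changes.

build_image1:
  image: docker:latest        # Docker CLI for the job
  services:
    - docker:dind             # Docker-in-Docker daemon (may need extra TLS variables, see the docs)
  rules:
    - changes:
        - image1/**/*
        - .gitlab-ci.yml
      when: manual            # manual when the rule matches; use on_success to build automatically
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE/image1" ./image1
    - docker push "$CI_REGISTRY_IMAGE/image1"

With rules, the when moves inside the matching rule instead of sitting at the job level.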

Hope this helps; if it doesn’t, feel free to follow up with questions about your use case.

The only: changes: ... approach is good, but the problem is that the subdirectories are not self-contained: you need to reference each one in the root .gitlab-ci.yml file too. Is there a way for the root CI file to automatically run all pipelines defined in subdirectories? Something similar to how test discovery happens in Python, for instance: tests don’t need to be registered anywhere; you just add them in the appropriate subdir and they are run automatically.
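
To make that concrete, here is a hedged sketch of what "referencing each one in the root .gitlab-ci.yml" can look like with GitLab’s include keyword: each subdirectory keeps its own CI file, but the root file still has to list them one by one (the per-directory file names are just an assumed layout).

# root .gitlab-ci.yml - every subdirectory still has to be registered by hand
include:
  - local: '/image1/.gitlab-ci.yml'   # contains the build_image1 job from above
  - local: '/image2/.gitlab-ci.yml'
  - local: '/image3/.gitlab-ci.yml'
  # as far as I know, newer GitLab versions also accept wildcard patterns in include:local,
  # e.g. '*/.gitlab-ci.yml', which would get closer to the automatic discovery I am asking about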