Hello, I have a GitLab instance that an application uses to store content across hundreds of GitLab projects. The projects are divided up by entities in the application. I would like to copy this data to S3 whenever a project changes: every time a change is pushed to a project, how can I upload that new version to an S3 bucket and directory?
I’ve thought about using a runner job that runs every time a project is updated. The job would clone (or fetch) the repo and then copy it to S3, roughly as sketched below. How could I trigger this for all projects? The projects can’t have a .gitlab-ci.yml file added to them.
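To make the idea concrete, this is roughly what I picture the runner executing for a single project. It is only a sketch: the bucket name, prefix, and clone URL are placeholders, and I'm using boto3 here just as one way the upload could work.

```python
import subprocess
import tempfile
from pathlib import Path

import boto3


def sync_repo_to_s3(clone_url: str, bucket: str, prefix: str) -> None:
    """Clone a single project and upload its working tree to S3."""
    s3 = boto3.client("s3")
    with tempfile.TemporaryDirectory() as tmp:
        # Clone the project into a temporary working directory
        subprocess.run(["git", "clone", "--depth", "1", clone_url, tmp], check=True)
        # Walk the checkout and upload every file under the given prefix,
        # skipping the .git metadata directory
        for path in Path(tmp).rglob("*"):
            if path.is_file() and ".git" not in path.parts:
                key = f"{prefix}/{path.relative_to(tmp).as_posix()}"
                s3.upload_file(str(path), bucket, key)
```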
I’ve also thought about running a Python script on a cron job that would watch for changes across the projects using the GitLab API, clone each changed project, and then upload it to S3. I’ve managed to use Python to get a list of projects from the API, clone them, and upload them to S3 (a simplified version is below), but I don’t know how to decide between git clone and git pull on each run, or how to check for the latest changes across the projects.
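Here is a simplified version of what I have working so far. The GitLab URL, token, bucket name, and the repo_sync module (assumed to contain the helper from the sketch above) are placeholders, and the part I can’t figure out is marked in the comments.

```python
import gitlab  # python-gitlab

from repo_sync import sync_repo_to_s3  # the helper sketched above, assumed saved as repo_sync.py

GITLAB_URL = "https://gitlab.example.com"  # placeholder
PRIVATE_TOKEN = "..."  # placeholder
BUCKET = "my-backup-bucket"  # placeholder

gl = gitlab.Gitlab(GITLAB_URL, private_token=PRIVATE_TOKEN)

# Currently this re-clones and re-uploads every project on every run.
for project in gl.projects.list(all=True):
    # project.last_activity_at is available from the API, but I'm not sure
    # how best to compare it against what is already in S3, or whether to
    # clone fresh vs. pull into an existing local copy on each run.
    sync_repo_to_s3(
        clone_url=project.http_url_to_repo,
        bucket=BUCKET,
        prefix=project.path_with_namespace,
    )
```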
Thank you for your help!