Hi all,
I hope I will find some help with the following situation: we have been using GitLab for quite a while now and have a project with two branches (dev & prod). With the purchase of a new piece of software we needed to set up a sync between our GitLab repository and one of our S3 buckets. I came across GitLab CI/CD (I have zero expertise in this area) and started playing around with the .gitlab-ci.yml file. I managed to connect to the bucket and even sync files between one of our branches and the bucket (basically uploading the files to S3).
The thing I can't get done / don't know how to do is dealing with the fact that we have two branches in GitLab. Depending on what happens in GitLab, the sync needs to upload to one folder or the other in the bucket (we have a prod and a dev folder in the S3 bucket). To be precise: when I merge or push code into dev, the pipeline needs to sync/upload the changes to the dev folder in the bucket. When I merge the dev branch into the prod branch, it needs to sync/upload the changes to the prod folder in the bucket.
What's the best way to approach this? Do I use one .gitlab-ci.yml in dev and a different one in prod? Or do I work with a single .gitlab-ci.yml that I treat like any other source file in my repo? I was thinking I could go with the latter approach and make the .yml somewhat dynamic by using ${CI_COMMIT_BRANCH} to get the current branch and pick the right folder depending on where the pipeline was triggered.
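For illustration, this is roughly what I had in mind with the single-file approach (untested sketch; the bucket name and folder layout are just placeholders, and the AWS credentials would come from CI/CD variables):

```yaml
sync-to-s3:
  image:
    name: amazon/aws-cli:latest
    entrypoint: [""]   # override the image's default "aws" entrypoint so the script lines run in a shell
  rules:
    # run only for pipelines on the dev or prod branch
    - if: '$CI_COMMIT_BRANCH == "dev" || $CI_COMMIT_BRANCH == "prod"'
  script:
    # CI_COMMIT_BRANCH is "dev" or "prod" here, matching the folder names in the bucket
    - aws s3 sync . "s3://my-bucket/${CI_COMMIT_BRANCH}/" --exclude ".git/*"
```

The assumption is that AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION are set as CI/CD variables on the project, and that the branch names match the folder names in the bucket.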
I hope my problem is (at least on a basic level) clear and that I will find some help in this forum.
Thanks a lot in advance!