Hi all.
We have GitLab 13.12.10 running on nodes in AWS. Backups run three times a day from different nodes.
This is the command we run:
gitlab-backup create CRON=1 STRATEGY=copy BACKUP=${BACKUP_TIMESTAMP}
We back up all our data to an AWS S3 bucket and it works fine, but since the backup creates local temp files before transferring them to the bucket, we have had the file system fill up a few times. We resized the local disk and it was fine for a while, but now we have hit the same issue again.
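For reference, the S3 upload is configured in gitlab.rb roughly like this (bucket name, region and auth method are placeholders, not our real values):

# /etc/gitlab/gitlab.rb -- backup upload to S3 (placeholder values)
gitlab_rails['backup_upload_connection'] = {
  'provider' => 'AWS',
  'region' => 'us-east-1',            # placeholder region
  'use_iam_profile' => true           # we use an instance profile here, not access keys
}
gitlab_rails['backup_upload_remote_directory'] = 'my-gitlab-backups'   # placeholder bucket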
Is there a way to avoid the local temp files, or a different strategy we should use?
Cheers