GitLab backup

Hi all,
We have GitLab 13.12.10 running on nodes in AWS. Backups run 3 times a day from different nodes.
This is the command:
gitlab-backup create CRON=1 STRATEGY=copy BACKUP=${BACKUP_TIMESTAMP}
We back up all our data to an AWS S3 bucket and it works fine, but since the backup creates local temp files before transferring them to the S3 bucket, we have hit a full file system a few times. Of course we resized the local disk and it was fine for a while, but now we have the same issue again.
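For context, the schedule is just a cron entry along these lines (the run times, user, and path here are illustrative, not our exact values):

# Illustrative /etc/cron.d entry: three runs a day, each run tagged with an
# epoch timestamp so backups from different runs/nodes get distinct names
# (note that % has to be escaped as \% inside a crontab line)
0 2,10,18 * * * root BACKUP_TIMESTAMP=$(date +\%s); /opt/gitlab/bin/gitlab-backup create CRON=1 STRATEGY=copy BACKUP=${BACKUP_TIMESTAMP}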

Is there a way to avoid the local temp backup, or is there a different strategy we could use?

Cheers

There is no way to take a backup directly to an S3 bucket or to transfer files before the complete backup is done, but you can save local space by adding SKIP=tar (disclaimer: I wrote that feature) to your gitlab-backup create command. It means that a backup will contain a lot more files; I have no idea how that affects the time it will take to transfer everything to S3.
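With the command you posted, that would look something like this (keeping the rest of your options as they are):

gitlab-backup create CRON=1 STRATEGY=copy SKIP=tar BACKUP=${BACKUP_TIMESTAMP}

The backup is then left as a tree of files instead of being packed into a single tar archive, which is where the local space saving comes from.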