GitLab S3 backups upload issue

While reviewing the backup settings of my GitLab Community Edition 18.2.x (Omnibus) instance, installed on an EC2 instance in AWS, I noticed that backup files are accumulating uncontrollably and filling up the server’s storage. Running a manual backup, I can see in the standard output a problem that looks performance-related rather than configuration-related. The backup archives have grown to roughly 9 GB, and GitLab’s native uploader (Fog) may be timing out when attempting to send them to the S3 bucket. When the upload fails, the process aborts before the step that prunes old local backups runs, which is why they keep accumulating:

2025-12-17 14:50:38 UTC – Uploading backup archive to remote storage s3-backup …
2025-12-17 14:51:24 UTC – Deleting tar staging files …
2025-12-17 14:51:24 UTC – Cleaning up /var/opt/gitlab/backups/backup_information.yml
2025-12-17 14:51:24 UTC – Cleaning up /var/opt/gitlab/backups/db
2025-12-17 14:51:25 UTC – Cleaning up /var/opt/gitlab/backups/repositories
2025-12-17 14:51:40 UTC – Cleaning up /var/opt/gitlab/backups/uploads.tar.gz
2025-12-17 14:51:40 UTC – Cleaning up /var/opt/gitlab/backups/builds.tar.gz
2025-12-17 14:51:40 UTC – Cleaning up /var/opt/gitlab/backups/pages.tar.gz
2025-12-17 14:51:40 UTC – Cleaning up /var/opt/gitlab/backups/lfs.tar.gz
2025-12-17 14:51:40 UTC – Cleaning up /var/opt/gitlab/backups/terraform_state.tar.gz
2025-12-17 14:51:40 UTC – Cleaning up /var/opt/gitlab/backups/packages.tar.gz
2025-12-17 14:51:40 UTC – Cleaning up /var/opt/gitlab/backups/ci_secure_files.tar.gz
2025-12-17 14:51:40 UTC – Cleaning up /var/opt/gitlab/backups/external_diffs.tar.gz
2025-12-17 14:51:40 UTC – Deleting tar staging files … done
2025-12-17 14:51:40 UTC – Deleting backups/tmp …
2025-12-17 14:51:40 UTC – Deleting backups/tmp … done
2025-12-17 14:51:40 UTC – Deleting backup and restore PID file at [/opt/gitlab/embedded/service/gitlab-rails/tmp/backup_restore.pid] … done

rake aborted!
Excon::Error::BadRequest: Expected(200) <=> Actual(400 Bad Request)
excon.error.response

Is there a way to configure GitLab to avoid this behavior? I would like to keep 7 days of backups locally (on the server) and 30 days in the S3 bucket.

Steps to reproduce

/opt/gitlab/bin/gitlab-backup create SKIP=artifacts

Configuration

gitlab_rails['backup_keep_time'] = 2592000
gitlab_rails['backup_upload_connection'] = {
  'provider' => 'AWS',
  'region' => 'us-east-1',
  'use_iam_profile' => true
}
gitlab_rails['backup_upload_remote_directory'] = 's3-backup'
# gitlab_rails['backup_multipart_chunk_size'] = 104857600
# gitlab_rails['backup_storage_class'] = 'STANDARD'

Versions

Please add an x to the options that apply, and add the version information.

  • X Self-managed
  • GitLab.com SaaS
  • Dedicated

Version

  • GitLab 18.2.5 Community Edition

I believe this parameter in /etc/gitlab/gitlab.rb should help you:

###! The duration in seconds to keep backups before they are allowed to be deleted
gitlab_rails['backup_keep_time'] = 2592000

That’s in seconds, and it equates to 30 days. I believe setting it to 604800 and then reconfiguring GitLab should sort that out.
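
For reference, here’s a minimal sketch of what that would look like in /etc/gitlab/gitlab.rb (7 days is simply 7 × 86400 seconds):

###! The duration in seconds to keep backups before they are allowed to be deleted
gitlab_rails['backup_keep_time'] = 604800

followed by a gitlab-ctl reconfigure to apply it.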

Yes, but that does not solve the issue I have. It is the first step towards what I need, but Fog still fails when it tries to upload the resulting tar file to the S3 bucket.

Correct, it only solves half of the problem. You have two issues: the first is the problem with S3, and the second is controlling how long the backup is kept locally on disk.

Since I don’t use S3 I cannot help with that part, but I replied to provide you with the appropriate configuration option to control how long you keep backups locally.

I would check the GitLab documentation, since Gitaly can be configured for S3, and you can then back up repositories directly to S3 without having to dump them to local disk first. That may be a potential solution. Based on the configuration excerpts provided so far, I assume you are not using the Gitaly S3 configuration. You can find some useful information on that in this thread: Configuring gitaly for server-side repository backups to S3 compatible storage
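
If you want to try that route, a rough sketch of the server-side repository backup setup in /etc/gitlab/gitlab.rb would be something like the following, based on my reading of the docs (the bucket name and region are placeholders, and I haven’t tested this myself):

# Gitaly server-side repository backups, streamed directly to object storage
gitaly['configuration'] = {
  backup: {
    go_cloud_url: 's3://my-gitaly-backup-bucket?region=us-east-1'
  }
}

You would then run the backup with REPOSITORIES_SERVER_SIDE=true so the repository part skips the local tar step; as far as I understand, the rest of the backup (database, uploads, and so on) still goes through the normal archive-and-upload path.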

The reason to keep 7 days of backups locally on the server is to speed up a possible future restore and respond faster in a disaster-recovery (DRP) scenario.

But does Gitaly also use Fog for the upload operation? If so, I might run into the same situation. I read the GitLab documentation about splitting the backup into chunks, but it does not specify whether that option affects only the upload or the local tar file as well.
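
For reference, the option I was reading about seems to be the one already commented out in my configuration above; if it really only affects the upload, adjusting it could be worth a try:

# Size in bytes of each part sent in the multipart upload to S3 (100 MB here)
gitlab_rails['backup_multipart_chunk_size'] = 104857600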

I asked Claude, and it suggested enabling debug logging:


gitlab_rails['backup_upload_connection'] = {
  'provider' => 'AWS',
  'region' => 'us-east-1',
  'use_iam_profile' => true,
  'enable_signature_v4' => true
}
# Add to debug S3 requests
gitlab_rails['env'] = {
  'FOG_DEBUG' => 'true'
}

Then run: gitlab-ctl reconfigure && /opt/gitlab/bin/gitlab-backup create SKIP=artifacts and check /var/log/gitlab/gitlab-rails/production.log for the actual S3 error response.

I’ve found that setting it slightly below (I think we use “3 days - 1 hour”) is a good idea, especially if you think of the value in days.

The deletion only happens at the end of a backup job, and if today’s job finishes faster than yesterday’s, 86400 seconds (the number of seconds in 24 hours) have not yet passed since yesterday’s job finished, so yesterday’s backup is not deleted. (Adjust the value and number of days accordingly.)

(And it doesn’t address the real problem in the slightest, but I felt like sharing it.)
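
To make the arithmetic concrete, for the 7-day local retention discussed above that would be something like:

# 7 days minus 1 hour, so yesterday's archive is still pruned even if today's job finishes a bit faster
gitlab_rails['backup_keep_time'] = (7 * 86400) - 3600   # 601200 seconds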

I’ve tried that without success, but I ran the command again with the --trace option and I can see this:

/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/excon-0.99.0/lib/excon/middlewares/expects.rb:13:in `response_call'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/excon-0.99.0/lib/excon/middlewares/response_parser.rb:12:in `response_call'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/excon-0.99.0/lib/excon/connection.rb:459:in `response'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/excon-0.99.0/lib/excon/connection.rb:290:in `request'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/fog-core-2.1.0/lib/fog/core/connection.rb:81:in `request'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/fog-xml-0.1.5/lib/fog/xml/connection.rb:9:in `request'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/fog-aws-3.27.0/lib/fog/aws/storage.rb:686:in `_request'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/fog-aws-3.27.0/lib/fog/aws/storage.rb:681:in `request'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/fog-aws-3.27.0/lib/fog/aws/requests/storage/upload_part.rb:25:in `upload_part'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/fog-aws-3.27.0/lib/fog/aws/models/storage/file.rb:339:in `multipart_save'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/fog-aws-3.27.0/lib/fog/aws/models/storage/file.rb:280:in `save'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/fog-core-2.1.0/lib/fog/core/collection.rb:50:in `create'
/opt/gitlab/embedded/service/gitlab-rails/lib/backup/remote_storage.rb:26:in `upload'
/opt/gitlab/embedded/service/gitlab-rails/lib/backup/manager.rb:279:in `upload'
/opt/gitlab/embedded/service/gitlab-rails/lib/backup/manager.rb:151:in `run_all_create_tasks'
/opt/gitlab/embedded/service/gitlab-rails/lib/backup/manager.rb:32:in `create'
/opt/gitlab/embedded/service/gitlab-rails/lib/tasks/gitlab/backup.rb:12:in `block in create_backup'
/opt/gitlab/embedded/service/gitlab-rails/lib/tasks/gitlab/backup.rb:83:in `lock_backup'
/opt/gitlab/embedded/service/gitlab-rails/lib/tasks/gitlab/backup.rb:10:in `create_backup'
/opt/gitlab/embedded/service/gitlab-rails/lib/tasks/gitlab/backup.rake:8:in `block (3 levels) in <top (required)>'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/task.rb:281:in `block in execute'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/task.rb:281:in `each'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/task.rb:281:in `execute'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/task.rb:219:in `block in invoke_with_call_chain'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/task.rb:199:in `synchronize'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/task.rb:199:in `invoke_with_call_chain'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/task.rb:188:in `invoke'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/application.rb:160:in `invoke_task'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/application.rb:116:in `block (2 levels) in top_level'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/application.rb:116:in `each'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/application.rb:116:in `block in top_level'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/application.rb:125:in `run_with_threads'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/application.rb:110:in `top_level'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/application.rb:83:in `block in run'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/application.rb:186:in `standard_exception_handling'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/lib/rake/application.rb:80:in `run'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/rake-13.0.6/exe/rake:27:in `<top (required)>'
/opt/gitlab/embedded/bin/rake:25:in `load'
/opt/gitlab/embedded/bin/rake:25:in `<top (required)>'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/lib/bundler/cli/exec.rb:59:in `load'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/lib/bundler/cli/exec.rb:59:in `kernel_load'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/lib/bundler/cli/exec.rb:23:in `run'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/lib/bundler/cli.rb:452:in `exec'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/lib/bundler/vendor/thor/lib/thor/command.rb:28:in `run'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/lib/bundler/vendor/thor/lib/thor/invocation.rb:127:in `invoke_command'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/lib/bundler/vendor/thor/lib/thor.rb:538:in `dispatch'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/lib/bundler/cli.rb:35:in `dispatch'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/lib/bundler/vendor/thor/lib/thor/base.rb:584:in `start'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/lib/bundler/cli.rb:29:in `start'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/exe/bundle:28:in `block in <top (required)>'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/lib/bundler/friendly_errors.rb:117:in `with_friendly_errors'
/opt/gitlab/embedded/lib/ruby/gems/3.2.0/gems/bundler-2.6.5/exe/bundle:20:in `<top (required)>'
/opt/gitlab/embedded/bin/bundle:25:in `load'
/opt/gitlab/embedded/bin/bundle:25:in `'