Git LFS upload always fails with HTTP 500 after the upload completes

Hi,

I have a strange problem with Git LFS: I cannot push any objects. Git LFS keeps restarting the upload.

Here is the trace output:

gitlfs-test git:master ❯ GIT_TRACE=1 git lfs push origin master --all
11:07:33.539155 git.c:576               trace: exec: git-lfs push origin master --all
11:07:33.539802 run-command.c:640       trace: run_command: git-lfs push origin master --all
11:07:33.580633 trace git-lfs: exec: git 'version'
11:07:33.605641 trace git-lfs: exec: git '-c' 'filter.lfs.smudge=' '-c' 'filter.lfs.clean=' '-c' 'filter.lfs.process=' '-c' 'filter.lfs.required=false' 'rev-parse' 'HEAD' '--symbolic-full-name' 'HEAD'
11:07:33.638064 trace git-lfs: exec: git 'config' '-l'
11:07:33.648706 trace git-lfs: Upload refs [master] to remote origin
11:07:34.160690 trace git-lfs: creds: git credential fill ("https", "gitlab.localserver.com", "")
11:07:34.213039 trace git-lfs: Filled credentials for https://gitlab.localserver.com/jan/gitlfs-test.git
11:07:34.285236 trace git-lfs: HTTP: POST https://gitlab.localserver.com/jan/gitlfs-test.git/info/lfs/locks/verify
11:07:34.986646 trace git-lfs: HTTP: 200
11:07:34.986748 trace git-lfs: creds: git credential approve ("https", "gitlab.localserver.com", "")
11:07:35.056720 trace git-lfs: HTTP: {"ours":[],"theirs":[]}
11:07:35.057018 trace git-lfs: tq: running as batched queue, batch size of 100
11:07:35.057634 trace git-lfs: run_command: git rev-list --stdin --objects --do-walk --
11:07:35.066997 trace git-lfs: run_command: git cat-file --batch
11:07:35.091847 trace git-lfs: tq: sending batch of size 2
11:07:35.093316 trace git-lfs: api: batch 2 files
11:07:35.093502 trace git-lfs: creds: git credential cache ("https", "gitlab.localserver.com", "")
11:07:35.093534 trace git-lfs: Filled credentials for https://gitlab.localserver.com/jan/gitlfs-test.git
11:07:35.093559 trace git-lfs: HTTP: POST https://gitlab.localserver.com/jan/gitlfs-test.git/info/lfs/objects/batch
11:07:35.500657 trace git-lfs: HTTP: 200
11:07:35.501207 trace git-lfs: HTTP: {"objects":[{"oid":"6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1","size":26849310,"actions":{"upload":{"href":"https://gitlab.localserver.com/jan/gitlfs-test.git/gitlab-lfs/objects/6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1/26849310","header":{"Authorization":"Basic amFuOnRydXN0bjAjMQ=="}}}},{"oid":"634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2","size":1542779,"actions":{"upload":{"href":"https://gitlab.localserver.com/jan/gitlfs-test.git/gitlab-lfs/obj
11:07:35.501368 trace git-lfs: HTTP: ects/634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2/1542779","header":{"Authorization":"Basic amFuOnRydXN0bjAjMQ=="}}}}]}
11:07:35.501815 trace git-lfs: tq: starting transfer adapter "basic"
11:07:35.502379 trace git-lfs: HTTP: PUT https://gitlab.localserver.com/jan/gitlfs-test.git/gitlab-lfs/objects/6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1/26849310
11:07:35.545229 trace git-lfs: HTTP: PUT https://gitlab.localserver.com/jan/gitlfs-test.git/gitlab-lfs/objects/634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2/1542779
11:07:40.588603 trace git-lfs: HTTP: 500 | 1.9 MB/s
11:07:40.589057 trace git-lfs: tq: retrying object 634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2: Fatal error: Server error: https://gitlab.localserver.com/jan/gitlfs-test.git/gitlab-lfs/objects/634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2/1542779
11:07:40.589131 trace git-lfs: tq: enqueue retry #1 for "634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2" (size: 1542779)
11:07:48.562617 trace git-lfs: HTTP: 500 | 2.0 MB/s
11:07:48.562891 trace git-lfs: tq: retrying object 6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1: Fatal error: Server error: https://gitlab.localserver.com/jan/gitlfs-test.git/gitlab-lfs/objects/6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1/26849310
11:07:48.562953 trace git-lfs: tq: enqueue retry #1 for "6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1" (size: 26849310)
11:07:48.563030 trace git-lfs: tq: sending batch of size 2
11:07:48.563204 trace git-lfs: api: batch 2 files
11:07:48.563349 trace git-lfs: creds: git credential cache ("https", "gitlab.localserver.com", "")
11:07:48.563366 trace git-lfs: Filled credentials for https://gitlab.localserver.com/jan/gitlfs-test.git
11:07:48.563397 trace git-lfs: HTTP: POST https://gitlab.localserver.com/jan/gitlfs-test.git/info/lfs/objects/batch
11:07:49.019914 trace git-lfs: HTTP: 200
11:07:49.020048 trace git-lfs: HTTP: {"objects":[{"oid":"6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1","size":26849310,"actions":{"upload":{"href":"https://gitlab.localserver.com/jan/gitlfs-test.git/gitlab-lfs/objects/6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1/26849310","header":{"Authorization":"Basic amFuOnRydXN0bjAjMQ=="}}}},{"oid":"634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2","size":1542779,"actions":{"upload":{"href":"https://gitlab.localserver.com/jan/gitlfs-test.git/gitlab-lfs/obj
11:07:49.020130 trace git-lfs: HTTP: ects/634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2/1542779","header":{"Authorization":"Basic amFuOnRydXN0bjAjMQ=="}}}}]}
11:07:49.020718 trace git-lfs: tq: starting transfer adapter "basic"
11:07:49.021120 trace git-lfs: HTTP: PUT https://gitlab.localserver.com/jan/gitlfs-test.git/gitlab-lfs/objects/6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1/26849310
11:07:49.061828 trace git-lfs: HTTP: PUT https://gitlab.localserver.com/jan/gitlfs-test.git/gitlab-lfs/objects/634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2/1542779
11:07:50.765657 trace git-lfs: HTTP: 500B | 204 KB/s
11:07:50.765806 trace git-lfs: tq: retrying object 634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2: Fatal error: Server error: https://gitlab.localserver.com/jan/gitlfs-test.git/gitlab-lfs/objects/634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2/1542779
11:07:50.765887 trace git-lfs: tq: enqueue retry #2 for "634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2" (size: 1542779)
^C11:07:50.819479 trace git-lfs: filepathfilter: rewrite ".git" as "**/.git/**"
11:07:50.819501 trace git-lfs: filepathfilter: rewrite "**/.git" as "**/.git"
11:07:50.819546 trace git-lfs: filepathfilter: rejecting "tmp" via []
11:07:50.819559 trace git-lfs: filepathfilter: accepting "tmp"

Exiting because of "interrupt" signal.

In the GitLab admin panel I found another error (in production.log):

Started PUT "/jan/gitlfs-test.git/gitlab-lfs/objects/6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1/26849310" for 80.81.27.186 at 2018-04-16 10:11:37 +0200
Processing by Projects::LfsStorageController#upload_finalize as HTML
Parameters: {"namespace_id"=>"jan", "project_id"=>"gitlfs-test.git", "oid"=>"6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1", "size"=>"26849310"}
Completed 500 Internal Server Error in 92ms (ActiveRecord: 4.8ms)
NoMethodError (undefined method `include?' for nil:NilClass)
app/controllers/projects/lfs_storage_controller.rb:55:in `tmp_filename'
app/controllers/projects/lfs_storage_controller.rb:23:in `upload_finalize'
lib/gitlab/i18n.rb:50:in `with_locale'
lib/gitlab/i18n.rb:56:in `with_user_locale'
app/controllers/application_controller.rb:330:in `set_locale'
lib/gitlab/middleware/multipart.rb:95:in `call'
lib/gitlab/request_profiler/middleware.rb:14:in `call'
lib/gitlab/middleware/go.rb:17:in `call'
lib/gitlab/etag_caching/middleware.rb:11:in `call'
lib/gitlab/middleware/read_only/controller.rb:28:in `call'
lib/gitlab/middleware/read_only.rb:16:in `call'
lib/gitlab/request_context.rb:18:in `call'
lib/gitlab/metrics/requests_rack_middleware.rb:27:in `call'
lib/gitlab/middleware/release_env.rb:10:in `call'
Started POST "/jan/gitlfs-test.git/info/lfs/objects/batch" for 80.81.27.186 at 2018-04-16 10:11:37 +0200
Processing by Projects::LfsApiController#batch as JSON
Parameters: {"operation"=>"upload", "objects"=>[{"oid"=>"6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1", "size"=>26849310}, {"oid"=>"634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2", "size"=>1542779}], "ref"=>{"name"=>"refs/heads/master"}, "namespace_id"=>"jan", "project_id"=>"gitlfs-test.git", "lfs_api"=>{"operation"=>"upload", "objects"=>[{"oid"=>"6e10032662306bcdede4a9e579dac3b101083d3f3b7a1b5a0061b8c2ed34e5e1", "size"=>26849310}, {"oid"=>"634e5a00fe91640c3d003cbea49c0d7d9daaad7d821bc85d8bea4ecb37f500e2", "size"=>1542779}], "ref"=>{"name"=>"refs/heads/master"}}}
Completed 200 OK in 107ms (Views: 0.7ms | ActiveRecord: 7.6ms)
Started GET "/-/metrics" for 127.0.0.1 at 2018-04-16 10:11:42 +0200
Processing by MetricsController#index as HTML

I cannot find any other errors. The rest of my system works perfectly; only Git LFS is not working.

System Info:
GitLab 10.6.3
GitLab Shell 6.0.4
GitLab Workhorse 4.0.0
GitLab API v4
Ruby 2.3.6p384
Rails 4.2.10
postgresql 9.6.8
git 2.7.4

I enabled Git LFS only in GitLab. I never installed Git LFS manually on my server; Git LFS is installed on my client. Git LFS is also enabled in the project settings.
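To double-check the server side, the effective LFS settings can be read back with `gitlab-rails runner` (a troubleshooting sketch assuming an Omnibus install; the path below is the default `lfs_storage_path` and should be adjusted if yours differs):

```shell
# Print the effective LFS settings as Rails sees them
# (the keys mirror the lfs: section of gitlab.yml).
sudo gitlab-rails runner "puts Gitlab.config.lfs.enabled"
sudo gitlab-rails runner "puts Gitlab.config.lfs.storage_path"

# Verify the storage path exists and is writable by the git user
# (default Omnibus path; an unwritable path also produces upload 500s).
sudo -u git test -w /var/opt/gitlab/gitlab-rails/shared/lfs-objects \
  && echo "writable" || echo "NOT writable"
```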

The LFS part of my gitlab.rb:
### Git LFS
gitlab_rails['lfs_enabled'] = true
gitlab_rails['lfs_storage_path'] = "/var/opt/gitlab/gitlab-rails/shared/lfs-objects"
# gitlab_rails['lfs_object_store_enabled'] = false # EE only
# gitlab_rails['lfs_object_store_background_upload'] = true
# gitlab_rails['lfs_object_store_remote_directory'] = "lfs-objects"
# gitlab_rails['lfs_object_store_connection'] = {
#   'provider' => 'AWS',
#   'region' => 'eu-west-1',
#   'aws_access_key_id' => 'AWS_ACCESS_KEY_ID',
#   'aws_secret_access_key' => 'AWS_SECRET_ACCESS_KEY',
#   # # The below options configure an S3 compatible host instead of AWS
#   # 'host' => 's3.amazonaws.com',
#   # 'endpoint' => nil,
#   # 'path_style' => false # Use 'host/bucket_name/object' instead of 'bucket_name.host/object'
# }

Some tips for troubleshooting would be great.

so long
jd


After a while I switched my installation to the bundled Nginx and now it works as expected. I have no idea why my existing Nginx had problems.
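For anyone else hitting this: the `NoMethodError ... for nil:NilClass` in `tmp_filename` suggests Rails never received the temp-file information that gitlab-workhorse normally adds to an LFS upload, which is what happens when a custom Nginx or Apache proxies Git HTTP traffic straight to the Rails application instead of through workhorse. A quick way to check the proxy target (paths are assumptions for a default Omnibus layout):

```shell
# Find where the external web server forwards GitLab requests.
# LFS uploads must pass through gitlab-workhorse; a proxy_pass/ProxyPass
# pointing directly at unicorn/puma breaks upload_finalize.
grep -rn "proxy_pass\|ProxyPass" /etc/nginx /etc/apache2 2>/dev/null

# The Omnibus workhorse listener the proxy should target:
ls -l /var/opt/gitlab/gitlab-workhorse/socket
```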

Bump, reproduced on Apache

I have reproduced this by just running GitLab as a local Docker container and trying to push through localhost. Any information on how to resolve this?

I have just run into this problem while trying to set up a GitLab instance using Docker on a Windows machine (with the WSL2 backend) in our local network. The same thing also happens if I install it on my local machine with localhost as the hostname.

It is apparently an HTTP 500 error; in the Docker logs I see this error:

2024-01-22 14:50:24 Errno::ENOENT (No such file or directory @ apply2files - /mnt/storage/lfs-objects/tmp/work/1705906224-1631-0001-8206/upload):
2024-01-22 14:50:24   
2024-01-22 14:50:24 app/uploaders/object_storage.rb:430:in `cache!'
2024-01-22 14:50:24 app/controllers/repositories/lfs_storage_controller.rb:104:in `create_file!'
2024-01-22 14:50:24 app/controllers/repositories/lfs_storage_controller.rb:92:in `store_file!'
2024-01-22 14:50:24 app/controllers/repositories/lfs_storage_controller.rb:48:in `upload_finalize'
2024-01-22 14:50:24 app/controllers/repositories/git_http_client_controller.rb:149:in `bypass_admin_mode!'
2024-01-22 14:50:24 app/controllers/application_controller.rb:468:in `set_current_admin'
2024-01-22 14:50:24 lib/gitlab/i18n.rb:114:in `with_locale'
2024-01-22 14:50:24 app/controllers/application_controller.rb:452:in `set_locale'
2024-01-22 14:50:24 app/controllers/application_controller.rb:443:in `set_current_context'
2024-01-22 14:50:24 lib/gitlab/metrics/elasticsearch_rack_middleware.rb:16:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/memory_report.rb:13:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/speedscope.rb:13:in `call'
2024-01-22 14:50:24 lib/gitlab/database/load_balancing/rack_middleware.rb:23:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/rails_queue_duration.rb:33:in `call'
2024-01-22 14:50:24 lib/gitlab/etag_caching/middleware.rb:21:in `call'
2024-01-22 14:50:24 lib/gitlab/metrics/rack_middleware.rb:16:in `block in call'
2024-01-22 14:50:24 lib/gitlab/metrics/web_transaction.rb:46:in `run'
2024-01-22 14:50:24 lib/gitlab/metrics/rack_middleware.rb:16:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/go.rb:20:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/query_analyzer.rb:11:in `block in call'
2024-01-22 14:50:24 lib/gitlab/database/query_analyzer.rb:37:in `within'
2024-01-22 14:50:24 lib/gitlab/middleware/query_analyzer.rb:11:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/multipart.rb:178:in `block in call'
2024-01-22 14:50:24 lib/gitlab/middleware/multipart.rb:63:in `with_open_files'
2024-01-22 14:50:24 lib/gitlab/middleware/multipart.rb:177:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/read_only/controller.rb:50:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/read_only.rb:18:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/unauthenticated_session_expiry.rb:18:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/same_site_cookies.rb:27:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/path_traversal_check.rb:35:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/handle_malformed_strings.rb:21:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/basic_health_check.rb:25:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/handle_ip_spoof_attack_error.rb:25:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/request_context.rb:15:in `call'
2024-01-22 14:50:24 lib/gitlab/middleware/webhook_recursion_detection.rb:15:in `call'

Any solutions/ideas?
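Not a full answer, but the `Errno::ENOENT` above points at the tmp work directory under the mounted LFS path rather than at the proxy. With a Docker bind mount (especially from a Windows host through WSL2), it is worth checking that `/mnt/storage/lfs-objects/tmp` actually exists on the volume and is writable by the `git` user inside the container (a diagnostic sketch; `gitlab` is an assumed container name, the path comes from the error):

```shell
# Inside the container: confirm the LFS tmp directory exists on the
# mounted volume and that the git user can write to it.
docker exec gitlab ls -ld /mnt/storage/lfs-objects /mnt/storage/lfs-objects/tmp
docker exec gitlab su git -c 'touch /mnt/storage/lfs-objects/tmp/.write_test' \
  && echo "writable" || echo "NOT writable"
```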