Gitaly.socket no such file or directory when trying to create new project


I’m trying to create a project inside a group called ‘control’. When I do that, I get a 500 error. Looking in the logs, I’ve come across this (gitlab-rails/production.log):

ActionView::Template::Error (14:connections to all backends failing; last error: UNKNOWN: unix:/var/opt/gitlab/gitaly/gitaly.socket: No such file or directory. debug_error_string:{UNKNOWN:connections to all backends failing; last error: UNKNOWN: unix:/var/opt/gitlab/gitaly/gitaly.socket: No such file or directory {grpc_status:14, created_time:"2023-12-04T16:46:06.461685113+00:00"}}):
    72:     %meta{ property: 'og:site_name', content: site_name }
    73:     %meta{ property: 'og:title', content: page_title }
    74:     %meta{ property: 'og:description', content: page_description }
    75:     %meta{ property: 'og:image', content: page_image }
    76:     %meta{ property: 'og:image:width', content: '64' }
    77:     %meta{ property: 'og:image:height', content: '64' }
    78:     %meta{ property: 'og:url', content: request.base_url + request.fullpath }

lib/gitlab/gitaly_client.rb:290:in `execute'
lib/gitlab/gitaly_client/call.rb:18:in `block in call'
lib/gitlab/gitaly_client/call.rb:60:in `recording_request'
lib/gitlab/gitaly_client/call.rb:17:in `call'
lib/gitlab/gitaly_client.rb:280:in `call'
lib/gitlab/gitaly_client/with_feature_flag_actors.rb:31:in `block in gitaly_client_call'
lib/gitlab/gitaly_client.rb:660:in `with_feature_flag_actors'
lib/gitlab/gitaly_client/with_feature_flag_actors.rb:25:in `gitaly_client_call'
lib/gitlab/gitaly_client/repository_service.rb:22:in `exists?'
lib/gitlab/git/repository.rb:102:in `exists?'
app/models/repository.rb:562:in `exists?'
lib/gitlab/repository_cache_adapter.rb:95:in `block (2 levels) in cache_method_asymmetrically'
lib/gitlab/repository_cache.rb:44:in `fetch_without_caching_false'
lib/gitlab/repository_cache_adapter.rb:190:in `block (2 levels) in cache_method_output_asymmetrically'
lib/gitlab/repository_cache.rb:25:in `fetch'
lib/gitlab/repository_cache_adapter.rb:189:in `block in cache_method_output_asymmetrically'
lib/gitlab/repository_cache_adapter.rb:203:in `block in memoize_method_output'
lib/gitlab/repository_cache_adapter.rb:212:in `no_repository_fallback'
lib/gitlab/repository_cache_adapter.rb:202:in `memoize_method_output'
lib/gitlab/repository_cache_adapter.rb:188:in `cache_method_output_asymmetrically'
lib/gitlab/repository_cache_adapter.rb:94:in `block in cache_method_asymmetrically'
app/models/repository.rb:570:in `empty?'
app/models/repository.rb:696:in `tree'
app/models/repository.rb:1084:in `file_on_head'
app/models/repository.rb:612:in `block in avatar'
lib/gitlab/gitaly_client.rb:455:in `allow_n_plus_1_calls'
app/models/repository.rb:611:in `avatar'
lib/gitlab/repository_cache_adapter.rb:21:in `block (2 levels) in cache_method'
lib/gitlab/repository_cache.rb:25:in `fetch'
lib/gitlab/repository_cache_adapter.rb:163:in `block in cache_method_output'
lib/gitlab/repository_cache_adapter.rb:203:in `block in memoize_method_output'
lib/gitlab/repository_cache_adapter.rb:212:in `no_repository_fallback'
lib/gitlab/repository_cache_adapter.rb:202:in `memoize_method_output'
lib/gitlab/repository_cache_adapter.rb:162:in `cache_method_output'
lib/gitlab/repository_cache_adapter.rb:20:in `block in cache_method'
app/models/project.rb:1766:in `avatar_in_git'
app/models/project.rb:1770:in `avatar_url'
app/models/concerns/avatarable.rb:36:in `avatar_url'
app/helpers/page_layout_helper.rb:62:in `page_image'
app/controllers/application_controller.rb:162:in `render'
app/controllers/projects_controller.rb:113:in `create'
ee/lib/gitlab/ip_address_state.rb:10:in `with'
ee/app/controllers/ee/application_controller.rb:45:in `set_current_ip_address'
app/controllers/application_controller.rb:498:in `set_current_admin'
lib/gitlab/session.rb:11:in `with_session'
app/controllers/application_controller.rb:489:in `set_session_storage'
lib/gitlab/i18n.rb:114:in `with_locale'
lib/gitlab/i18n.rb:120:in `with_user_locale'
app/controllers/application_controller.rb:480:in `set_locale'
app/controllers/application_controller.rb:473:in `set_current_context'
lib/gitlab/metrics/elasticsearch_rack_middleware.rb:16:in `call'
lib/gitlab/middleware/memory_report.rb:13:in `call'
lib/gitlab/middleware/speedscope.rb:13:in `call'
lib/gitlab/database/load_balancing/rack_middleware.rb:23:in `call'
lib/gitlab/middleware/rails_queue_duration.rb:33:in `call'
lib/gitlab/etag_caching/middleware.rb:21:in `call'
lib/gitlab/metrics/rack_middleware.rb:16:in `block in call'
lib/gitlab/metrics/web_transaction.rb:46:in `run'
lib/gitlab/metrics/rack_middleware.rb:16:in `call'
lib/gitlab/middleware/go.rb:20:in `call'
lib/gitlab/middleware/query_analyzer.rb:11:in `block in call'
lib/gitlab/database/query_analyzer.rb:37:in `within'
lib/gitlab/middleware/query_analyzer.rb:11:in `call'
lib/gitlab/middleware/multipart.rb:173:in `call'
lib/gitlab/middleware/read_only/controller.rb:50:in `call'
lib/gitlab/middleware/read_only.rb:18:in `call'
lib/gitlab/middleware/same_site_cookies.rb:27:in `call'
lib/gitlab/middleware/path_traversal_check.rb:48:in `call'
lib/gitlab/middleware/handle_malformed_strings.rb:21:in `call'
lib/gitlab/middleware/basic_health_check.rb:25:in `call'
lib/gitlab/middleware/handle_ip_spoof_attack_error.rb:25:in `call'
lib/gitlab/middleware/request_context.rb:15:in `call'
lib/gitlab/middleware/webhook_recursion_detection.rb:15:in `call'
config/initializers/fix_local_cache_middleware.rb:11:in `call'
lib/gitlab/middleware/compressed_json.rb:44:in `call'
lib/gitlab/middleware/rack_multipart_tempfile_factory.rb:19:in `call'
lib/gitlab/middleware/sidekiq_web_static.rb:20:in `call'
lib/gitlab/metrics/requests_rack_middleware.rb:79:in `call'
lib/gitlab/middleware/release_env.rb:13:in `call'

Looking in the gitaly logs, I keep coming across this:

unclean Gitaly shutdown: setup runtime directory: creating runtime directory: mkdir /var/opt/gitlab/gitaly/run/gitaly-6867: permission denied
{"error":"exit status 1","level":"error","msg":"waiting for supervised command","pid":6861,"time":"2023-12-04T16:49:01.990Z","wrapper":6861}
{"level":"warning","msg":"forwarding signal","pid":6861,"process":6867,"signal":17,"time":"2023-12-04T16:49:01.990Z","wrapper":6861}
{"error":"os: process already finished","level":"error","msg":"can't forward the signal","pid":6861,"process":6867,"signal":17,"time":"2023-12-04T16:49:01.990Z","wrapper":6861}

Any ideas why this is happening? The directory seems to have the right permissions:

drwx------ 2 git root 4096 Nov 21 14:41 /var/opt/gitlab/gitaly/run/
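For what it’s worth, 0700 on the parent is sufficient as long as the process creating the subdirectory runs as the owning user (git here). A quick throw-away reproduction (illustrative temp paths only, nothing touches the real GitLab tree):

```shell
# Recreate the same 0700 mode and show that the owner can mkdir inside it.
# If this works on your box too, the denial likely comes from something
# other than plain unix permissions (wrong effective user, or
# cgroup/container restrictions).
tmp=$(mktemp -d)
mkdir -m 700 "$tmp/run"        # mirrors drwx------ on gitaly/run
mkdir "$tmp/run/gitaly-6867"   # the mkdir that fails for gitaly
stat -c '%a' "$tmp/run"        # prints 700
ls "$tmp/run"                  # prints gitaly-6867
rm -rf "$tmp"
```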

The weird thing is that I have barely used this instance at all; I’ve just upgraded every time a new version has come out, and it broke after a while.

I’m using this inside a Debian 12 virtual machine (Proxmox) and the gitlab-ee version is 16.1.1.

Any ideas how I could at least start debugging this further?

Having the same issue. The instance is completely new.
The only thing I customized was reducing memory usage as described in the documentation:
Running GitLab in a memory-constrained environment | GitLab.


This looks like a clear bug to me. This is the third time something this bad has happened (in my case); the last two were in production. For the first one I needed to roll back (and I’ve heard about plenty of disasters at companies that didn’t back up their instances). The second time I had to make manual changes in the database, because I didn’t notice anything was wrong with GitLab: it didn’t report anything unusual after the upgrade, so only after a month or so did I realise something was off (the ‘Users’ section was throwing 500 errors; see missing indices: Column `ci_sources_pipelines` indexes do not exist and upgrade to 16.5 fails (#430817) · Issues · / GitLab · GitLab).

I was actually able to fix my problem. As I said, I followed the link above to optimize my instance for a low-memory environment. The section about optimizing Gitaly seems to have caused the issue, since I was able to resolve it by removing those settings again. I don’t know exactly what the problem with those settings was, but my instance runs fine without them on 4 GB of RAM.
Maybe someone else will be able to tell what’s wrong there.
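For anyone hitting the same thing: the Gitaly section of that memory-constrained guide consists of gitlab.rb overrides roughly of the following shape. This is an illustrative sketch, not a copy of the doc; key names and values vary between GitLab versions (newer releases nest most of them under gitaly['configuration']), so check the doc matching your release before applying or removing anything.

```ruby
# Illustrative only: verify every key against the memory-constrained doc
# for your GitLab version.
gitaly['configuration'] = {
  concurrency: [
    { rpc: '/gitaly.SmartHTTPService/PostReceivePack', max_per_repo: 3 },
    { rpc: '/gitaly.SSHService/SSHUploadPack',         max_per_repo: 3 },
  ],
  # cgroup limits interact with the host's cgroup hierarchy and are a
  # reasonable first suspect for Gitaly setup failures inside VMs/containers.
  cgroups: {
    mountpoint: '/sys/fs/cgroup',
    hierarchy_root: 'gitaly',
    repositories: { count: 2 },
  },
}
gitaly['env'] = { 'GITALY_COMMAND_SPAWN_MAX_PARALLEL' => '2' }
```

Removing the Gitaly entries and running `sudo gitlab-ctl reconfigure` followed by `sudo gitlab-ctl restart gitaly` is what resolved it for me, as described above.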


From your previous post I concluded that, even though you’d reduced the memory usage, GitLab still didn’t work as expected, so the wording was a little misleading :slight_smile:

I was running GitLab with 3 GB of RAM (plus 2 GB of swap) and 2 cores. I’ve now added another 1 GB of RAM and rebooted. Same thing.

Then I added another 2 GB (6 GB in total), because I saw there still wasn’t enough available (~327 MB free). Still the same. It’s also ridiculous to just have to keep adding resources without a proper error from GitLab.

As per the memory-constrained docs, as well as GitLab’s normal hardware specs, 4 CPUs is the absolute minimum, even in the memory-constrained setup on ARM. If you have 2 CPUs as you mention, that is most likely why you’re still experiencing issues. It may work with less, but it’s not guaranteed, and that assumes all unnecessary services were dropped and the config amended to deal with the lack of CPUs available to the system.

There are a lot of background processes that run, plus scheduled ones, and their number increases as newer versions of GitLab are released. A lack of resources will cause these processes to take longer to complete their tasks. In some cases you may find hundreds of jobs/tasks waiting to be run.

I’m sorry if I confused you ^^
I run on 4 CPUs and 4 GB of RAM. I had the issue that the server froze frequently because memory maxed out. When I started applying the settings to reduce memory usage, the error started appearing, seemingly because of the Gitaly settings, as mentioned above.
Now, with the settings applied (except the ones for Gitaly), memory usage is 3 GB at idle and the instance is running fine, with no more freezes.


So why isn’t that documented anywhere? The stated minimum requirement is 1 AMD64 CPU core, which is admittedly low (as shown in the link). And worse, why is Gitaly reported as having started without any issues when it actually doesn’t work? This looks like a terrible oversight to me (or simply really bad planning).
Anyway, needing 4 cores for a GitLab instance that gets basically no requests at all tells me someone isn’t worried at all about writing bad software, to be honest.

I’ve added 4 cores to the virtual machine. I’ve rebooted. I get the exact same error.

@Tosty No worries, thanks for the reference anyway. Your behaviour is clearly different from mine: GitLab doesn’t feel slow, but given that core functionality doesn’t work, that’s relatively irrelevant.

You would need to post more information here, such as the exact configuration changes you made in /etc/gitlab/gitlab.rb that may be causing what you are experiencing. If you are still having problems with Gitaly, that suggests something isn’t configured properly. I would revert any Gitaly changes back to the defaults, and then reapply changes one by one, since that makes it easier to find what caused it. Obviously, if you went through the entire memory-constrained docs and applied all the changes at the same time, it’s not as easy to figure out what stopped it from working.

These are the only changes I’ve made since initially setting up GitLab (or at least this is what is uncommented).
The Puma settings I set around 15 minutes ago to restrict things further, so I don’t think they’re the issue.

external_url ''
registry_external_url ''
puma['min_threads'] = 2
puma['max_threads'] = 2
nginx['enable'] = true
nginx['ssl_certificate'] = "/etc/gitlab/ssl/pveproxy-ssl.pem"
nginx['ssl_certificate_key'] = "/etc/gitlab/ssl/pveproxy-ssl.key"
nginx['ssl_protocols'] = "TLSv1.2 TLSv1.3"
registry_nginx['enable'] = false
registry_nginx['ssl_certificate'] = "/etc/gitlab/ssl/pveproxy-ssl.pem"
registry_nginx['ssl_certificate_key'] = "/etc/gitlab/ssl/pveproxy-ssl.key"
alertmanager['enable'] = false
postgres_exporter['enable'] = true
letsencrypt['enable'] = false
letsencrypt['auto_renew'] = false

What I did change recently was the port in the URLs: I set it to 8851. Reverting to 443 hasn’t made any difference.

So there’s not a lot to revert really and there aren’t any changes related to gitaly that I’ve made.

What I think should be the focus is this:

unclean Gitaly shutdown: setup runtime directory: creating runtime directory: mkdir /var/opt/gitlab/gitaly/run/gitaly-6867: permission denied

From a unix permissions perspective, this doesn’t make any sense at all.

Can you check the permissions of that gitaly/run directory:

root@gitlab:~# ls -lha /var/opt/gitlab/gitaly | grep run
drwx------  3 git  root 4.0K Dec 17 17:20 run


Never mind, I see you have the same permissions.

Do any of these commands help for debugging Gitaly?

Can confirm same behaviour here.