I’m moving from a purely local environment to a hybrid local/cloud setup, distributed as follows: locally I have a K3s cluster running a Redis server, several GitLab Runners, and the gitlab-ee image (latest), which points to an external database on AWS and keeps its data in a local volume.
The problem I’m facing is that when I use the external database I see something like this:
It’s clear that there are far more queries, and the time is much higher than when running without an external database:
The gitlab.rb that I’m using is:
external_url 'url.com'
gitlab_rails['smtp_enable'] = true
gitlab_rails['smtp_address'] = "smtp.mailersend.net"
gitlab_rails['smtp_port'] = 587
gitlab_rails['smtp_user_name'] = "mail"
gitlab_rails['smtp_password'] = "password"
gitlab_rails['smtp_domain'] = "domain"
gitlab_rails['smtp_authentication'] = "login"
gitlab_rails['smtp_enable_starttls_auto'] = true
gitlab_rails['smtp_tls'] = false
gitlab_rails['smtp_pool'] = false
gitlab_rails['backup_keep_time'] = 604800
gitlab_rails['backup_upload_connection'] = {
  'provider' => 'AWS',
  'region' => 'reg',
  'aws_access_key_id' => 'key',
  'aws_secret_access_key' => 'secret-key',
  ...
}
gitlab_rails['auto_migrate'] = false
postgresql['auto_restart_on_version_change'] = false
geo_postgresql['auto_restart_on_version_change'] = false
postgresql['enable'] = false
gitlab_rails['db_adapter'] = "postgresql"
gitlab_rails['db_database'] = "..."
gitlab_rails['db_username'] = "..."
gitlab_rails['db_password'] = "..."
gitlab_rails['db_host'] = "..."
gitlab_rails['db_port'] = "..."
gitlab_rails['redis_host'] = "..."
gitlab_rails['redis_port'] = ...
puma['worker_processes'] = 0
sidekiq['max_concurrency'] = 10
prometheus_monitoring['enable'] = false
gitlab_rails['env'] = {
  'MALLOC_CONF' => 'dirty_decay_ms:1000,muzzy_decay_ms:1000'
}
I’ve also tried upgrading the specs of the external database instance, and it still looks like the first screenshot. Also, I have other services (not GitLab) that work perfectly.
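One thing worth ruling out is plain network round-trip latency between the K3s cluster and the AWS database: since GitLab issues many small queries per request, even a few milliseconds per round trip adds up. A minimal Ruby sketch to measure TCP connect latency from inside the cluster (the method name and the endpoint are my own placeholders, not anything GitLab-specific):

```ruby
require 'socket'

# Measure the best-case TCP connect round-trip time to host:port,
# in milliseconds, over a few samples.
def tcp_connect_ms(host, port, samples: 5)
  times = samples.times.map do
    t0 = Process.clock_gettime(Process::CLOCK_MONOTONIC)
    TCPSocket.new(host, port).close
    (Process.clock_gettime(Process::CLOCK_MONOTONIC) - t0) * 1000.0
  end
  times.min.round(2)
end

# Example (hypothetical RDS endpoint -- substitute your own):
# puts tcp_connect_ms('mydb.xxxxxx.eu-west-1.rds.amazonaws.com', 5432)
```

If this reports tens of milliseconds, the slowdown in the screenshot is likely dominated by per-query latency rather than database capacity, which would also explain why upgrading the instance didn’t help.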
Thanks!