Ssl ctx load verify file: bio lib (Could not authenticate you from Ldapmain because)

Hello,

Recently we have been getting this error when logging on with a domain account:

Could not authenticate you from Ldapmain because "Ssl ctx load verify file: bio lib".

We are using the self-hosted version of GitLab CE (gitlab/gitlab-ce), currently running version 17.3, configured to authenticate against our Active Directory. Our Docker host runs Oracle Linux Server release 8.10 (more or less binary compatible with RHEL 8). I’m not exactly sure when the problem started, but it must have been 2 to 3 weeks ago, because a lot of employees were on summer break.

I asked the Windows engineers if anything had been changed (only Windows updates). The gitlab.rb inside the container was not changed. On the other hand, there is a scheduled task that pulls new builds every Sunday night, so the only thing that might have changed is gitlab-ce itself, but I do not check every Monday whether a new version was pulled.

For now, some employees can log in and use GitLab with local accounts, but a lot of servers periodically pull updated repos using domain credentials.

To rule out outdated components, I updated the OS and re-pulled the latest image. That did not solve the issue.

Configuration

This is the compose file:

---
version: "3"
services:
  gitlabce:
    container_name: GitLab-CE
    image: gitlab/gitlab-ce:latest
    network_mode: bridge
    restart: always
    ports:
      - "9080:9080"
      - "9443:9443"
      - "9022:22"
    volumes:
      - "/home/docker/gitlab-ce/config:/etc/gitlab"
      - "/home/docker/gitlab-ce/data:/var/opt/gitlab"
      - "/home/docker/gitlab-ce/gitlab-ce/log:/var/log/gitlab"
    environment:
      TZ: Europe/Berlin
      HOST_OS: Oracle Linux 8
      HOST_HOSTNAME: myserver
      HOST_CONTAINERNAME: GitLab-CE
      GITLAB_OMNIBUS_CONFIG: external_url 'https://myserver.local:9080/'

LDAP config in gitlab.rb (I masked a couple of things, but this config is - was? - working; I spun up the container for the first time in January 2024):

gitlab_rails['ldap_enabled'] = true
gitlab_rails['prevent_ldap_sign_in'] = false
gitlab_rails['ldap_servers'] = {
'main' => {
  'label' => 'LDAP',
  'host' =>  'dca.local',
  'port' => 636,
  'uid' => 'sAMAccountName',
  'encryption' => 'simple_tls',
  'verify_certificates' => false,
  'bind_dn' => 'ldap@dca.local',
  'password' => '****',
  'tls_options' => {
    'ca_file' => 'valid_root_ca.pem',
    'ssl_version' => 'SSLv23',
    'ciphers' => '',
    'cert' => '',
    'key' => ''
  },
  'timeout' => 10,
  'active_directory' => true,
  'allow_username_or_email_login' => false,
  'block_auto_created_users' => false,
  'base' => 'dc=local',
  'user_filter' => '(&(objectClass=user)(memberOf=CN=GIT_Users,DC=local))',
  'attributes' => {
    'username' => ['uid', 'userid', 'sAMAccountName'],
    'email' => ['mail', 'email', 'userPrincipalName'],
    'name' => 'cn',
    'first_name' => 'givenName',
    'last_name' => 'sn'
  },
  'lowercase_usernames' => false,

  # EE Only
  'group_base' => '',
  'admin_group' => '',
  'external_groups' => [],
  'sync_ssh_keys' => false
  }
}
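
One thing worth checking first: the ca_file above has no directory component, and as far as I know GitLab expects an absolute path there. A minimal shell sketch to confirm the file exists and is readable inside the container (the path below is an assumed example; substitute your own):

```shell
# Hypothetical helper: report whether a CA file referenced from gitlab.rb
# exists and is readable. The path passed in below is only an example.
check_ca_file() {
  if [ -r "$1" ]; then
    echo "OK: $1"
  else
    echo "MISSING or unreadable: $1"
  fi
}

check_ca_file /etc/gitlab/trusted-certs/valid_root_ca.pem
```

You can run this from inside the container with something like `docker exec -it GitLab-CE bash` (container name taken from the compose file above).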

Please advise,
Regards
Sjoerd

Hello,

I had the same problem yesterday in my company after I upgraded Gitlab from 17.0.2 to 17.3.1.
We found that the 'ca_file' option in the gitlab_rails['ldap_servers'] configuration was pointing to a file that does not exist on the VM. We updated the value with the path to a valid file (in our case a .crt file) and everything is working correctly at the moment.
I don’t know if the issue was caused by the upgrade (deleting the file in the process?) or if the option was simply not enforced by the previous GitLab version.


Indeed - the ca_file supplied in the config was missing… Luckily I could copy it from our other server. The error remains, though. Going through the log file I noticed that GitLab thinks my account is blocked, which is obviously not the case. From /var/log/gitlab/gitlab-rails/application_json.log (I masked the IP and account):

GitLab-CE  | {"severity":"ERROR","time":"2024-09-06T08:02:09.791Z","correlation_id":"01J735NQPZDMM8BGW0G9AV1GGB","message":"(ldapmain) Authentication failure! ldap_error: Net::LDAP::Error, SSL_CTX_load_verify_file: BIO lib"}
GitLab-CE  | {"severity":"INFO","time":"2024-09-06T08:02:09.873Z","correlation_id":"01J735NQPZDMM8BGW0G9AV1GGB","meta.caller_id":"OmniauthCallbacksController#failure","meta.remote_ip":"*.*.*.*","meta.feature_category":"system_access","meta.client_id":"ip/*.*.*.*","message": "Account Locked: username=myaccount"}

When I logged into GitLab CE with a local (admin) account I noticed these two tags behind “myaccount”:
[screenshot: tags on the user account]

I could easily unlock the account inside GitLab CE, but retrying the login gives the same error, and after a few tries the account (inside GitLab CE) got blocked again. I can confirm that at that point my actual LDAP account was not locked and I could log on to any Linux/Windows server with it.

In the compose log (I don’t know which log file holds that) I also found:

GitLab-CE  |  Linking f3664f81.0 from /etc/gitlab/trusted-certs/*****.pem
GitLab-CE  | [2024-09-06T10:00:01+02:00] INFO: ruby_block[Move existing certs and link to /opt/gitlab/embedded/ssl/certs] called

I masked the file name, but it is the same file I reference in the gitlab.rb configuration, and it is linked properly.
I’m a bit stuck, since the people who provide the file say nothing has changed.
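
To double-check that the hashed symlink from the reconfigure log actually resolves to a real file, a small sketch (the path and hash name are taken from the log lines above; adjust as needed):

```shell
# Sketch: print what a cert symlink resolves to, or flag it as dangling.
# Path/hash below come from the reconfigure log; substitute your own.
resolve_cert_link() {
  if [ -e "$1" ]; then
    readlink -f "$1"
  else
    echo "dangling or missing: $1"
  fi
}

resolve_cert_link /opt/gitlab/embedded/ssl/certs/f3664f81.0
```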

Regards
Sjoerd

I can share our gitlab_rails LDAP config, maybe it helps:

gitlab_rails['ldap_enabled'] = true

###! **remember to close this block with 'EOS' below**
gitlab_rails['ldap_servers'] = YAML.load <<-'EOS'
  main: # 'main' is the GitLab 'provider ID' of this LDAP server
    label: '[redacted]'
    host: '[redacted]'
    port: 389
    uid: 'sAMAccountName'
    bind_dn: 'CN=[redacted],OU=[redacted],OU=[redacted],OU=[redacted],DC=[redacted],DC=[redacted]'
    password: '[redacted]'
    encryption: 'start_tls' # "start_tls" or "simple_tls" or "plain"
    verify_certificates: false
    tls_options:
      ca_file: '/usr/local/share/ca-certificates/[redacted].crt'
      ssl_version: ''
    active_directory: true
    allow_username_or_email_login: false
    block_auto_created_users: false
    base: 'OU=[redacted],DC=[redacted],DC=[redacted]'
    user_filter: '[redacted]'
    attributes:
      username: ['uid', 'userid', 'sAMAccountName']
      email:    ['mail', 'email', 'userPrincipalName']
      name:       'cn'
      first_name: 'givenName'
      last_name:  'sn'
    ## EE only
    #group_base: ''
    #admin_group: ''
    #sync_ssh_keys: false
EOS

I notice that the “ssl_version” and “port” fields are different.

I hope this could help,
Matteo

LDAPS communicates over port 636, so that’s why the port (and the ssl_version) are different. On our old server this configuration still works, except it’s running version 13.10 on very old hardware. That is the main reason I’m using the Docker image.

This setup had been running since January 2024 without any issues, but during my spring break something changed, most likely gitlab-ce itself plus some Windows updates, and I can’t figure out where the initial problem lies.

Same here after upgrading to 17.3.1!
After some searching it was indeed a small change in the internal certs path in the tls_options of gitlab_rails['ldap_servers'].
It changed from:

tls_options: {
              ca_file: '/etc/gitlab/trusted_certs/ldap_ca_server.crt',
              ssl_version: 'TLSv1_2'
            }

to

tls_options: {
              ca_file: '/etc/gitlab/trusted-certs/ldap_ca_server.crt',
              ssl_version: 'TLSv1_2'
            }

I’m also wondering whether SSLv23 has been deprecated, since the name suggests old SSLv2 or SSLv3, which shouldn’t be used at all now; I would expect it to use at least TLSv1_1. (Note that in OpenSSL the SSLv23 method has historically meant “negotiate the highest protocol version both sides support”, including TLS, despite the name.)

Hi,
I checked your “from” and “to” like 20 times but I can’t see any difference. I assumed you changed ssl_version, so I changed it too. It still does not work; the same error remains :frowning:

Edit:
I must read more carefully :face_with_peeking_eye:
After I put the full path in ca_file it works again. Thank you for this!!
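
For future readers: the fix was a single character in the directory name, an underscore became a hyphen (trusted_certs → trusted-certs). A tiny shell check that makes the difference explicit:

```shell
# The two ca_file values from the posts above differ by exactly one character:
old=/etc/gitlab/trusted_certs/ldap_ca_server.crt   # underscore: old path
new=/etc/gitlab/trusted-certs/ldap_ca_server.crt   # hyphen: correct path
if [ "$old" = "$new" ]; then echo "identical"; else echo "paths differ"; fi
# → paths differ
```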
