Git: SSL certificate problem: unable to get local issuer certificate

For the past hour, our GitLab pipelines have been failing when cloning or reinitializing the Git repository.

Below is the output.

```
Getting source from Git repository                      00:00
Fetching changes with git depth set to 20...
Reinitialized existing Git repository in /builds/************/.git/
fatal: unable to access 'https://*****************.git/': SSL certificate problem: unable to get local issuer certificate
Cleaning up project directory and file based variables  00:01
ERROR: Job failed: exit code 1
```

The same problem occurs even if I switch the runner to a new EC2 instance or to our existing dev runners.

Same error as you. For some reason, the error resolved itself on some of the runners but still happens on others; it seems to be pretty random. We noticed that GitLab's certificate will expire in 2 days. Maybe they forgot to renew their certificates? :person_shrugging:


We are experiencing the same error, all our builds are failing now.


All builds ran fine until about 30 minutes ago, and then started tripping on this error.

I’m seeing similar issues with the OIDC provider: the root CA has been switched twice in the past week. This is really frustrating, as it has broken our deployment authentication with AWS twice, with no notification of the changes and no way to get ahead of them.

Having a stable certificate authority, or a notification when the certificate authority is changing, is a requirement per “GitLab as OpenID Connect identity provider” (GitLab docs) and “Creating OpenID Connect (OIDC) identity providers” (AWS Identity and Access Management docs).

Original CA: Usertrust 2028
Weekend CA: Usertrust 2038
New CA: Baltimore CyberTrust Root 2025


Same here. Manually cloning repos works fine, of course.

It has been happening to us for the past two hours. It was intermittent at first but is becoming consistent.

This should be taken as a top severity issue.

We have the same issue here. All our pipelines are failing :cry:

Just update the system package that provides the CA certificates, or re-pull the container image the job is failing in.

If it is a GitLab Runner on Docker, just remove the compose stack, pull the image again, and redeploy it.
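For a Docker-based runner, those two steps might look like this (a Debian/Ubuntu host and a compose-managed runner are assumptions; adjust package manager and service names to your setup):

```shell
# Refresh the CA bundle on the runner host (Debian/Ubuntu)
sudo apt-get update
sudo apt-get install --only-upgrade ca-certificates
sudo update-ca-certificates

# Tear down the runner stack, pull a fresh image, and redeploy
docker compose down
docker compose pull
docker compose up -d
```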

We had this issue.

Upgrading the Runner from Helm chart version 0.32.0 (14.2) to version 0.40.0 (14.10) appears to have fixed it.
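For reference, that chart upgrade can be sketched with Helm (the release name `gitlab-runner`, the namespace, and the values file are assumptions about your install):

```shell
# Refresh the chart index, then upgrade the runner release to chart 0.40.0
helm repo update
helm upgrade gitlab-runner gitlab/gitlab-runner \
  --namespace gitlab-runner \
  --version 0.40.0 \
  -f values.yaml
```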


Solved by restarting gitlab runner (running on version 14.8.0).


The status page has now been updated to show this incident


The status page has been updated for the runner configuration.

Is there expected to be a stable root certificate whose fingerprint we can pin, so that OIDC connections with third-party identity providers don't break (say, breaking our configuration again in the next 90 days as well)?

Executing

```shell
docker images | grep helper | awk '{ print $3 }' | xargs -r docker rmi
```

on all the runners may have helped a bit, but it was not conclusive; some jobs continue to fail.

Still getting this error using gitlab runner on kubernetes.

Restarting gitlab-runner fixed the problem for me.


We encountered the very same problem, and have had to reimport the thumbprint several times today to deal with the CA flip-flopping at GitLab.

Gitlab: If you’re reading, please take this into consideration if you need to change your CA in the future. AWS can host a list of trusted thumbprints, so it should be possible to make this a more graceful process.
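For anyone reimporting by hand: the thumbprint AWS wants is the SHA-1 fingerprint (hex, no colons) of the DER-encoded certificate from the issuer's chain. A minimal sketch assuming GNU coreutils; the `gitlab.com` hostname and file names are just examples, and the AWS docs specify which certificate in the chain to fingerprint:

```shell
# Print the SHA-1 thumbprint of a PEM certificate file, AWS-style (hex, no colons)
pem_thumbprint() {
  sed -n '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/p' "$1" \
    | sed '/-----/d' \
    | base64 -d \
    | sha1sum \
    | awk '{ print $1 }'
}

# Example: fetch the live certificate, then fingerprint it:
#   openssl s_client -servername gitlab.com -connect gitlab.com:443 </dev/null 2>/dev/null \
#     | openssl x509 > gitlab.pem
#   pem_thumbprint gitlab.pem
```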

@dave.muysson my long-term concern with the issue you are seeing is that the import process will become a frequent one. The solution AWS provides is a Lambda function that updates the thumbprints, but that still means an outage each time, which makes this enterprise feature one that enterprises can't actually rely on.

How do I reimport the thumbprint? I still have this problem:
SSL certificate problem: unable to get local issuer certificate
I have upgraded gitlab-runner from 14.8 to 14.10.1 (f761588f) and restarted gitlab-runner.service on the manager.
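In case it helps with the thumbprint part of the question: on the AWS side, “reimporting” means replacing the thumbprint list on the IAM OIDC provider. A sketch with the AWS CLI; the provider ARN and the all-zero thumbprint are placeholders you must substitute with your own values:

```shell
aws iam update-open-id-connect-provider-thumbprint \
  --open-id-connect-provider-arn arn:aws:iam::111122223333:oidc-provider/gitlab.com \
  --thumbprint-list 0000000000000000000000000000000000000000
```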